Adult literacy and economic growth - WP 04/24

6.4  Community and family literacy programmes

A number of community and family literacy programmes have been evaluated, although most studies rely on self-reported measures of impact. Many are also methodologically flawed. Beder (1999) reviews the best of these evaluation studies, dating back to 1968. However, with the exception of St. Pierre, Swartz, Gamse, Murray, Deck and Nickel (1995), the findings of these studies are too unreliable to use.

St. Pierre et al (1995) evaluate the ongoing Even Start Family Literacy Program in the United States. Even Start is offered to adults with poor literacy skills and to their children. The programme includes adult literacy training, together with early childhood education and parenting education. As part of the Even Start evaluation, 200 families were randomly assigned to be in either Even Start or in a control group. After 18 months, adults in both the participant group and the control group had made gains in measured literacy skills, and the difference between the groups was not statistically significant. The increase in literacy skills amongst the control group was puzzling, although nearly a quarter of controls had in fact participated in other sorts of literacy programmes. People who participated in Even Start were, however, significantly more likely to obtain a GED after 18 months: 22.4% of Even Start adults attained a GED compared to 5.7% of adults in control group families.

In a more recent study, not included in Beder’s (1999) review, Brooks, Davies, Duckett, Hutchison, Kendall and Wilkin (2001a) study the progress made in literacy by adults in dedicated, mainstream basic skills programmes in England and Wales. A sample of 1,224 learners from across the two countries were given two reading tests based largely on IALS, and were given estimated composite IALS scores between 0 and 500. The period between tests varied, but did not exceed 20 weeks of literacy provision. The mean score on the first test was 214.3 and on the second was 225.4. The difference between these means of 11.1 points (which is 0.22 of a standard deviation on this particular scale) was statistically significant and is quite considerable given the brief periods of tuition many learners experienced.[35] More than half of those taking the second test received less than 40 hours tuition and only 17% received more than 60 hours. On the results of the first test, 48% of the sample were in Level 1 of IALS, and in the second this had reduced to 43%. Although some students did worse on the second test, considerably more did better.
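The reported effect size can be checked with a simple calculation. Treating the 11.1-point gain as a standardised mean difference (the implied standard deviation of roughly 50 points on the 0–500 composite scale is an inference from the figures above, not a value stated by the study):

```latex
\[
d \;=\; \frac{\bar{x}_2 - \bar{x}_1}{\sigma}
  \;=\; \frac{225.4 - 214.3}{\sigma}
  \;=\; \frac{11.1}{\sigma} \;\approx\; 0.22
\qquad\Longrightarrow\qquad
\sigma \;\approx\; \frac{11.1}{0.22} \;\approx\; 50 \text{ points.}
\]
```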

6.5  Conclusion

Table 7 summarises the results of the studies reviewed in this chapter. Taken as a whole, they provide good evidence that adult basic skills programmes can increase educational attainment, as measured by receipt of a GED; provide some evidence that programmes can lead to increases in earnings (total earnings, if not hourly earnings); and provide little evidence that programmes can increase people’s literacy skills, with two notable exceptions – the San Diego GAIN programme and the evaluation by Brooks et al (2001a). Only in Krueger and Rouse (1998) is the cost of a basic skills programme discussed. The other studies reviewed above either do not report information on costs, or are concerned with the costs of a wider programme of which adult education is only one component.

Table 7 – Summary of the findings reported in Chapter 6

|                                   | GED receipt           | Measured literacy gain | Earnings   |
|-----------------------------------|-----------------------|------------------------|------------|
| NEWWS                             | [y]                   | =                      |            |
| Alameda GAIN                      | [y]                   | [y] n.s.               | [y] n.s.   |
| Los Angeles GAIN                  | [y] n.s. (marginal)   | [y] n.s.               | [y] n.s.   |
| Riverside GAIN                    | [y] n.s.              | [x] n.s. (marginal)    | [y]        |
| San Diego GAIN                    | [y] n.s. (marginal)   | [y]                    | [y] n.s.   |
| Tulare GAIN                       | [y]                   | [x] n.s.               | [y]        |
| JTPA                              | [y]                   |                        | [y] n.s.   |
| Washington Workforce Training     |                       |                        | [y]        |
| Krueger and Rouse, manufacturing  |                       |                        | [y]        |
| Krueger and Rouse, service        |                       |                        | [x] n.s.   |
| Even Start                        | [y]                   | [y]                    |            |
| Brooks et al                      |                       | [y]                    |            |

[y] denotes a positive effect; [x] a negative effect; = no difference; n.s. differences not statistically significant.

It is clear that literacy skills and GED receipt do not necessarily go hand-in-hand: the combination of modest increases in GED receipt and a lack of significant gains in measured literacy and numeracy skills is found in many of the studies. There are a number of possible explanations for this apparent discrepancy. One is that GED receipt requires building up knowledge, and applying existing skills to specific topics, rather than any improvements in literacy or numeracy skills. Another is that people taking the literacy tests did not face the same incentives to achieve as people taking the GED. It may also be that the literacy tests used in the assessments were not sufficiently sensitive, or not appropriate for measuring the types of literacy skills learned on these training courses.

Literacy skills and earnings also do not appear to be necessarily connected. The treatment group in the Riverside GAIN programme experienced significantly higher earnings than controls over subsequent years but appeared to have lower literacy skills. Conversely, the treatment group in the San Diego GAIN programme had higher literacy skills than the controls but there was no significant difference in earnings. In the GAIN and JOBS programmes, adult education is part of a wider package of services including work experience, job search assistance and vocational training. It might be that these services, and not the basic education components, drive any subsequent gains in employment and earnings. Supporting this hypothesis is the fact that employment-focused JOBS programmes have had better employment and earnings outcomes than education-focused programmes (Hamilton et al 2001). On the other hand, it might be that it takes a long time to translate increases in literacy into increases in earnings.

The experimental studies reviewed above – the NEWWS, GAIN, JTPA and Even Start evaluations – are the most sophisticated of the studies discussed in this chapter, but their random assignment methodology means that they examine the effects of being referred to, or eligible for, a particular basic skills programme rather than actually taking part in the programme. This needs to be kept in mind when interpreting the results of these studies. Many of the people in treatment groups did not actually participate in adult education programmes, or did so only for a brief period; conversely, some of the people in control groups took part in adult training courses of their own volition. It may have been the case, in fact, that adult education did improve the literacy skills or earnings of participants, but that these improvements were too small or made by too few people to affect the treatment-control comparisons made in the studies. The evaluation of the Washington Workforce Training Study and the workplace literacy study of Krueger and Rouse (1998), on the other hand, look at the experiences of people who started basic skills programmes (and may or may not have completed them), compared to those who didn’t start. Brooks et al (2001a) goes even further, studying the literacy gains of people who stayed in a literacy programme over a period of time and didn’t drop out. This might go some way towards explaining why Brooks et al report a considerable increase in literacy after a short period of tuition, while the employment-related programmes and Even Start generally report an insignificant increase in literacy after a much longer period of tuition.

Another reason for the difference in literacy gain reported in Brooks et al and in the JOBS and GAIN programmes is that JOBS and GAIN are mandatory. The adult students in Brooks et al sought out programmes and enrolled voluntarily, and therefore exhibited a motivation to learn. Students in a welfare-to-work programme may, at least initially, be motivated to attend classes less by the desire to learn than by the desire to avoid reductions in their welfare benefits. Finally, the welfare-to-work programmes tested the literacy skills of their samples after two-to-three years, while Brooks et al tested their sample when they were still participating in a literacy programme. It would be interesting to know whether the literacy gains that Brooks et al report persist over time or decline. In fact, a good deal more about the effectiveness of adult literacy programmes in England and Wales should be known over the next few years, as the current expansion of adult literacy programmes in those countries will be accompanied by a series of evaluations.

Notes

  • [35] The authors, on the other hand, downplay these results, calling them “undramatic but worthwhile” (p. 1).