LSE report provided real positives for Academies programme under last Government
But concerns remain that academic results may not be all they at first seem
The research, by Stephen Machin and James Vernoit of the London School of Economics, produced, as we have reported, some conclusions which look very positive for the Academies programme. Their study found that “turning a school into an academy improves its performance – and that of neighbouring schools”. The study was based on an analysis of pupil-by-pupil results at schools turned into academies under Labour, between 2002 and 2009, when most of the converting institutions had low GCSE results. Their report includes a caveat that it does not relate to academies which have converted since the coalition came to power.

It isn’t straightforward, though, to compare academies’ performance with that of other schools. Over the past few years, governments have looked at the GCSE (or equivalent) results of academies and compared them to those of the schools these academies replaced. On average, the government asserts, academy results are improving at a faster rate than those of the predecessor schools. NAO reports certainly seem to bear this out. Therefore, it can be argued that the academies policy has so far been a success. It probably has been, but we have to be careful about how the data is used and ensure we compare like with like. Warwick Mansell has blogged on this issue and encourages us to drill down into the data (he is rather good at it, by the way) before we reach any firm conclusions.

Machin and Vernoit found that the quality of the intake of these academies has improved over time. In other words, the academies under study were taking in pupils with better key stage 2 results than had been achieved by pupils entering the schools the academies replaced. Generally speaking, if you improve a school’s intake, the educational outcomes improve. It’s not rocket science. However, the research also found that, after taking this pupil intake factor into account, the results achieved in the academies were still better than those achieved by a control group of schools.
It has often been claimed by critics that academies cream off the best pupils from neighbouring schools, leaving in their wake underperforming sink schools with falling rolls. But the LSE study found that GCSE results in these neighbouring schools also improved. The paper suggests this was probably the result of greater competition from a nearby academy spurring improvement by the neighbouring schools.
But what about the academic results? There have been claims, articulated by the respected think tank Civitas among others, that academies have boosted their results artificially and have historically been shy about letting us know which exams their pupils sit. The charge is that they disproportionately use non-GCSE qualifications. Under the system in operation in recent years, other courses are counted as “equivalent” to GCSEs for league table and results purposes. This is the case for the main measure used in this study: the proportion of pupils in each school achieving five A*-C grades at GCSE or vocational equivalent, including maths and English. Some of the GCSE-equivalent courses have been given high weightings in the results formulae – worth up to four GCSEs – which means they can have a heavy influence on the overall published results. Schools encouraging high numbers of pupils to take these courses – whether because of their own need to boost results, because of students’ needs, or a bit of both – are therefore likely to see their results improve as a consequence. Might not academies, then, under greater pressure to produce results gains, simply be turning to these courses to a greater degree than other schools? Not according to the LSE study, which dismisses claims that academies are in effect gaming the system. Its figures, it says, do not show that the improved results at academies are the product of gains in “unconventional” subjects.
Warwick Mansell, though, contests the LSE findings on this score. He has revealed that in the 2010 figures, “GCSE-equivalent” courses contributed far more to academies’ headline results than they did at non-academy schools (although it should be noted that plenty of schools without academy status have also been accused of gaming). Mansell also makes a valid point, it seems, in respect of the new EBacc. Remember, the EBacc was introduced partly to flush out schools that were gaming, i.e. using GCSE equivalents to inflate their performance. So how come academies rate so poorly on the new EBacc measure introduced by Gove if no gaming has been going on? Against this measure, nearly a third of academies with results to report had a score of zero per cent on the EBacc, which records the proportion of pupils in each school with A*-Cs in English, maths, two sciences, a language and history or geography. Furthermore, the proportion of academies with that zero score was twice as high as in a comparison group of schools with similar intakes.

The LSE report covers academies’ performance under the last government and clearly delivered some positive news for the programme. But the question of whether academies have actually performed as well as it first seems has not been entirely resolved, and hasn’t been helped by the fact that until recently academies were not subject to the Freedom of Information Act. With Civitas shortly to deliver another report covering this area, it is unlikely that the debate will go away any time soon.