LSE report provided real positives for Academies programme under last Government
But concerns remain that academy results may not be quite what they at first seem
The research, by Stephen Machin and James Vernoit of the London School of Economics, which we have previously reported on, produced some conclusions that look very positive for the Academies programme. The study found that turning a school into an academy improves its performance – and, crucially, that of neighbouring schools. It was based on an analysis of pupil-by-pupil results at schools turned into academies under Labour, in the years 2002-9, when most of the institutions converting had low GCSE results. It isn't easy or straightforward, though, to compare academies' performance with that of other schools. To make useful comparisons you have to compare like with like. Are you, for example, comparing GCSE with GCSE performance, or comparing GCSE with GCSE-equivalent performance?
Over the past few years, governments have looked at the GCSE (or equivalent) results of academies and compared them to those of the schools these academies replaced. On average, they have tended to find academy results improving at a faster rate than those of the predecessor schools. It can therefore plausibly be argued that the academies policy – or at least the first tranche of academies under the last government – has, so far at least on the available data, been a success. But there are some caveats.
Machin and Vernoit found that the quality of the intake of these academies improved over time. In other words, the academies under study were taking in pupils with better Key Stage 2 results than had been achieved by pupils entering the schools that the academies replaced. It is self-evident that if you improve a school's intake, then that school's educational outcomes are likely to improve. It's not rocket science. However, the research also found that, after taking this intake factor into account, the results achieved in the academies were still better than those achieved by a control group of schools. So that seems to address that particular concern.
It has also often been claimed by critics that academies cream off the best pupils from neighbouring schools and leave in their wake underperforming sink schools with falling rolls. In that sense it would be a zero-sum game: if one school improves, its neighbours' performance will decline. However, the LSE study found that results at GCSE in these neighbouring schools also improved. We already know that the zero-sum view doesn't hold much credibility as far as schools are concerned: there are many examples throughout the system of neighbouring state schools with exactly the same governance structures and intakes showing huge differences in performance. Better-performing schools do not have to rely on improving their intake by poaching bright children from neighbouring schools. Other factors are at play. The paper suggests that this improvement in other schools was probably the result of greater competition from a nearby academy spurring the neighbouring schools to raise their game.
But what about the academic results? There have been claims, articulated among others by the respected think tank Civitas, that academies have boosted their results artificially and have historically been shy about revealing which exams their pupils sit. The charge is that they make disproportionate use of non-GCSE qualifications, or so-called GCSE equivalents, to boost their position in league tables. Under the system in operation in recent years, other courses are counted as "equivalent" to GCSEs for league table and results purposes. This is the case for the main measure used in this study: the proportion of pupils in each school achieving five A*-C grades at GCSE or vocational equivalent, including maths and English. Some of the GCSE-equivalent courses have been given high weightings in the results formulae – worth up to four GCSEs – which means they can have a heavy influence on the overall published results. This raises, of course, a worrying question. Might some schools be entering pupils for particular exams because it is in the school's perceived interests to do so, rather than the pupil's? Might academies, under greater political pressure to produce results gains, simply be turning to these courses to a greater degree than other schools? Not according to the LSE study, which dismisses claims that academies are in effect gaming the system. The study says clearly that its figures do not show the improved results at academies to be the product of gains in "unconventional" subjects.
So that's pretty clear then, isn't it? Well, maybe not. Warwick Mansell, in his blog, contests the LSE findings on this score. He has revealed that on the 2010 figures, "GCSE-equivalent" courses contributed far more to academies' headline results than they did at non-academy schools (although it should be noted that plenty of schools without academy status have also been accused of gaming).
Mansell also makes a valid point, it seems, in respect of the new EBacc. Remember, the EBacc was introduced partly to flush out schools that were gaming – that is, using GCSE equivalents to inflate their performance. So how come academies rate so poorly on the new EBacc measure introduced by Gove if there has been no gaming going on? Against this measure, nearly a third of academies with results to report had a score of zero per cent on the EBacc. The EBacc, of course, records the proportion of pupils in each school with A*-Cs in English, maths, two sciences, a language and history or geography. Furthermore, the proportion of academies with a zero score on the EBacc was twice as high as in a comparison group of schools with similar intakes.
The LSE report covers academies' performance under the last Government and clearly delivered some good, positive news for the programme. But the question of whether academies have actually performed as well as at first appears has not been entirely resolved, and hasn't been helped by the fact that until very recently academies were not subject to the Freedom of Information Act, so getting information from at least some of them (not all) was difficult. With Civitas shortly to deliver another report covering this area, it is unlikely that the debate will go away any time soon.