ARE ACADEMIES PERFORMING BETTER THAN OTHER SCHOOLS?

ACADEMY RESULTS

LSE report provided real positives for Academies programme under last Government

But concerns remain that Academy results may not be quite what they at first seem

Comment

The research, by Stephen Machin and James Vernoit of the London School of Economics, produced, as we have previously reported, some conclusions which look very positive for the Academies programme. The study found that turning a school into an academy improves its performance – and, crucially, that of neighbouring schools. The study was based on an analysis of pupil-by-pupil results of schools turned into academies under Labour between 2002 and 2009, when most of the converting institutions had low GCSE results. It isn't easy or straightforward, though, to compare Academies' performance with that of other schools. To make useful comparisons you have to compare like with like. Are you, for example, comparing GCSE with GCSE performance, or comparing GCSE with GCSE-equivalent performance?

Over the past few years, governments have looked at the GCSE (or equivalent) results of academies and compared them with those of the schools these academies replaced. On average, they have tended to find academy results improving at a faster rate than those of the predecessor schools. It can therefore plausibly be argued that the academies policy, or at least the first tranche of Academies created under the last government, has so far been a success on the available data. But there are some caveats.

Machin and Vernoit found that the quality of the intake of these Academies improved over time. In other words, the academies under study were taking in pupils with better key stage 2 results than had been achieved by pupils entering the schools that the academies replaced. It is self-evident that if you improve a school's intake, then that school's educational outcomes are likely to improve. It's not rocket science. However, the research also found that, after taking this pupil intake factor into account, the results achieved in the academies were still better than those achieved by a control group of schools. So that seems to address that particular concern.

It has also often been claimed by critics that Academies cream off the best pupils from neighbouring schools and leave in their wake underperforming sink schools with falling rolls. In that sense it would be a zero-sum game: if one school improves, its neighbours' performance will decline. However, the LSE study found that GCSE results in these neighbouring schools also improved. We already know that the zero-sum view doesn't hold much credibility as far as schools are concerned: there are many examples throughout the system where neighbouring state schools with exactly the same governance structures and intakes have huge differences in performance. Better-performing schools do not have to rely on improving their intake by poaching bright children from neighbouring schools; other factors are at play. The paper suggests that the improvement in neighbouring schools was probably the result of greater competition from an academy nearby spurring them to improve.

But what about the academic results? There have been claims, articulated among others by the respected think tank Civitas, that Academies have boosted their results artificially and have historically been shy about revealing which exams their pupils sit. The charge is that they make disproportionate use of non-GCSE qualifications, the so-called GCSE equivalents, to boost their position in league tables. Under the system in operation in recent years, other courses are counted as “equivalent” to GCSEs for league table and results purposes. This is the case for the main measure used in this study: the proportion of pupils in each school achieving five A*-C grades at GCSE or vocational equivalent, including maths and English. Some of the GCSE-equivalent courses have been given high weightings in the results formulae – worth up to four GCSEs – which means they can have a heavy influence on the overall published results. This raises, of course, a worrying question. Might not some schools be entering pupils for particular exams because it is in the school's perceived interests to do so, rather than the pupil's? Might not academies, under greater political pressure to produce results gains, simply be turning to these courses to a greater degree than other schools? Well, not according to the LSE study, which dismisses claims that Academies are in effect gaming. The study says clearly that its figures do not show the improved results at academies are the product of gains in “unconventional” subjects.
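To see how heavy that influence can be, here is a rough sketch in Python of the headline five-A*-C calculation. The weighting, subject names, grades and the function itself are purely illustrative assumptions, not figures or definitions taken from the LSE study, the official results formulae or any real school:

```python
# Illustrative sketch only: shows how one vocational award counted as "worth
# four GCSEs" can lift a pupil over the five-qualification threshold.
# The weighting and pupil record below are assumptions for demonstration.

GCSE_EQUIVALENCE = {"BTEC First Diploma": 4}  # assumed equivalence of up to four GCSEs

def counts_towards_headline(pupil):
    """True if the pupil reaches 5+ A*-C (or equivalent) including English and maths."""
    good_passes = 0
    for qualification, grade in pupil["results"]:
        if grade in ("A*", "A", "B", "C", "Pass"):  # treat a vocational Pass as A*-C equivalent
            good_passes += GCSE_EQUIVALENCE.get(qualification, 1)
    has_core = pupil["english_A*-C"] and pupil["maths_A*-C"]
    return good_passes >= 5 and has_core

pupil = {
    "results": [("GCSE English", "C"), ("GCSE Maths", "C"),
                ("BTEC First Diploma", "Pass")],  # one award, counted as four GCSEs
    "english_A*-C": True,
    "maths_A*-C": True,
}
print(counts_towards_headline(pupil))  # True: two GCSEs plus one four-GCSE equivalent
```

On these made-up numbers a pupil with just two GCSE passes and a single heavily weighted equivalent counts towards the headline measure, which is exactly the kind of effect the critics worry about.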

So that's pretty clear then, isn't it? Well, maybe not. Warwick Mansell, in his blog, contests the LSE findings on this score. He has revealed that, on the 2010 figures, “GCSE-equivalent” courses contributed far more to academies' headline results than they did at non-academy schools (although it should be noted that plenty of schools without Academy status have also been accused of gaming).

Mansell also makes a valid point, it seems, in respect of the new EBacc. Remember that the EBacc was introduced partly to flush out schools that were gaming, ie using GCSE equivalents to inflate their performance. So how come Academies rate so poorly on the new EBacc measure introduced by Gove if no gaming has been going on? Against this measure, nearly a third of academies with results to report had a score of zero per cent. The EBacc, of course, records the proportion of pupils in each school with A*-C grades in English, maths, two sciences, a language and history or geography. Furthermore, the proportion of academies with that zero score on the EBacc was twice as high as in a comparison group of schools with similar intakes.
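For readers who want to see why a school can post a healthy headline figure yet record zero per cent on this measure, here is a minimal sketch of the EBacc calculation as described above. The subject groupings, grade thresholds and pupil data are illustrative assumptions only, not the official specification or real results:

```python
# Minimal sketch of the EBacc measure as described in the paragraph above:
# the share of a school's pupils with A*-C GCSEs in English, maths, two
# sciences, a language and history or geography. Subject lists are assumed.

GOOD = {"A*", "A", "B", "C"}

def has_ebacc(grades):
    """grades: dict mapping GCSE subject name to grade."""
    sciences = sum(1 for s in ("Biology", "Chemistry", "Physics") if grades.get(s) in GOOD)
    humanity = any(grades.get(s) in GOOD for s in ("History", "Geography"))
    language = any(grades.get(s) in GOOD for s in ("French", "German", "Spanish"))
    return (grades.get("English") in GOOD and grades.get("Maths") in GOOD
            and sciences >= 2 and humanity and language)

def ebacc_rate(school):
    """Percentage of a school's pupils meeting the EBacc measure."""
    achieved = sum(1 for pupil in school if has_ebacc(pupil))
    return 100.0 * achieved / len(school)

# A school whose pupils sat no language or humanity GCSEs scores zero per cent,
# however strong its equivalent-based headline figure looks.
school = [{"English": "B", "Maths": "C", "Biology": "C", "Chemistry": "C"}] * 20
print(ebacc_rate(school))  # 0.0
```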

The LSE report covers Academies' performance under the last Government and clearly delivered some good, positive news for the programme. But the question of whether Academies have actually performed as well as at first appears has not been entirely resolved, and it hasn't been helped by the fact that until very recently Academies were not subject to the Freedom of Information Act, so getting information from at least some of them (not all) was difficult. With Civitas shortly to deliver another report covering this area, it is unlikely that the debate will go away any time soon.

One thought on “ARE ACADEMIES PERFORMING BETTER THAN OTHER SCHOOLS?”

  1. I think the issue is somewhat due to the massive double standards in Govt pronouncements. On the one hand they have damned vocational education as the tool of the gamer, and yet celebrated the results of the Chains, who happen to be the biggest users of the vocational equivalences.

    The data releases actually start to show an interesting pattern when you look for gaming, and I think this is where we must take a moment to consider.

    1. Vocational qualifications are a brilliant way to raise the aspirations and progression chances of some low- to middle-ability students. They lead to Level 3 qualifications and, for some, to university.
    2. Comprehensive schools should create curricula that allow all students to progress. So they should have academic pathways to push higher-ability students on to the Russell Group universities, as well as an appropriate curriculum down the ability range.
    3. Conscription onto high-equivalence vocational courses for all students is morally wrong. So is early entry where, once a student ‘gets’ a C, they are no longer allowed to continue with that course, no matter what their ability.
    4. The number of students completing KS4 in comparison to a school's normal cohort size needs to be looked at. ‘Losing’ a number of students to ‘educated at home’ status has a disproportionate impact on your headline figures.
