Over sixty per cent of secondary schools already have academy status, and in March, the Chancellor set out plans to turn all schools into academies by 2020, or to have plans in place to do so by 2022.
The government argues that academies drive up standards by putting more power in the hands of head teachers over pay, length of the school day and term times. As academies, schools have more freedom to innovate and can opt out of the national curriculum.
But where was the evidence on which to base such a sweeping and enforced national change in the education system?
The Education Select Committee said, in its 2014 report, that “it’s still too early to know how much the academies programme has helped raise standards”, although “there is some evidence that sponsored secondary academies have had a positive effect on pupil performance”. On the other hand, it also said, “there is no conclusive evidence of the impact of academy status on attainment in primary schools”. Then again, “however measured, the overall state of schools has improved during the course of the programme.”
It seems obvious that to demonstrate success, academies must be able to do at least as well as schools generally in enabling their poorest students to get good GCSE results and meet new attainment measures.
But evidence, as things stand, does not prove that academies raise standards overall or for disadvantaged children. This judgement, that they are not raising standards for disadvantaged pupils, is particularly damaging because it raises deeper questions over their purpose.
The Sutton Trust in its report Chain Effects found that around a third of the academy chains it examined are dramatically transforming the prospects of their disadvantaged pupils, with results well above the national average. But it also found that many others are middling or worse, and says “their performance raises important questions about how the programme is run and how it might move in the future.”
So, some academies seem to benefit disadvantaged children and some don’t. How does that compare, one wonders, with community schools? The answer is that some maintained schools do, some don’t.
Academies, we must remember, were initially introduced by Tony Blair’s Labour government, inspired by City Technology Colleges. They were specifically intended to target the most disadvantaged areas, and this is what happened (by and large) until 2010. Evidence suggests that these first-phase academies did relatively well when it came to improving pupil attainment, although we also know that a certain amount of gaming took place – not least choosing soft non-academic options for pupils to secure league table advantage.
Bill Watkin, when he was operations manager at SSAT schools network, said of this phase: “The early academies really did make a big difference to thousands of the most vulnerable young people and to many struggling communities. They brought about a transformation in attitudes and standards, they re-engaged parents and the wider community, they re-connected with employers; they provided a secure, orderly and healthy learning environment and a vibrant community hub.”
However, after 2010, and the arrival of the coalition government, there was a rapid expansion of the scheme, with attention initially directed towards converting successful schools to academies.
This switch in direction immediately made it harder to judge academy performance. How do you compare the first phase of the scheme, in which poorly performing schools were converted to academy status, with the post-Labour phase, in which many high-achieving schools did the same?
It is also instructive to examine whether academies are outperforming other comparator (maintained) schools. This is where it gets a bit complicated. If a school becomes an academy, it is simply not possible to know for sure what would have happened had it not converted and remained a community school. As Simon Burgess of CMPO pointed out in a 2010 blog post, what researchers have to do is make assumptions to produce estimates of the effect of the policy. One way is to look at what happened to close comparator schools and to assume that something similar would have happened to the academy: for obvious reasons, this is called matching.
A 2014 NFER report presents comparisons between sponsored and converter academies and groups of similar maintained schools. This, it claims, is a more robust method for analysing the association between academy status and GCSE outcomes than comparing levels of school performance or comparing trends.
The report’s analysis shows that the level of attainment progress made by pupils in sponsored and converter academies is not greater than in maintained schools with similar characteristics. In almost all analyses the difference in average GCSE outcomes is small and not statistically significant. It concludes: “It is still too early to judge the full impact of converter academy status on school performance because almost all converter academies have been open for three years or less, but this analysis shows that there are no short-term benefits in improved school performance associated with converter academy status.”
The Education Policy Institute published its own rankings this month showing how local authorities compare with multi-academy trusts which have at least five schools. The findings show that academy trusts are among the most, and least, successful at improving pupil performance, at both primary and secondary levels. In between is a spread of success and underachievement, with the analysis concluding that there is little overall difference between academy trusts and local authorities.
So the picture is mixed, and it is certainly too early to draw firm conclusions. The nature of the academy programme has changed dramatically, and there is some disappointment that it has not yet had the transformative effect expected of it, despite some areas of real excellence (Ark, Harris etc).
The government accepts that some academies are not performing as they should, and that is one of the reasons why it appointed its eight Regional Schools Commissioners (RSCs) in September 2014.
Nevertheless, the reasons why some academies are struggling may not be so hard to pin down.
The early academies had, arguably, more autonomy, active sponsors, more access to effective support networks and more funding. The scale of the initiative now means sponsors are in short supply, although some ‘good’ and ‘outstanding’ academies, as opposed to businesses, have become new sponsors of failing schools to drive improvement.
Latterly, autonomy has been reduced through more prescriptive funding agreements, regulation has been tightened and some support networks have disappeared; sponsors have all but dried up, and the extra funding that was once available is no longer forthcoming. It has become a much colder climate in which to operate.
The most successful chains tend to be those that were very picky with the schools they selected at the start and grew gradually, often avoiding the most disadvantaged rural and coastal areas. The least successful chains – judged on exam results (which may not be entirely fair) – tend to be those that took on the most deprived schools in the most disadvantaged areas. They also tend to be spread across large areas (which, by the way, they were encouraged to do) and have expanded rapidly.
Having set its course, the challenge now for the government will be to look much more closely at the academies’ programme and the balance between autonomy and accountability in order to work out how to incentivise the best chains to target the poorest areas and pupils.
The pupil premium ‘extra’ money targeted at the most deprived pupils doesn’t seem to be a sufficient incentive to offset the reputational risks of taking over the most disadvantaged schools. If you manage a chain and your results aren’t quite what the Department expected, they are all over you like a bad rash, forbidding any further expansion. The Department used to be a critical friend; now it’s more like a brooding interventionist regulator.
The safest judgment on the academies programme at present – and fortunately it could change – is that it has had mixed results: real excellence at the top but underperformance compared to similar schools in the maintained sector, at the bottom. Bit of a curate’s egg, then.
For the long term future of the academies’ programme to be deemed a success – and its potential has yet to be realised – academies must show clear blue water between themselves and maintained schools, particularly in adding value to disadvantaged pupils. But for this to happen there may have to be a better balance struck between autonomy and accountability and a re-setting of the relationship between the Department for Education and those Trusts that run academies.
The irony is that the academies scheme was launched to release schools from red tape and burdensome bureaucracy, but the perception is growing that this old bureaucracy has simply been replaced by a new one. The Department for Education, Ofsted, the Schools Commissioner and the Regional Schools Commissioners are all part of an accountability framework that allows MATs very little real freedom. It also raises a question over whether reforms are now genuinely school-led, which was the government’s original intention.
Nicky Morgan, the Education Secretary, under pressure, stepped one pace back from forcing all schools to become academies. But it may be wrong to assume that this will necessarily slow the rate of academisation substantially. Schools that are perceived as failing or underperforming may still face the prospect of being forced to become academies, while singleton academies will be encouraged to join multi-academy trusts, so the trajectory remains essentially the same. Indeed, by 2022, there could be over 700 new multi-academy trusts in operation. Unless, of course, the new Prime Minister replaces Morgan, and the new administration no longer places such a high priority on structural reforms.