MAYBE WE'RE NOT DOING AS BADLY AS WE THOUGHT ON THE PISA RATINGS
Thought to be plummeting in the international rankings, England's system may not be doing as badly as some critics suggest, and is improving faster than many others.
A common claim, oft repeated and based on OECD findings, is that England has plummeted in the international rankings over the past decade: from fourth to 16th in science, seventh to 25th in literacy, and eighth to 28th in maths. This, as it happens, is not the complete picture.
The Programme for International Student Assessment (PISA) is run by the OECD and takes place every three years. It is a sample survey that assesses 15–16 year olds in three areas: literacy, maths and science. It is frequently referenced by our politicians, more often than not to highlight the underperformance of our education system.
A couple of points are worth mentioning on Pisa. First, quite a few academics have considerable reservations about its methodology. Second, even its supporters warn that information in its tables must be used very carefully indeed, particularly if you seek to compare performance over a number of years. The number of countries participating has varied, and the quality of information available from each country varies too: low response rates, for example, place a question mark over the validity of some findings. Pisa also measures particular aspects of education, and is more about problem-solving ability than, for example, raw knowledge.
In the latest Pisa report (2009), the UK appears to come 25th in reading; or rather, that is how it has been reported.
The UK does indeed appear in the 25th row of the relevant table, but the DfE appears not to have noticed that:
1) Chinese Taipei and Denmark are placed above the UK in the international “league table” for reading only because they come earlier in the alphabet. In fact they have exactly the same score (495);
2) Twelve other countries, nominally above England in the 2009 tables, have statistically insignificant higher scores. An NfER report makes this point explicitly: “Because of the areas of uncertainty described above, interpretations of very small differences between two sets of results are often meaningless. Were they to be measured again, it could well be that the results would turn out the other way round”;
3) Shanghai-China and Singapore are above the UK in the 2009 tables but did not take part in the 2006 survey, so the UK cannot be said to have “fallen” below them;
4) In any event, the OECD warned explicitly in its report against comparing the 2009 and 2006 Pisa results with earlier data, because the very low response rates of earlier years raised serious concerns about sample validity.
The NfER report in December 2010 concluded that “England’s [reading] performance in 2009 does not differ greatly from that in the last Pisa survey in 2006”. NfER reaches very similar conclusions for maths and science, for similar reasons, while noting that science achievement actually remains above the OECD average.
So: our performance may not be very good, we should not be in the slightest bit complacent, and of course we should be doing better and raising our sights higher. But it doesn't quite all amount to us “plummeting in the international rankings”. It is also the case that, on Pisa figures, our system is improving more quickly than those of the USA, Sweden, Canada, France, Finland and New Zealand (according to a 2012 study by Professor Eric Hanushek and others).
A report from the IPPR made this general observation on rankings: “The sampling methods of international assessments have been criticised for being too small to reliably judge a whole system's performance, and for being open to countries 'gaming' the sample by excluding pupils who are likely to perform poorly (Hormann 2009, Mortimore 2009).” Such assessments also only provide system-level data, which makes it hard to apply the lessons at a more local level. It is also the case that “country-specific factors – including the nature of curriculum, testing and teaching – can mean some pupils are better prepared for the format of international assessments than others”.
The IPPR report wants us to develop a more considered and systematic approach to using international comparisons in the English school system. And how about giving more publicity to the TIMSS findings?
IPPR report: Benchmarking the English School System Against the Best in the World, Jonathan Clifton, July 2011
Achievement Growth, Professor Eric Hanushek, Paul E. Peterson, Ludger Woessmann (Harvard University, 2012)
Other international benchmarks:
Trends in International Mathematics and Science Study (TIMSS): Run by the International Association for the Evaluation of Educational Achievement, TIMSS assesses 9–10 year olds and 13–14 year olds on their skills in both maths and science. TIMSS takes place every four years and more than 50 countries participate. It focuses on the curriculum, and as a result tends to test pupils' content knowledge rather than their ability to apply it.
Progress in International Reading Literacy Study (PIRLS): PIRLS assesses 9–10 year old pupils on their reading literacy. Using a similar design to TIMSS, it focuses on assessing their knowledge of the content of the curriculum. It takes place every five years and there are currently 35 countries participating. PIRLS is also run by the International Association for the Evaluation of Educational Achievement.