PISA AND TIMSS: DO THEY REALLY DEMONSTRATE THAT ENGLISH PUPILS' PERFORMANCE IS IN RELATIVE DECLINE?
Some questions over how data is interpreted
England's "plummeting" PISA test scores between 2000 and 2009 raise the question: is the performance of our secondary school pupils really in relative decline? John Jerrim, in a new study for the Institute of Education, seeks some answers.

The Secretary of State, Michael Gove, is keen that we compare our education system with those of other countries, and in particular with the best in the world. The 2010 Schools White Paper stated: "The truth is, at the moment we are standing still while others race past. In the most recent OECD PISA survey in 2006 we fell from 4th in the world in the 2000 survey to 14th in science, 7th to 17th in literacy, and 8th to 24th in mathematics."

The Programme for International Student Assessment (PISA) and the Trends in International Mathematics and Science Study (TIMSS) are the two main, and highly respected, cross-national studies of pupil achievement, although it should be noted that both have their academic critics. The Government tends to give prominence to PISA (run by the OECD), which claims to measure students' ability not simply to memorise facts but to apply knowledge to solve problems. Both PISA and TIMSS have been designed to study how different countries' educational systems are performing relative to one another, and how this is changing over time. These are, however, politically sensitive issues, on which different surveys can produce markedly different results. This is shown via a case study for England, where an apparent decline in PISA test performance has caused policymakers much concern. But English pupils fare differently depending on which study you choose. The results suggest that England's drop in the PISA international rankings is not replicated in TIMSS, and that this contrast may well be due to data limitations in both surveys.
Consequently, John Jerrim argues that the current coalition government should not base educational policies on the assumption that the performance of England’s secondary school pupils has declined (relative to that of its international competitors) over the past decade. He concludes:
• Both PISA and TIMSS are problematic for studying change in average test performance in England over time.
• Statements such as those made by the policymakers cited above are based upon flawed interpretations of the underlying data.
• England’s movement in the international achievement league tables neither supports nor refutes policymakers’ calls for change.
Jerrim says the messages that England's policymakers and international survey organisers need to take from his paper are as follows. First, better documentation of the issues discussed is needed, in both the national and international reports. Second, it may be possible to get a better understanding of England's comparative performance over time by linking the international achievement datasets to children's administrative records. England is somewhat unusual in having national measures of performance collected at ages 7, 11, 14 and 16, and these could potentially be exploited to investigate at least some of the issues raised. Indeed, if policymakers want to continue to make statements on this issue, then such data linkage should be strongly considered. Third, researchers using such data to investigate trends over time in England should make readers aware of the issues discussed in the paper and check the robustness of their results.

The big question surrounding international studies and the methodology they use is whether they accurately compare like with like, i.e. apples with apples. And of course they do measure slightly different things. PISA tends to measure students' ability to apply knowledge to solve problems, as we have already said, whereas TIMSS measures their ability to memorise facts (our pupils appear to do better in TIMSS). As a recent IPPR report on international benchmarking said: 'The sampling methods of international assessments have been criticised for being too small to reliably judge a whole system's performance, and for being open to countries "gaming" the sample by excluding pupils who are likely to perform poorly (Hormann 2009, Mortimore 2009)'. Such assessments also only provide system-level data, which makes it hard to apply the lessons at a more local level.
It is also the case that 'country-specific factors – including the nature of curriculum, testing and teaching – can mean some pupils are better prepared for the format of international assessments than others'.
John Jerrim, "England's 'plummeting' PISA test scores between 2000 and 2009: Is the performance of our secondary school pupils really in relative decline?", Department of Quantitative Social Science, Institute of Education, December 2011.
Jonathan Clifton, "Benchmarking the English School System Against the Best in the World", IPPR report, July 2011.