Stagnating but some encouragement in science
But many worry about the Pisa league tables while admiring the data generated
Governments around the world waited – some eagerly, but most rather anxiously – for the latest results of the PISA survey, published on 3 December.
PISA is an ambitious, expensive, large-scale attempt to measure and compare literacy in reading, mathematics and science across a large number of countries. The first PISA survey was launched in 2000, and it has since been followed up with surveys in 2003, 2006, 2009 and 2012.
Many concerns have been raised about the comparability of educational test results across countries in general, and in particular about the difficulty of producing test items that are culturally and linguistically neutral.
According to the latest PISA report (3 December), England’s performance in mathematics, science and reading has remained stable since PISA 2006. In each survey, pupils in England have performed similarly to the OECD average in mathematics and reading, and significantly better than the OECD average in science. This contrasts with a number of other countries, which have seen gains and losses. For example, Singapore, Macao-China, Estonia, Poland, the Republic of Ireland and Romania have shown significant improvements in mathematics, science and reading since 2009, whereas Finland, New Zealand, Iceland, the Slovak Republic and Sweden have shown significant declines in all three subjects over the same period. However, average scores give only part of the picture. In all three subjects, England has a relatively large gap between its lowest and highest achievers; this gap is greater than the OECD average.
PISA reports generate a wealth of data which is undeniably useful and important. Yet most of the publicity surrounding the results focuses on the league tables that seek to rank countries’ education systems, based on tests covering literacy, numeracy and science. Those who fare badly in the tables suffer what is termed ‘PISA shock’. Roughly half of the governments affected change their policies in response to the PISA results. In short, it is very influential.
Andreas Schleicher, of the OECD, claims that the world economy will not pay you for what you know, but rather for what you do with that knowledge and how you apply it. That thinking underpins the data and the whole PISA testing regime. But some academics challenge the methodology used by the OECD, claiming that the league tables are too crude to be of much use, though most concede that much of the data generated in the process can be very useful.
Professor Stephen Heppell, in this country, has been a long-term critic of its methodology. Prais (2003), Goldstein (2004), Brown et al. (2007) and Hopmann, Brinek & Retzl (2007) have also raised very specific concerns over the methodology, as have Svend Kreiner and Hugh Morrison. Kreiner’s view is that PISA officials claim either that they know about the problems, that the problems have been solved, or that their analyses show that the rankings provided by PISA are robust to the model errors. But he counters: ‘The truths of such claims are not supported by evidence in the technical reports and our results suggest that the ranking is far from robust. If they want to restore the credibility of their results, it is PISA’s obligation to produce the evidence supporting their claims.’ Morrison says ‘the OECD’s claims in respect of its PISA project have scant validity given the central dependence of these claims on the clear separability of ability from the items designed to measure that ability.’
PISA’s comparison of countries relies on plausible student scores derived from the so-called Rasch model. In short, pupils are not given identical questions, but the model seeks to iron this out by removing ‘contextual’ features. This raises the question of whether the scaling model is reliable and consistent – in layman’s terms, is PISA comparing like with like? Significant doubts have been raised in this sensitive area, with the Rasch model itself, or at least the way the OECD uses it, coming in for criticism.
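For readers unfamiliar with it, the Rasch model reduces each pupil to a single ‘ability’ number and each question to a single ‘difficulty’ number, and predicts the chance of a correct answer purely from the gap between the two. A minimal sketch of the idea (the parameter values here are illustrative, not PISA’s own):

```python
import math

def rasch_p_correct(ability, difficulty):
    """Rasch model: probability that a pupil with the given ability
    answers an item of the given difficulty correctly."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# When ability exactly matches item difficulty, the model predicts 50%:
print(rasch_p_correct(0.0, 0.0))  # → 0.5
```

The critics’ point is that this single-difficulty-per-item assumption must hold in every country and language for the cross-country scaling to be valid; if an item is relatively harder in one country than another, the comparison frays.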
We already know that the ranking system can be misleading, as more and more countries join the rating system and some high performers dip in and out. Some countries don’t take part at all. Also, statistically insignificant differences between countries’ performances have often been exaggerated in order to generate a headline. Even Schleicher has urged politicians to be cautious in using the evidence to justify policies (they tend to cherry-pick and miss important nuances in order to get their basic message across – i.e. we are failing by international standards).
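The point about insignificant differences can be made concrete. With hypothetical figures of the kind PISA publishes (a five-point gap between two countries’ mean scores, each with a standard error of three points), a rough 95% confidence interval for the true difference straddles zero, so the gap should not make a headline:

```python
# Illustrative figures only, not real PISA numbers: two countries' mean
# scores and the standard errors published alongside them.
mean_a, se_a = 495.0, 3.0
mean_b, se_b = 500.0, 3.0

diff = mean_b - mean_a                # 5-point gap in means
se_diff = (se_a**2 + se_b**2) ** 0.5  # SE of the difference (independent samples)

# Rough 95% confidence interval for the true difference:
low, high = diff - 1.96 * se_diff, diff + 1.96 * se_diff
print(low < 0.0 < high)  # True: the interval straddles zero, so the
                         # gap is not statistically significant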
John Jerrim of the IOE, who has himself raised concerns over PISA, says that criticisms implying it is useless as a benchmark are a ‘gross exaggeration’. He concedes that a number of valid points have been raised, pointing to various ways in which PISA may be improved (the need for PISA to become a panel dataset – following children throughout school – raised by Harvey Goldstein is a particularly important point, according to Jerrim). And he accepts that no data or test is perfect, particularly when it is tackling a notoriously difficult task such as cross-country comparison, and that includes PISA. But he says: ‘to suggest it cannot tell us anything important or useful is very far wide of the mark. For instance, if one were to believe that PISA did not tell us anything about children’s academic ability, then it should not correlate very highly with our own national test measures. But this is not the case.’
Cambridge University statistics professor David Spiegelhalter investigated PISA for the BBC recently. He talked to leading academics in the world of education, including Svend Kreiner in Copenhagen, Harvey Goldstein at Bristol, Oxford’s Jenny Ozga and Professor Alan Smithers of Buckingham University. His conclusion? The league tables are essentially misleading and unreliable, although the data produced by the PISA exercise is useful.
Professor Alan Smithers, looking at the maths questions, says that all of them have a picture or graph attached but do not really cover mathematical understanding in any depth. And he made the important point that the OECD cannot possibly know, for sure, that pupils doing well in a certain test is due to their school system. In the cases of Japan, Singapore and South Korea, for example, it could be due to the extra tuition pupils receive outside the school classroom. PISA doesn’t control for this private tuition.
The NFER plays down the rankings. It doesn’t like to report on rank because, it says, it has some issues with the data.
Indeed, even Andreas Schleicher believes that both the 2000 and 2003 results should not be used for comparison purposes, because of data shortcomings – although this hasn’t stopped our media and politicians from doing so.
England’s latest PISA results show we have stagnated. The education secretary was quick to argue that the results are a “judgment on the past not the present” because the 15-year-olds who sat the most recent tests had been educated for nine years of their schooling under a Labour government and only two years under the coalition. He seemed to be backed in this by Schleicher, who says that we will have to wait until the 2015 results to take a view on the effects of the reforms.
Meanwhile, Sir John Rowling, of the Performance in Excellence (PiXL) Club, says PISA tests are so politically important that pupils should specifically prepare for them. He argues that if we regard the PISA rating as serious, then we should take the tests seriously and prepare pupils properly for them. The PiXL Club is a group of some 800 schools dedicated to boosting pupils’ exam performance at A-level and GCSE. Sir John suggests that England may be losing out because other countries take the tests much more seriously and do more to ensure that pupils perform well. The former headteacher says one solution would be at least to familiarise pupils with the style of the tests.
He told BBC News that because the tests are taken by a minority of pupils, they are not taken seriously and “nobody bothers”.
“It all seems so far away it doesn’t seem to matter – but when politicians get hold of the results it matters a great deal.” But there is a counter-argument: preparing for PISA tests surely rather defeats their object, and encourages teaching to the (PISA) test. Indeed, Sam Freedman, who championed PISA as an adviser to Gove, thinks that any country specifically preparing its children for the tests should be banned from participating in PISA. But, then again, if you think that the PISA tests assess things worth assessing, and they have such political consequences, it makes sense to prepare your pupils for them.
But there is no hiding officials’ disappointment in the latest results. Professor Alison Wolf said: “We did badly last time and statistically we have done no better this time.” She continued: “It is not just about better teachers, it is also about the home environment. If you are growing up in Seoul or Shanghai, you go home from school to a family that cares desperately about education, no matter what its social standing is. British parents are simply not as aware of how important education is.”
Labour hailed a strong performance in 2000 as a triumphant vindication of its education policies, including the multi-billion-pound literacy and numeracy strategies (which ended in 2010). But this perceived success was short-lived, despite impressive levels of investment.
Britain’s position has worsened. This government’s reforms have, in their first phase, concentrated on structural changes which, by themselves, were never going to improve PISA ratings, even over the longer term. Reforms to the curriculum and assessment, and raising the quality of teaching (through, for example, better selection, training and CPD), could have an impact when combined with structural reforms. But it will take several more years for us to see the effects.
We probably can’t rank students across countries very accurately (the language and cultural issues alone raise big challenges), as if all had sat identical questions at the same time, in an identical context. Because they haven’t. And the Rasch model is not unchallenged.
And it’s odd that the media simply ignore the alternative TIMSS study and its results, which seek to measure ‘factual’ knowledge among students in different systems. Both the UK and the US consistently do better in TIMSS than in PISA. (The US stagnated in PISA too.)
There are no grounds for complacency on our PISA performance. Stagnation is not good and raises many questions on the policy front. But there is no reason to panic either. It’s worth noting that some of those whom we have most admired in the past, including Finland, are actually in long-term decline, according to PISA. We might be in a better place come 2015, but a number of countries new to PISA are improving rapidly and could overtake us.