
How do you solve a problem like PISA?


Every three years the OECD produces a league table of what 15-year-olds around the world can achieve in reading, maths and science. This time, it has added a fourth dimension: problem solving. In this piece, Graham Birrell suggests the new PISA tests themselves pose more problems than they solve.

With a heavy feeling of déjà vu, here we are again with another round of introspection on PISA and the mediocre education children must therefore be receiving in the UK. Except this time, there’s a twist: maybe we’re not so awful after all.

In the latest news from PISA HQ, the United Kingdom (or, to be more precise, England, as the sample was drawn only from English schools) has scored relatively well on the new problem-solving test. A sample of nearly 1,500 15-year-olds in 137 schools scored on average 517 points, placing us well above the OECD average and 11th out of 28 participating nations (or, bearing in mind common criticisms of the PISA tests, sixth if you only include whole countries with populations over ten million).

According to PISA, our ‘students perform significantly better in problem solving, on average, than students in other countries who show similar performance in mathematics, reading and science’ and ‘this is particularly true among strong performers in mathematics.’

Looking at the results of other participants, the usual ‘Britain’s useless schools’ headlines might not be the only narrative that gets challenged. An obvious issue is that Shanghai, although still a strong performer, fares significantly worse here than on its maths, reading and science scores, particularly the first of these; and all the Chinese jurisdictions do worse than expected on ‘interactive tasks’. Sweden, once Michael Gove’s ‘country of educational choice’, confirmed its slide down the rankings; but perhaps the major standout is Poland, which, after a Premier League performance in the conventional 2012 tests had it being talked of as the next big thing, finds itself relegated back to the Championship with a score of only 481.

But just before we gloat too much, let’s bring ourselves down to earth with three important issues.

Firstly, in terms of what this might mean for UK education policy, particularly with regard to the new national curriculum, the short answer is: not much. Too much is set in stone now for these results to have any major impact. The final version of the national curriculum is published, and although problem solving is worryingly absent from most of the document (and, bizarrely, completely missing from science), it is a significant feature of the new maths programme of study, with one of its three aims being that all pupils should ‘solve problems by applying their mathematics to a variety of routine and non-routine problems with increasing sophistication’. The government has already decided on its post-PISA strategy, and little is going to change with the publication of the problem-solving strand.

Secondly, it’s crucial to remind ourselves that not only is the PISA methodology deeply flawed, but the idea of reading too much into the results is frankly laughable. As the mathematician Dr Hugh Morrison has said: ‘there are very few things you can summarise with a number and yet PISA claims to be able to capture a country’s entire education system in just three of them. It can’t be possible. It is madness.’ Yong Zhao, a respected Professor of Education at the University of Oregon, has written a series of scathing articles on the ‘illusionary’ and ‘misleading’ PISA tests, accusing them of ‘glorifying educational authoritarianism’ and ‘romanticising misery’ through their holding up of Shanghai-China in particular as the star the rest of the world must follow. I recommend them to anyone who wants to know why setting too much store by anything PISA has to say is a highly dangerous game. And we can’t have it both ways: if our weak results have ‘scant validity’, so do our good ones.

Finally, it’s worth asking whether these results actually do what they say on the tin. PISA asserts that: ‘across OECD countries, there has been a marked increase in recent decades in the share of jobs that require creative problem-solving skills…PISA’s first assessment of creative problem-solving skills shows how well-prepared students are to confront – and solve – the kinds of problems that are encountered almost daily in 21st century life.’

Looking at the sample questions, this is a highly challengeable claim. The questions are better tests of pupils’ ability to read and follow instructions than of the genuine creativity that many employers now say they want. If the practice tests are anything to go by, PISA is going to be great for encouraging people to get better at buying railway tickets and setting thermostats, but perhaps not so good at producing radical and original solutions to unpredictable and non-mechanical problems.

In terms of problem solving, the biggest problem of all is once again highlighted by these results: PISA itself. It purports to be an accurate indicator of entire countries’ education systems, and on this point Svend Kreiner, Professor of Statistics at the University of Copenhagen, has said that ‘the best we can say about PISA rankings is that they are useless’. No doubt this won’t stop politicians from using them to justify their own policies, but at the very least it should encourage us all to challenge them when they do.

A version of this article previously appeared in The Conversation.

