League Tables Part 3: The Best Model of a Cat is a Cat

In the first post in this short series we outlined the dizzying array of changes to school league tables since they were introduced in 1992. In the second post we looked at some of the problems associated with Progress 8 – the measure that currently dominates our league tables. The thread running through both posts is that we shouldn’t expect any single measure to capture the complex picture of school performance accurately.

[Image: a cat]

Harry Fletcher-Wood once suggested to me a way around this problem: schools would be given a variety of performance measures, but each year the government would rank them on just one, such as A*-B grades in modern foreign languages. The twist in the tale of Harry’s plan is that the DfE would only announce which measure it was going to use after pupils had sat their exams. Not knowing which measure to focus on, schools would simply get on with securing the best outcomes for all pupils across all subjects.

Harry’s cunning solution addresses a problem with performance tables known as Goodhart’s Law: “when a measure becomes a target, it ceases to be a good measure”. We see this with Progress 8. The intention may be that P8 gives credit for the performance of every pupil across a wide range of subjects, but the reality is that the quickest way for a school to boost its Progress 8 score is to cram the open bucket full of vocational qualifications, particularly with a lower-attaining cohort, because low prior attainers who secure top grades in vocational qualifications deliver an even bigger boost.

If any single measure dominates, or is perceived to dominate, the attention of schools is diverted towards this measure. Just as the old 5A*-C with English and Maths measure was flawed because it led to an obsession with pupils working at the C/D borderline at the expense of other pupils, so too P8 is flawed because of the easier pickings in the open bucket, which benefit the school more than the pupil. The solution is not to dispose of Progress 8, but to see it as one performance measure amongst many.

We could wait for the DfE to do this, or we could do it ourselves. Imagine if a school, local authority, or academy trust said that it was going to strive towards the following indicators, as well as the DfE’s official measures:

  • Ensure that all pupils can read, write and multiply with fluency by the end of Year 7 (useful for a school with weak attainment on entry so that pupils can access the curriculum in the rest of their time at school)
  • Ensure that no pupil leaves school at 18 without progressing to a degree course or an apprenticeship (useful for a school with high youth unemployment in the local area)
  • Increase the number of pupils gaining more than five top grades by 50% over three years (useful for a school that hasn’t previously enabled pupils to reach our very best universities)
  • Transform the performance of boys in languages or girls in triple science (useful for schools with gender gaps in these areas)
  • Double the number of pupils taking a course that is particularly valued in the local employment market e.g. engineering (useful for schools in regions with a distinctive local economy).

None of these metrics are official government measures, but they all seem like reasonable aims, and I can’t imagine the DfE or Ofsted taking umbrage at schools that choose to prioritise outcomes that they feel will make a particular difference to their pupils. We saw in the first post that one school in Salisbury secured a good Ofsted rating despite a very low Progress 8 score because it had chosen to enter its pupils for iGCSE English even after this qualification had stopped counting in league tables. The earth did not stop spinning for this school or its pupils. So maybe the bark of the league table is worse than its bite, and we have more freedom than we think to treat performance measures as an accountability framework that sits alongside our own aims for our pupils, rather than a singular goal that we must obsessively strive towards.

At their best, league tables work by honouring the actual outcomes that actual pupils walk out with when they leave our schools. It’s another reason why the predominance of Progress 8 is not helpful: it’s an irrelevant metric for individual pupils.

At one school where I worked, it was my job to write the press release on results day for GCSEs and A Levels. In addition to the usual published figures we also compiled a list of the top 20 pupils, showing their full range of grades across all subjects. For the 18-year-olds, we added where they were going to university (GDPR wasn’t even a glint in a lawyer’s eye at this point). It struck me that this was powerful ‘data’ and would allow prospective parents to see the actual outcomes that pupils walk away with. As a parent I would be more keen to send my child to a school where performance is consistent across subjects than to a school with a spiky attainment profile, particularly if the best grades were in lower-tariff subjects. Revealing the real outcomes of pupils, whether it’s the top twenty or a typical pupil at each decile, might be too messy to be an official government measure, but it could be a useful internal metric for schools or trusts to use. It’s a reminder that any performance measure is a model of pupil achievement, but sometimes it’s best to just look at the thing itself: ‘the best model of a cat is a cat’ (paraphrasing the mathematician Norbert Wiener).
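As a rough illustration of how a school or trust might build the ‘top twenty’ and ‘typical pupil at each decile’ views for internal use, here is a minimal sketch in Python using pandas. The file name, the pupil_id column and the idea of per-subject point scores are all hypothetical assumptions for the sake of the example, not anything specified in this post; a real version would be adapted to whatever the management information system actually exports.

    import pandas as pd

    # Hypothetical export from a results system: one row per pupil, a
    # 'pupil_id' column, and one column per subject holding points (9-1).
    results = pd.read_csv("year11_results.csv")  # assumed file name
    subject_cols = [c for c in results.columns if c != "pupil_id"]

    # Rank pupils by their average points across the subjects they took.
    results["avg_points"] = results[subject_cols].mean(axis=1, skipna=True)
    ranked = results.sort_values("avg_points", ascending=False)

    # The 'top twenty' view: the full grade profile of the strongest 20 pupils.
    top_twenty = ranked.head(20)

    # The 'typical pupil at each decile' view: show the real grades of the
    # pupil sitting closest to each decile boundary, rather than an average.
    decile_points = ranked["avg_points"].quantile([i / 10 for i in range(1, 10)])
    typical_rows = []
    for q in decile_points:
        closest = (ranked["avg_points"] - q).abs().idxmin()
        typical_rows.append(ranked.loc[closest])
    typical = pd.DataFrame(typical_rows).drop_duplicates(subset="pupil_id")

    print(top_twenty[["pupil_id"] + subject_cols])
    print(typical[["pupil_id"] + subject_cols])

Showing the pupil nearest each decile, rather than a decile average, keeps the point of the exercise: these are the real profiles of real pupils, not another composite statistic.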

While we should continue to refine our performance measures, we shouldn’t expect to arrive at a perfect measure. Progress 8 was heralded as a saviour for schools with lower-attaining intakes, as indicated by a BBC article in January 2016: “Head teachers have long complained measuring success on the basis of GCSE results alone is unfair as it does not take into account the intake of the school”. Three years on, many of these same heads bemoan the fact that Progress 8 – like all other measures – doesn’t tend to favour schools with lower-attaining intakes, especially if these pupils are white British and disadvantaged.

I’m not sure we can blame league tables for exposing the under-performance of schools, regions or groups of pupils. This under-performance would exist whether or not it was reported in performance measures, and this reporting at least gives us a chance to confront some ugly truths about educational disadvantage and perhaps do something about it. But maybe it’s time for a more nuanced treatment of league tables, moving away from judging schools on one measure in a single year and towards looking at a range of measures over a series of years, as well as giving schools the opportunity to focus on the priorities that are particularly relevant to them in their local context.
