League Tables Part 3: The Best Model of a Cat is a Cat

In the first post in this short series we outlined the dizzying array of changes to school league tables since they were introduced in 1992. In the second post we looked at some of the problems associated with Progress 8 – the measure that currently dominates our league tables. What both of these posts have in common is that we shouldn’t expect any single measure to accurately capture the complex picture of school performance.


Harry Fletcher-Wood once suggested to me a way around this problem with the idea that schools are given a variety of performance measures but each year the government chooses to rank schools on just one of these, such as A*-B grades in modern foreign languages. The twist in the tale of Harry’s plan is that the DFE would only announce which measure it was going to use after pupils had sat their exams. Not knowing which measure to focus on, schools would simply get on with securing the best outcomes for all pupils across all subjects.

Harry’s cunning solution addresses the problem with performance tables known as Goodhart’s Law: “when a measure becomes a target, it ceases to be a good measure”. We see this with Progress 8. The intention might be that P8 gives credit for the performance of every pupil across a wide range of subjects, but the reality is that the quickest way for a school to boost its Progress 8 is to cram the open bucket full of vocational qualifications, particularly if it has a lower attaining cohort (the boost is even bigger when low prior attainers secure top grades in vocational qualifications).

If any single measure dominates, or is perceived to dominate, the attention of schools is diverted towards this measure. Just as the old 5A*-C with English and Maths measure was flawed because it led to an obsession with pupils working at the C/D borderline at the expense of other pupils, so too P8 is flawed because of the easier pickings in the open bucket which benefit the school more than the pupil. The solution is not to dispose of Progress 8, but to see it as one performance measure amongst many.

We could wait for the DFE to do this, or we could do this ourselves. Imagine if a school, local authority, or academy trust said that it was going to strive towards the following indicators, as well as the DFE’s official measures:

  • Ensure that all pupils can read, write and multiply with fluency by the end of Year 7 (useful for a school with weak attainment on entry so that pupils can access the curriculum in the rest of their time at school)
  • Ensure that no pupils leave school at 18 without a place on a degree course or an apprenticeship (useful for a school with high youth unemployment in the local area)
  • Increase the number of pupils gaining more than five top grades by 50% over three years (useful for a school that hasn’t previously enabled pupils to reach our very best universities)
  • Transform the performance of boys in languages or girls in triple science (useful for schools with gender gaps in these areas)
  • Double the number of pupils taking a course that is particularly valued in the local employment market e.g. engineering (useful for schools in regions with a distinctive local economy).

None of these metrics are official government measures, but they all seem like reasonable aims, and I can’t imagine the DFE or Ofsted taking umbrage at schools that choose to prioritise outcomes that they feel will make a particular difference to their pupils. We saw in the first post that one school in Salisbury secured a good Ofsted rating despite a very low Progress 8 score because it had chosen to enter its pupils for iGCSE English even after this qualification had stopped counting in league tables. The earth did not stop spinning for this school or its pupils. So maybe the bark of the league table is worse than its bite, and we have more freedom than we think to treat performance measures as an accountability framework that sits alongside our own aims for our pupils, rather than a singular goal that we must obsessively strive towards.

At their best, league tables work by honouring the actual outcomes that actual pupils walk out with when they leave our schools. It’s another reason why the predominance of Progress 8 is not helpful: it’s an irrelevant metric for individual pupils. At one school where I worked, it was my job to write the press release on results day for GCSEs and A Levels. In addition to the usual published figures we also compiled a list of the top 20 pupils, showing their full range of grades across all subjects. For the 18 year olds, we added where they were going to university (GDPR wasn’t even a glint in a lawyer’s eye at this point). It struck me that this was powerful ‘data’ and would allow prospective parents to see the actual outcomes that pupils walk away with. As a parent I would be more keen to send my child to a school where performance is consistent across subjects, rather than a school with a spiky attainment profile, particularly if the best grades were in lower tariff subjects. Revealing the real outcomes of pupils, whether it’s the top twenty or a typical pupil at each decile, might be too messy to be an official government measure, but it could be a useful internal metric for schools or trusts to use. It’s a reminder that any performance measure is a model of pupil achievement, but sometimes it’s best to just look at the thing itself: ‘the best model of a cat is a cat’ (paraphrasing the mathematician Norbert Wiener).

While we should continue to refine our performance measures, we shouldn’t expect to arrive at a perfect measure. Progress 8 was heralded as a saviour for schools with lower attaining intakes, as indicated by a BBC article in January 2016: “Head teachers have long complained measuring success on the basis of GCSE results alone is unfair as it does not take into account the intake of the school”. Three years on, many of these same heads bemoan the fact that Progress 8 – like all other measures – doesn’t tend to favour schools with lower attaining intakes, especially if these pupils are white British and disadvantaged.

I’m not sure we can blame league tables for exposing the under-performance of schools, regions or groups of pupils. This under-performance would exist whether or not it was reported in performance measures, and this reporting at least gives us a chance to confront some ugly truths about educational disadvantage and perhaps do something about it. But maybe it’s time for a more nuanced treatment of league tables, moving away from judging schools on one measure in a single year and towards looking at a range of measures over a series of years, as well as giving schools the opportunity to focus on the priorities that are particularly relevant to them in their local context.

League Tables Part 2: Progress 8

Search ‘never skip leg day’ and you’ll find a barrage of graphic warnings on the risk of working out your arms and upper body while ignoring your lower half. I wonder if Progress 8 has led to some schools ignoring ‘leg day’ – a solid academic core – in favour of superficial gains in the open bucket.


When Progress 8 was launched I remember thinking that this was the performance measure to end all performance measures – at last schools would be credited for the achievement of every pupil in every subject at every grade. No longer would schools limit their attention to the ‘key marginal’ pupils at the C/D borderline, writing off in the process those pupils deemed incapable of securing a pass and ignoring higher attaining pupils who might be able to gain exceptional grades with a bit of a push.

Such naivety!

I still think that Progress 8 has its place in the performance measure mix, but I’m not sure that the predominance it has gained is justified, primarily because it’s too easily skewed by tactical behaviour in the open bucket and too far removed from the actual grades that pupils walk out with when they leave our schools.

First, is it right to say that Progress 8 has become the predominant measure of school improvement? I think so:

  • The DFE’s floor standard for secondary schools is based on Progress 8 (a score below -0.5)
  • The DFE’s Compare School Performance service gives a higher profile to Progress 8 than to other measures: if you enter the name of a school, the first data you’ll see is that school’s Progress 8 score alongside a comment such as ‘well above average’, and if you create a comparison list of several schools, or compare all schools, the list is presented against Progress 8 scores by default
  • The DFE’s Multi Academy Trust performance tables begin by comparing Progress 8 between trusts.
  • Since its inception Progress 8 has topped the list of performance measures for secondary schools in DFE communication*

Separately, Ofsted’s Section 5 handbook explicitly states a preference for progress over attainment: ‘In judging achievement, inspectors will give most weight to pupils’ progress.’ It’s very clear from the rest of this paragraph that Ofsted is referring here to progress with a small ‘p’ rather than ‘Progress 8’ (e.g. the paragraph goes on to say that ‘inspectors will consider the progress of pupils in all year groups, not just those who have taken or are about to take examinations or national tests’) but with the DFE pushing its progress measure and Ofsted reaffirming its interest in progress, we can see why schools have reached the conclusion that Progress 8 is the performance measure that matters most.

When Progress 8 was introduced in 2015 it was – from my memory at least – schools with lower attaining intakes which welcomed its arrival most enthusiastically. At last, the key measure on which they would be judged would recognise pupils’ varying starting points and credit schools only for the progress made under their care.

It hasn’t quite turned out like that. In 2017 the average Progress 8 score for selective schools was 0.45 – well above average – while those schools languishing at the lower end of the P8 tables tend to be schools that also struggled under the previous attainment measures. If my Twitter feed is anything to go by, it’s now heads of schools with lower attaining white British intakes who feel most aggrieved by Progress 8, arguing that the playing field remains uneven.

So why have some people lost faith in Progress 8 so soon? Let’s take a few key issues:

  • The Open Bucket – Despite the removal of ECDL (see previous post) there are still easier pickings to be found in the open bucket which can artificially inflate a school’s P8 score if it enters large numbers of pupils (thereby deflating the P8 scores of schools that don’t). Whether it’s the LIBF Certificate in Personal Finance or the TLM Level 2 Certificate for IT User Skills in Open Systems and Enterprise, these qualifications benefit schools more than pupils, undermining our faith in performance tables in the process.
  • The EAL effect – One striking feature of schools with exceptionally high P8 figures is that several of them have a high proportion of EAL pupils. If we go through the top 12 schools by P8 in 2018 then check the number of EAL pupils in their 2017 Y11 cohort (2018 EAL information is not yet available) we see EAL figures of 117 from a cohort of 119; 169 from 209; 17 from 111; 0 from 25; 105 from 118; 3 from 35; 68 from 75; 71 from 75; and 7 from 147 (data unavailable for three of the 12 schools because they didn’t have Y11 cohorts in 2017). The average here is 61% EAL. Switch the digits over and we get the national average of 16%. Perhaps the key point here is less about EAL and more that KS2 outcomes are not a great indicator of KS4 potential, particularly if performance at KS2 has been held back because pupils haven’t yet gained a secure grasp of English. Schools with lots of EAL pupils are potentially on to a winner: once their grasp of English is secure they motor ahead of first language English speakers. This is brilliant news, but it’s unclear why the schools they happen to go to should be credited for this.
  • KS2 results – Just a hunch, but I wonder if disadvantaged pupils are more likely to have inflated SATs results compared to non-disadvantaged pupils. My thinking here is that disadvantaged pupils are more likely to attend primary schools at risk of poor KS2 performance, so these schools are likely to devote more time to preparing for SATs rather than just teaching the normal curriculum. If this hypothesis holds water, then secondary schools that recruit a high proportion of disadvantaged pupils are recruiting pupils whose real attainment is weaker than their SATs results would suggest, making it difficult for these secondaries to secure higher rates of progress. Even if this is a bit far-fetched, we can probably agree with Dr Becky Allen that ‘Key stage 2 test score is quite a noisy measure of a child’s educational attainment at age 11’ and that KS2 results depend on a wide range of factors – the quality of teaching, the amount of preparation for KS2 tests, the way the tests are administered and so on – so are fairly limited as indicators of a child’s ability at the age of 11, let alone their potential ability 5 years later. Given that one side of the Progress 8 ledger is based on KS2 outcomes we should probably avoid using P8 to make sweeping judgements about schools.
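The 61% figure quoted in the EAL bullet above can be checked by pooling the quoted cohort numbers – a quick back-of-the-envelope calculation, not official DFE methodology:

```python
# (EAL pupils, Y11 cohort size) for the nine top-P8 schools with
# 2017 data, as quoted above
cohorts = [(117, 119), (169, 209), (17, 111), (0, 25), (105, 118),
           (3, 35), (68, 75), (71, 75), (7, 147)]

eal_total = sum(eal for eal, _ in cohorts)        # 557 EAL pupils
cohort_total = sum(size for _, size in cohorts)   # 914 pupils in total
pooled_pct = round(100 * eal_total / cohort_total)

print(pooled_pct)  # 61 – against a national average of around 16%
```

(Pooling the cohorts, rather than averaging the nine schools’ percentages, weights each pupil equally; a simple mean of the percentages would give a lower figure because the low-EAL schools include some large cohorts.)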

It’s the first point above – the problem with the open bucket – that troubles me the most. The DFE’s Compare School Performance service reveals the P8 score of each element (for 2017 at least, not yet for 2018), and it’s not unusual to find schools with modest P8 figures for English, Maths and EBacc, and then stunning P8 figures for the open bucket. We can assume with some confidence that this is because these schools are entering the whole cohort for two or even three vocational subjects, and then securing excellent outcomes in them. (The open bucket also includes subjects like RE, the English GCSE not already counted, and Art, Music and Drama, but it doesn’t really stand to reason that a school performs massively better in these subjects than in English, Maths, Science, languages and humanities – hence my confidence in suggesting that a disproportionately high open bucket P8 figure is likely to be based on large numbers of pupils taking vocational subjects.)

If you’ve got a lower attaining cohort who all secure top grades in three vocational qualifications (in 2017 this would often have been ECDL, Business and Sport) then your P8 score would go through the roof. We’ve then got a strange situation where lower attaining pupils might only be making average progress in English or Maths – therefore leaving school without good grades in these subjects (because they started with low attainment and only made average progress) – yet the school’s P8 figure would indicate that the school is excelling. The school is rewarded for skipping leg day.
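To get a feel for the leverage the open bucket offers, here’s a deliberately simplified sketch of the Attainment 8 slot structure that sits underneath Progress 8. This is illustrative only: the real Progress 8 calculation compares each pupil’s Attainment 8 against the national average for pupils with the same KS2 starting point, and the 8.5 points assumed here for a starred distinction follows the recent points mapping but should be treated as an assumption.

```python
def attainment8(english, maths, ebacc, open_bucket):
    """Simplified Attainment 8: English and Maths are double-weighted,
    then the best three EBacc slots and best three open slots count."""
    best3 = lambda grades: sum(sorted(grades, reverse=True)[:3])
    return 2 * english + 2 * maths + best3(ebacc) + best3(open_bucket)

# A pupil with grade 4s across the board:
core_only = attainment8(4, 4, [4, 4, 4], [4, 4, 4])        # 40 points

# The same academic core, but with three starred distinctions in
# vocational qualifications filling the open bucket (assumed 8.5
# points each):
voc_boost = attainment8(4, 4, [4, 4, 4], [8.5, 8.5, 8.5])  # 53.5 points

print(voc_boost - core_only)  # 13.5 extra points from the open bucket alone
```

The pupil’s grades in English and Maths haven’t moved at all, yet the school’s headline figure jumps by a third – which is exactly the skew described above.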

The first rule of league tables should surely be that they credit schools for the things that matter for pupils, thereby aligning the interests of the school and the interests of the child. The fact that Progress 8 rewards tactical behaviour in the open bucket is therefore hugely damaging for its integrity as a measure of school performance, and represents a dangerous drift away from the things that matter to pupils.

So despite the best of intentions in creating a measure that values every student, every subject and every grade, and recognising schools that add value from lower starting points, Progress 8 has not been the ‘measure to end all measures’ that I naively welcomed back in 2015. I think it still deserves its place in the mix, but I’m not sure it’s worthy of the first among equals status that it seems to hold at the moment. It would surely be better for school leaders to focus their energy on outcomes that matter to pupils – such as decent grades in English and Maths – rather than the school’s Progress 8 figure.

A simple solution to this is to remove the open bucket to create a ‘Progress 5’. We don’t need the DFE to do this for us – it’s something we do at United Learning to shine a light on the ‘core stability’ of schools, with the view that schools with strong performance in the academic core have a more secure basis for future success than schools relying on their open bucket.

The point of this post isn’t to attack schools that skip leg day, schools where the academic core lags behind the open bucket. It’s difficult for outsiders to know the challenges that a school might face in recruiting a stable team of English, Maths and Science teachers, for example, and I understand the drive to gain momentum at the beginning of a school improvement process. The open bucket offers quick wins to restore league table pride while longer term gains are made at the core of the curriculum.

Instead, the point of this post is to reflect on the way we rank and judge schools, and to think twice before heaping praise on schools that have secured a stunning Progress 8 without gaining decent grades in the academic core. These pupils might have starred distinctions in ECDL, Business BTEC and Sport BTEC, but unless they’ve also got 5s in English and Maths, they’ll struggle to get on to A Levels and degrees.

Where does this leave Progress 8? I’ve tried to argue that the two bits of data on which P8 is based – KS2 performance and performance across 8 subjects at KS4 – are both flawed, the former because KS2 performance depends a lot on how seriously primary schools take SATs and the latter because the P8 figure is easily skewed by the open bucket. As such, I don’t think Progress 8 is worthy of the status it seems to hold as the first among equals of our performance measures.

One of the criticisms of P8 that I considered but didn’t include here was the argument that a school’s P8 figure is easily distorted by a small number of ‘outliers’ – often pupils who have left school with no qualifications. Such pupils didn’t particularly affect a school’s figures under the old threshold attainment measures, such as 5A*-C with English and Maths, because although these pupils would be marked as failing on these measures, their impact on the school was no greater than ‘near miss’ pupils. Under Progress 8, pupils leaving school with nothing have a significant impact on a school’s league table performance.

I actually think this is a huge strength of the P8 measure. By incorporating the outcomes of all pupils in all subjects we’ve finally got a measure which encourages schools and the system as a whole to grapple with the ugly truth that thousands of pupils leave our schools each year with nothing to show for their education. The current and long-overdue discussions about off-rolling and exclusions are perhaps an unintended consequence of Progress 8, and might just represent Progress 8’s most important contribution to our collection of measures.

*KS4 performance measures, as stated by DFE in 2018

  • Progress 8
  • Attainment 8
  • EBacc Average Point Score
  • the percentage of pupils entering the EBacc
  • the percentage of pupils achieving a grade 5 or above in English and maths
  • the percentage of students staying in education or employment after key stage 4 (destinations).

League Tables Part 1: Cat & Mouse

League tables divide opinion. For some they support our core purpose of securing the best outcomes for our pupils, providing in the process the transparency, accountability and feedback that all organisations need to sustain improvement – sunlight is the best disinfectant, and all that. But for their critics, league tables say little about what really matters in schools; not only do they fail to capture the complexity of life in our classrooms, they distort our behaviour and encourage teachers and leaders to make decisions based on what looks best in league tables rather than what’s best for our pupils.

Over the following posts I want to see how league tables have evolved since they were introduced in 1992, before taking a detailed look at Progress 8, which has become the headline figure for secondary schools. We’ll finish by suggesting how schools might respond to the ongoing flux of league tables.

League tables have long been one of two ‘eyes’ of accountability, with the other being Ofsted. With signs that Ofsted want to focus more on the inputs of education than the outputs, it’s all the more important that the gaze of league tables is fixed on the things that matter.

First, a quick point on language. ‘League tables’ is a bit of a misnomer. What we’re really referring to is the DfE’s Compare School Performance service which provides open access to a wide range of measures from any state school in the country. When we search for a particular school we see key measures (see example of Paddington Academy below) and then more specific measures for that school, and we can compare a group of schools, or all schools, by any measure we want.

[Image: example Compare School Performance entry for Paddington Academy]

Part 1: Cat and Mouse

Given that league tables were introduced to provide transparency and to invite public scrutiny, it’s fitting that all league tables published since their inception in 1992 are available online. Naturally I started my search at the school I attended – Cantell School in Southampton – where in 1992, the year before I joined, 32% of pupils gained 5 A*-C grades in their GCSEs (not necessarily including English and Maths). The national average that year was 38.3%, and once again this was 5 A*-C grades in any subject – it wasn’t until 2006 that English and Maths had to be included. By the time this headline measure was phased out in 2015, 64.9% of pupils left school with this basic benchmark. Our profession is good at self-flagellation, and a week rarely goes by without another article bemoaning the state of our schools, but things were a lot worse just a generation ago. We can of course debate whether league tables have caused this improvement, or simply revealed it.*

The most striking finding from a year-by-year check of league tables is how much they’ve changed over time. I’ve identified some key changes here (all quotations below are from DFE’s guidance document accompanying each year’s league tables):

  • 1992-1996: 4 simple measures captured in league tables: %5A-C, %1A-C, %5A-G and %1A-G
  • 1997: GNVQs combined with GCSEs for the first time this year on the basis of ‘broad equivalencies’ e.g. intermediate GNVQ equivalent to 4 GCSEs at A*-C.
  • 1998: the GCSE/GNVQ average point score per 15 year old is introduced. “This provides a fuller picture of the GCSE and GNVQ achievements of pupils of all abilities. The average point score is calculated by dividing the total GCSE/GNVQ points achieved by all 15 year olds by the number of 15 year olds”.
  • 2002:
    • Introduction of KS2-KS3 value added measure and KS3 to KS4 value added measure
    • Average capped point score based on pupils’ best eight results introduced to deter schools from entering pupils for an excessive number of qualifications
  • 2004: Introduction of KS2-KS4 value-added measure.
  • 2006:
    • Contextual Value Added (CVA) introduced: “CVA takes into account the varying starting points of each pupil’s KS2 test results, and also adjusts for factors which are outside a school’s control (such as gender, mobility and levels of deprivation) that have been observed to impact on pupils results.”
    • 5A*-C including English and Maths included for the first time. This had a significant impact on the league table position of some schools, as reported by this BBC article: “The effects on some schools have been dramatic. One school went from a score of 82% passing the equivalent of five A*-Cs to just 16% when maths and English were included.”
  • 2009: Progress figure for English & Maths included for first time
  • 2010: “The percentage of pupils who have met the new English Baccalaureate requirements reported for the first time this year”
  • 2011:
    • CVA measure scrapped, replaced by VA measure (best 8 with bonus for E&M), including separate VA measure for each EBacc subject
    • Figures for high, middle and low attainers introduced for first time
    • Headline figures published with and without equivalences
    • In-school gaps published for first time, revealing the gap between the GCSE outcomes of each school’s FSM pupils and its non-FSM pupils
    • iGCSEs included
  • 2012:
    • Pupil premium reported (in place of the gap analysis above)
    • Destination measures introduced
    • Gender breakdown introduced
  • 2013:
    • Progress gaps now revealed alongside attainment gaps
    • ‘Similar schools’ table introduced
  • 2014:
    • No early entry – only first entry counts for EBacc subjects
    • Wolf reforms led to the “removal of around 3,000 qualifications from performance measures; adjustment of the point scores of non-GCSEs and the restriction on the number of non-GCSE qualifications that count to two per pupil”.
  • 2015:
    • Last year of 5A*-C with English and Maths
    • Early entry policy now counts for all subject areas
    • Progress 8 score published for schools that opted in
  • 2016: Progress 8 introduced for all schools, so the headline measures are now:
    • Progress 8
    • Attainment 8
    • English and Maths at C+
    • EBacc (entering and achieving)
    • Staying in education
  • 2017:
    • iGCSEs no longer count
    • 1-9 grades replace A*-G in English and Maths
  • 2018:
    • ECDL no longer counts
    • New measure for EBacc (average points score)
    • 1-9 grades replace A*-G in most subjects

This dizzying array of changes (and there are plenty more I didn’t include here) reveals one of the challenges of league tables: they don’t tend to remain stable for long enough to guide school improvement in a meaningful and sustainable way. Given that league tables are a retrospective check on the outcome of 5 years of schooling, you would think that stability would be built into their design so that schools can gradually work towards known metrics. Instead, the frequency of reform has left some schools lurching from one change to another.

The extent of this lurching is not insignificant. In 2016 – the final year that iGCSEs were included in league tables – there were more than 300,000 entries for iGCSE. By 2017 the figure had fallen to 110,000, creating one of the more unusual examples of our divided school system as independent schools retained the iGCSE while state schools took flight. Independent schools, of course, do not feature in league tables.

The rise and fall of the BCS Level 2 European Computer Driving Licence Certificate in IT Application Skills (ECDL) is another example of the influence of league tables on schools’ behaviour. ECDL entries increased from 26,000 in June 2015 to 117,000 the following year, and rose again in 2017 as the ECDL was promoted as an accessible qualification that can be quickly delivered to boost the open bucket of Progress 8. Figures for 2018 are not yet released but between January and March 2018 there were just 2,800 entries for ECDL compared to 37,650 in the same quarter in 2017. Did schools suddenly decide that their pupils no longer needed this certificate in computer skills? Of course not – this qualification fell out of favour the moment it ceased to count in league tables.

As I was searching for the entry figures above I found these questions on web forums for parents and students:

  • Mumsnet, March 2016: “Is ECDL worth having? I’m sure it is in its own right. But is it equivalent to taking Physics, History, Art or French GCSE?”
  • Student Room, February 2017: “I have asked many people on whether or not ECDL counts as a GCSE and they all give different answers. Does anybody with the ECDL qualification know? Say you get 5 GCSE’s and ECDL, would a job/uni/sixthform count that as 6 GCSEs? Thank you”

“My school calls it a GCSE” is the depressing reply of two respondents to the question posed by a student.

Since the ECDL has been discounted, some school leaders have searched for the next quick win, which no doubt will itself be discounted in time. And thus continues the bizarre dance of cat and mouse that has played out between the DFE and schools since league tables were introduced. Take this example from an IPPR report in 2003:

“The Government’s decision to give intermediate GNVQs an equivalency rating of four A*-C GCSEs has led to a surge of schools taking advantage of what is seen as easy league table success. Thomas Telford School in Shropshire, for example, has embraced GNVQs from its inception. Yet now, all of their pupils take at least one GNVQ and some leave with a total equivalent of nineteen GCSEs contributing in large part to their outstanding league table performance.”

One final example of the responsiveness of schools to changes in performance measures: in 2014 when the government decided that only first entries would count for EBacc subjects, the number of early entries for GCSE Maths fell from 169,000 in 2013 to 31,000 in 2014 – a profound overnight change to the delivery of Maths at thousands of schools.

It’s difficult to defend a system which appears to result in such obviously tactical behaviour by so many schools, and it is pupils, particularly poorer pupils at schools vulnerable to weak league table performance, who are caught up in this not-so-merry dance between schools and the DFE. We see this in the fact that so many independent schools continue to enter pupils for the iGCSE, while state schools abandoned the iGCSE when it ceased to count. Pupils in independent schools therefore benefit from teachers who have taught the same qualification for several years, while state schools gradually familiarise themselves with new specifications. In my experience it is disadvantaged pupils, even the high attainers among them, who are more likely than their more affluent peers to take vocational qualifications such as BTEC Sport or BTEC Business, which again might be because they tend to find themselves in schools desperately seeking any possible league table advantage.

Does it have to be like this? One school in York gained an outstanding Ofsted judgement in 2017 with a report which praised leaders for putting the needs of pupils ‘above all else’. The report continues: “The curriculum reflects leaders’ integrity because it is designed to match pupils’ needs and aspirations regardless of performance table measures. Leaders have made provision for almost all pupils to study a modern foreign language because research tells them that pupils will develop valuable skills for their future.” Sure enough, the league tables show that 83% of this school’s Year 11 pupils took a language GCSE in 2017, with a progress score for language of -0.43 – sharply at odds with the school’s progress score in other subjects (e.g. science 0.48, humanities 0.63).

250 miles south, in another cathedral city, one school gained a good Ofsted judgement in 2018 despite a Progress 8 score of -0.71. The reason, as I understand it, is that the school continued to enter pupils for iGCSE in English as they believed this was in the pupils’ best interests, even though the league table performance of the school would suffer as a result (the P8 score for English here is -2.91).

Perhaps these two schools, which have flourished despite taking decisions that have hampered their league table standing, prove that the bark of the league table is worse than its bite? What would happen if more schools did their own thing rather than dance to the DFE’s tune?

More on this in our next post.

*After checking my own school I went on to check the schools that I work with at United Learning. Just look at these improvements between 1992 and now:

[Table: headline GCSE results for a selection of United Learning schools, 1992 vs now]

Obviously this is just a handful of schools among the 3000 or so secondaries across the country, but let’s be grateful that the woeful outcomes evident in the 1992 column hardly exist anymore.

Lock up the Curriculum

A sign on the back of a security van: “This vehicle contains a locked safe to which the crew have no access.” Imagine if the school curriculum came with a similar tamper-proof warning: “THE CURRICULUM IN THIS SCHOOL IS CONTAINED IN A LOCKED SAFE TO WHICH TEACHERS, SLT AND GOVERNMENT HAVE NO ACCESS.”


As it stands, the curriculum in any given subject in any given school can be a moveable feast, disrupted one year by national reforms, the next year by the preferences of a new Head of Department, the next year by the decision to switch exam boards, the next year by a reduction in the number of hours allocated to each subject. This leads to teachers constantly teaching new topics for the first time and relying on piecemeal resources lifted from the internet.

A few examples from the history department where I started my career:

  1. We spent one half term of Year 7 history making papier mâché castles. It might have been fun, but it wasn’t history.
  2. The Holocaust and the atomic bomb were taught in the summer term of Year 9, but this was often interrupted by activities week, sports day and trips, so these crucial topics were barely touched.
  3. The GCSE course consisted mostly of topics perceived to be more accessible to our pupils, such as the American West and Medicine Through Time, with coursework on Jack the Ripper. Did this prepare students for history at university? Did it fulfil their democratic right to leave school with a basic understanding of the world around them?
  4. The curriculum taught in each classroom would depend on the preferences of the teacher; we would sometimes deviate from the curriculum to teach an area of personal interest, e.g. the Olympic Games or London through time (this might be a good thing in the right hands, but it is a lot to ask of inexperienced teachers or those teaching outside their subject).

It was a curriculum guided not by powerful knowledge, eternal truths and threshold concepts but by the whims of teachers and the state of the department filing cabinet on any given day. Despite the fact that this particular history department had been in place for decades, we had failed to establish a secure curriculum and a stable set of teaching materials to go with it. The classroom experience suffered as a result, particularly for pupils taught by supply teachers and non-specialists.

We can’t guarantee every child an exceptional teacher, but we can guarantee every child an exceptional curriculum.

Our national tamper-proof curriculum would be an entitlement for all pupils. In each subject the content would be laid out in a logical sequence: year by year, term by term (the current National Curriculum simply sets out what pupils should be taught in each key stage). The stability of this curriculum would enable resources to latch on to it: lesson plans, topic tests, low-stakes quizzes, knowledge organisers and masterclass videos by subject experts.

With the whole country studying the same stuff, publishers would be able to produce textbooks cost-effectively. Perhaps most excitingly, we could collate and share the best work produced by students across the land. Forget arbitrary levels and age-related grades – pupils could see how their work compares to some of the best work in the country.

There’s one massive problem with the idea of a ‘locked-up’ curriculum though – the curriculum should not be hidden away, it should take centre stage in our schools and in our society. Safe in the knowledge that it won’t be tinkered with, it could be emblazoned on walls, plastered on corridors, published on the website alongside resources that pupils and parents can access at home. Over time the curriculum could become a sacred national treasure, enshrined in our national psyche. Let’s have a national holiday on the rare occasion (every ten years?) that we update it!

The benefits for teachers’ workload would be immense. Earlier this year I walked through the staff room of an independent school. Teachers were reading newspapers and academic journals; they huddled in subject groups planning and reviewing lesson materials. They did this because the curriculum itself had been stable for years, allowing expertise and resources to gather around it.

Does this impede the autonomy of teachers? Of course not. Delivering the curriculum – linking it to prior knowledge, deftly checking for understanding and providing precise feedback – is the very essence of teaching. Deciding what to teach places a huge burden on individual teachers. In every profession there are accepted standards that professionals simply don’t interfere with, whatever their personal preferences.

It’s time to stop tweaking, tinkering, chopping and changing. The curriculum – the stuff kids study so that they leave school with an understanding of the world around them – is too important to be left to chance.

The specific things that leaders do

I recently spent a few months supporting a school in Portsmouth as it joined our group of schools. This return to hands-on school leadership presented me with a few situations that I hadn’t encountered for a while, such as holding a meeting with a parent and child to address persistently poor behaviour which could no longer be tolerated by the school. It’s a meeting with a clear purpose: the behaviour of the pupil needs to change.


On an early morning train to Portsmouth I happened to be accompanied by one of our Regional Directors. She’s an experienced headteacher so I sought her advice for the meeting that awaited me at the school. She suggested:

  • Speak to the parent on their own first – make it clear what the problem is and what you need the parent to do.
  • Invite the pupil to join the meeting when, and only when, you have secured the support of the parent.
  • Once the pupil joins the meeting, present a united front – “I’ve explained to your mother/father what the problem is; s/he is aware of how serious this is.”
  • Be crystal clear with the pupil about the behaviour that is causing concern, why it cannot be tolerated, and what s/he needs to do instead. Check that they understand this.
  • Agree on the next steps: e.g. “you’ll return to your lessons from Period 2 but for today only I’ll need you to spend break times with your head of year. I’ll pop in to one of your lessons today and I expect to see you working hard.”

None of this is rocket science and I’m sure that people with more experience of these meetings than me follow a structure like this without even realising it. But this experience reminded me that leadership is as much about the specific things that leaders do as the lofty ideals and the glossy mission statements, and that there is good practice relating to these specific things that we can codify and share. Even if established leaders do this stuff implicitly, by making it explicit we can catalyse the development of new leaders.

I was reminded of this when I read this thoughtful post in which a serving head argues that “Too many leadership programmes focus on ‘leadership’ over domain knowledge”. The head continues, “The problem for schools comes, I would argue, when leaders are more interested in notions of leadership over and above what they are leading on.”

Similarly, this article in the Harvard Business Review makes the case that successful leadership is less about generic competencies and more about perfecting a core set of daily routines:

“Leaders want to get better in the here-and-now, not to be judged against a competency map or be sold an abstract theory about what leadership should look like. If you want to become a great leader, become a student of your context — understand your organization’s social system — and mind your routines. Leadership development is more about application than theory.”

The HBR post continues: “As we pursued our work at BHP Billiton, six routines (for example, how leaders spent their time in the field, in one-on-one meetings, and in cross team meetings) were identified which, when executed well, appeared to differentiate the highest-performing supervisors from average performing ones (the routines we discovered are context-specific to BHP Billiton; the routines that are right for you depend on your organization).”

Six core routines for school leaders might include:

  • Managing a meeting
  • Taking an assembly
  • Doing a learning walk
  • Holding a developmental conversation with a teacher
  • Holding a difficult conversation with a pupil/parent
  • Line managing a senior/middle leader.

Doug Lemov improved our understanding of teaching by codifying the specific things that effective teachers do. By making the implicit explicit, he established a shared language that thousands of schools have adopted to develop their teachers.

Perhaps it’s time we did the same for school leadership?


Careful What We Wish For

‘Summit fever’ is the term given to an obsessive focus on a symbolic achievement – reaching the summit of a mountain, becoming a millionaire, getting married – and the risk that our focus on the end-point can distract us from the issues that matter here and now.


It’s a term explored by Oliver Burkeman in The Antidote. Drawing on Christopher Kayes’ account of a fatally flawed Everest climb, Burkeman describes a group of mountaineers for whom reaching the summit of Everest ‘became not just an external target but a part of their own identities, of their sense of themselves’. As these doomed climbers ignored worsening conditions in their pursuit of the peak, their expedition became ‘a struggle not merely to reach the summit, but to preserve their sense of identity.’

You don’t have to spend long on a school’s website to see what it wishes for. Take this from one school: ‘With an unrelenting drive focused on achievement for all, our vision is to be graded as Outstanding within four years.’ Other schools strive to be ‘the best school in the borough’ or proclaim a ‘2020 vision’ to gain a Progress 8 of +1 by the start of the next decade.

Such statements provide clarity, purpose and urgency, but perhaps this obsession with the symbols of success distracts us from the steps required to actually get there. Burkeman tells the story of General Motors, which in the early 2000s set itself a target of gaining 29% of market share. It chased this ambitious target not by improving the product but by slashing the price of its vehicles – a self-imposed race to the bottom that continued until it filed for bankruptcy in 2009.

Similarly, our school above which strives to gain its outstanding Ofsted badge might spend time sprucing up classrooms and perfecting the SEF, rather than investing in teacher development. Our school that strives to be the best in the borough might resist collaborating with other local schools to support vulnerable students. Our school which seeks a Progress 8 of +1 might fill the open bucket with easier qualifications, rather than ensuring that pupils who arrive in Year 7 without basic literacy are provided with the support to catch up.

A school’s Progress 8 score and Ofsted rating do nothing in themselves to improve the prospects of its pupils, so a school driven by these external reputational goals can set itself on a path of activity which diverges from the needs of its students.

How can we avoid summit fever in our schools while still harnessing the organisational benefits of a clear and simple statement of intent?

Firstly, we can prioritise the process, not the destination, framing our targets around the inputs of school improvement. Such targets might include raising attendance, getting pupils to work harder, improving behaviour and ensuring that the curriculum is coherent and challenging.

Secondly, if we do want to set specific end-point targets, we can ensure that these benefit students, rather than the school. So rather than a Progress 8 of +1 we could commit to the majority of pupils walking out with 8 good GCSEs. Rather than being the best school in the borough we could commit to all of our students progressing to university or employment. Rather than an Ofsted outstanding rating we could commit to ensuring that all pupils can read fluently by the end of Year 7.

Suppose our school above gained the outstanding judgement it set out to achieve. What next? Like a runner with post-marathon blues, I wonder if the school would be able to sustain its momentum.

A colleague of mine recently conducted an Ofsted inspection. Throughout the process he didn’t once hear the word ‘outstanding’. It wasn’t uttered by a single member of staff. It didn’t feature on the SEF. In fact, the first person to use the word was the lead inspector when she delivered her final judgement to the school. If we invest in the process, the end-point might just look after itself.

There are hundreds of things that schools can strive for. A single headline measure, or a particular judgement from a team of inspectors, shouldn’t be the extent of our ambition.

One-Click Revision

Converting an intention to purchase online into the act of purchasing online is a billion-pound problem for the world’s retailers. Just google ‘cart abandonment’ to see how much it bothers them. Retailers have responded with One-Click ordering and tools which speed up the checkout process by remembering your delivery preferences and auto-filling your address.


As exam season approaches, a similar problem plays out in homes across the land: converting the intention to revise into the act of meaningful, productive revision. Thousands of potential revision hours are lost each day as students fail to convert this intention into action.

Take two ways of fixing this.

Online programmes speed the conversion from intention to action by removing the question of what to revise. One such programme is HegartyMaths, which tracks students’ progress and enables them to pick up their revision where they left off last time. To convert the intention into action, students simply log in to HegartyMaths.

Less techy, but just as powerful, Walthamstow Academy (a United Learning school which I work with) provides each student with a 1-20 book in each subject. This 20-page booklet captures all the important stuff they need to know for that subject. They receive it just before the Easter holidays and it guides them through the start of their revision programme, day by day. No more sifting through piles of papers for those important notes, or spending time making revision cards; the 1-20 books enable students to crack on straight away, converting intention into meaningful action.

We can’t remove all the barriers our students face this exam season, but we can help convert the intention to revise into meaningful revision.