
To whom is Ofqual accountable?

  • 16 September 2021
  • By Dennis Sherwood

This blog was kindly contributed by Dennis Sherwood who writes for HEPI about Ofqual, A levels and examinations. You can follow him on Twitter @noookophile.

On 7 September 2021, the Education Select Committee met with the Minister for School Standards, Nick Gibb; the Permanent Secretary at the Department for Education, Susan Acland-Hood; Ian Bauckham, the Chair of Ofqual, the regulator of school exams in England; and Simon Lebus, Ofqual’s Chief Regulator. Mr Lebus hands over to Dr Jo Saxton on 17 September, and Mr Bauckham stands down in December; to their credit, neither used their imminent departure to avoid answering any questions on the grounds that ‘that is a matter for my successor to determine’.

The meeting was billed as covering topics including the 2021 summer exam results, 2022 assessment plans, and catch-up support. As expected, there was much talk about vaccinations, contingency plans should exams be cancelled yet again, and how to ‘get to grips with grade inflation’ when they return.

One particular aspect of grade inflation concerns differences by type of school: for example, according to Table 7 in Ofqual’s official statistics, in summer 2021, 70.1% of A level entries from independent schools were awarded grades A* or A, up from 60.8% in 2020; by contrast, at state-funded comprehensive schools, 39.3% of entries were awarded top grades in 2021, compared with 33.1% in 2020.

I dwell on those numbers for they have been the source of headlines such as Top A-level grades soar at private schools in widening divide with state students (the Independent) and Private schools in England give pupils top grades in 70% of A-level entries: Teacher-assessed grades in lieu of exams benefit those at independent schools as gap with state education widens (the Guardian) – headlines very similar to those that appeared in August 2020.

Did students at independent schools ‘benefit’ for a second year – with the associated implications of privilege, ‘playing the system’, and unfairly stealing those much-coveted places at Russell Group universities?

This topic was raised at the Select Committee hearing by David Johnston, MP for Wantage (question 46), who observed that:

One of the things that concerned people about the results was the gap between private schools and state schools. The first results we got were A-levels saying that whereas top grades had gone up 6% in the state sector, they had gone up 9% in the independent sector.

To which Mr Gibb replied:

If you look at the percentage increase of that proportion, you see that for the independent sector the increase in the proportion of A grades goes up by 15.2% between 2020 and 2021 and by 18% among comprehensive schools.

The meeting then proceeded for several minutes with two independent conversations, mutually interleaved. Mr Johnston was arguing that because a particular independent sector number, 9%, is greater than the corresponding state sector number, 6%, the independent sector had ‘won’. Simultaneously, Mr Gibb made the counter-argument that because a different independent sector number, 15.2%, is smaller than the equivalent state sector number, 18%, the state sector had ‘won’.

This fruitless cross-purposes dialogue eventually petered out, and the meeting went on to other things, but it is a pity that it happened at all. Yes, Ofqual’s report does show that the percentage of independent school entries awarded A or A* increased by 70.1 – 60.8 = 9.3 percentage points, this being 9.3/60.8 = 15.3% relative to the 2020 figure. And for state comprehensives, the equivalent increase was 39.3 – 33.1 = 6.2 percentage points, or 6.2/33.1 = 18.7%. This explains Mr Johnston’s numbers of 9% and 6%, and Mr Gibb’s (slightly differently rounded) numbers of 15.2% and 18%.
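
For anyone who wants to check the arithmetic, here is a minimal sketch in Python (the figures are those quoted above from Ofqual’s Table 7; the variable names are my own) computing both measures:

```python
# A*/A shares (% of entries), as quoted above from Table 7 of Ofqual's statistics.
shares = {
    "independent": {"2020": 60.8, "2021": 70.1},
    "comprehensive": {"2020": 33.1, "2021": 39.3},
}

for sector, s in shares.items():
    pp = s["2021"] - s["2020"]        # Mr Johnston's measure: percentage points
    rel = 100 * pp / s["2020"]        # Mr Gibb's measure: % increase in the %
    print(f"{sector}: +{pp:.1f} percentage points, +{rel:.1f}% relative to 2020")

# independent: +9.3 percentage points, +15.3% relative to 2020
# comprehensive: +6.2 percentage points, +18.7% relative to 2020
```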

But it doesn’t explain the muddle – a muddle attributable to confusion about the answer to the question ‘what is an appropriate measure of which sector fared relatively better in the summer 2021 exams?’

In so far as this might be a sensible question, Mr Johnston was assuming the best measure to be ‘the increase in the percentage of exam entries awarded top grades’; Mr Gibb, ‘the percentage increase in the percentage of exam entries awarded top grades’. And given that these different measures result in mutually contradictory conclusions, the outcome was muddle and confusion.

So what is an appropriate measure?

My answer is ‘none of the above’. I find that percentages, as used by Mr Johnston, can often be difficult to interpret, and percentage increases in percentages, as used by Mr Gibb, even more obscure. I prefer, whenever possible, numbers that count real things.

Like the number 56,300, this being my estimate (based on Ofqual’s figures) of the number of entries at independent schools awarded top grades in 2020, a number that rose to 65,800 in 2021.

Like the number of state-funded entries awarded top grades in 2020, 205,000, and the corresponding number, 248,000, in 2021.

These numbers verify that many more state school entries won top grades than independent school entries. They also show that for every 100 independent school entries awarded top grades in 2020 there were 100 x 205,000/56,300 = 364 state school entries awarded top grades too, a number that rose to 100 x 248,000/65,800 = 377 in 2021. For comparison, the equivalent number for 2019, the last year in which there were ‘real’ exams, was 320 state school entries for every 100 independent sector entries.  
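
Those ratios are easy to reproduce; here is a minimal sketch in Python, using my estimated entry counts (the 2019 ratio of 320 rests on counts not reproduced in this post):

```python
# My estimates (derived from Ofqual's figures) of entries awarded A* or A.
top_grades = {
    2020: {"independent": 56_300, "state": 205_000},
    2021: {"independent": 65_800, "state": 248_000},
}

for year, n in top_grades.items():
    ratio = 100 * n["state"] / n["independent"]
    print(f"{year}: {ratio:.0f} state entries per 100 independent entries")

# 2020: 364 state entries per 100 independent entries
# 2021: 377 state entries per 100 independent entries
```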

I find those numbers much more accessible. And meaningful too, especially that sequence 320 (2019), 364 (2020), 377 (2021). What a pity I had to calculate them for myself, for they are not published by Ofqual.

And talking of information published – or indeed not – by Ofqual, later in the meeting, Kate Osborne, MP for Jarrow, enquired (question 70) about the publication of the data submitted by schools in summer 2020 concerning student rank orders, and what were then called ‘Centre Assessed Grades’ or CAGs. As many will recall, the A level results declared on 13 August 2020 were determined by applying the infamous ‘mutant algorithm’ to school rank orders. A few days later, on 17 August, these results were revoked, and the CAGs declared to be the valid grades.

Mr Lebus replied that he was aware that his predecessor, Dame Glenys Stacey, had, in October 2020, talked about releasing the school rank order and CAG data. This is nearly correct: in fact, Roger Taylor, Mr Bauckham’s predecessor, had committed to release these numbers more than a year ago, at the Select Committee hearing on 2 September 2020 (questions 986 and 987).

Mr Lebus added that he recalled that Ian Mearns, MP for Gateshead, had asked about this ‘when I appeared in February, and I indicated it would be available in May’. The hearing referred to took place on 9 March (questions 1325 and 1331), and, as can be inferred from the transcript, that conversation too was rather muddled.

Mr Lebus continued by stating that ‘the data are now available and in the public domain. It was published on 6 August for 2020’. He graciously apologised that ‘it has taken such a long time to get this together,’ explaining that ‘it has proved exceedingly complicated but it is now available’, and that he had taken ‘a personal interest in trying to pursue the matter.’

There then followed a discussion about what Ofqual actually did on 6 August, and some further polite muddle.

Reference to Ofqual’s website resolves the matter: on 6 August, a ‘news story’ was posted which includes this statement:

We are now beginning to transfer this data to the ONS. The GRADE data will appear in the Office for National Statistics Secure Research Service data catalogue for accredited researchers to apply the week beginning Monday 16 August. If you are interested in knowing more about the data and its potential use you can email us at [email protected]. Researchers will also be able to apply to ADR UK for funding – more details will be available soon.

This is not confirmation that the data had been made available on 6 August; rather it is a statement that researchers would be able to apply for access in the week beginning 16 August – but there is no indication as to when the data would be available after an application had been made.

I contacted the Office for National Statistics on the afternoon of the Select Committee hearing, Tuesday 7 September, this being four weeks after 6 August, and three after 16 August. The Office were kind enough to reply on the same day:

In reply to your request: 

The Ofqual GRADE data will be arriving this week, which was confirmed to us last Friday.

This timing is out of our hands and depends on the supplier of the data files.

‘This week’ is the week ending Friday 10 September; as at 5pm on 15 September, there is no confirmation of the release on Ofqual’s website, nor is there any mention on the website of the Office for National Statistics.

Yes, the release of this data is a minor detail, and far less important than students’ learning loss, let alone much more tragic events such as those in Afghanistan.

But the delay is most unfortunate, for the numbers are now long past their use-by date. They were first requested on 2 September 2020, and a commitment to release them was made on that date. At that time, there would have been much relevance in understanding the patterns of the CAGs, in throwing light on some of the unfairness that resulted, and even perhaps in providing evidence for appeals. And this year too – scrutiny of the 2020 data would have identified any schools that ‘stretched the limits’ then, and so might have benefited from additional support for this year’s process. But now, all of this is water under the bridge.

There is a further, much more important, point. On 2 September 2020, Ofqual made a commitment. A commitment they did not honour, despite being ‘reminded’ in March. This is not the first time Ofqual have ignored the Select Committee – for example, recommendation 5 of the Committee’s report published on 11 July 2020 reads:

Ofqual must be completely transparent about its standardisation model and publish the model immediately to allow time for scrutiny. In addition, Ofqual must publish an explanatory memorandum on decisions and assumptions made during the model’s development. This should include clearly setting out how it has ensured fairness for schools without 3 years of historic data, and for settings with small, variable cohorts.

That’s unequivocal. ‘Ofqual must … publish the model immediately’ – the model in question being the details of the ‘mutant algorithm’. But Ofqual refused. Point blank. And the details were not published until 13 August 2020, A level results day.

The BIG QUESTION is therefore ‘to whom is Ofqual accountable?’

Constitutionally, the answer is not ‘the Secretary of State for Education’, for one of the key principles underpinning Ofqual’s creation in 2010 – in contrast to its predecessor, the Qualifications and Curriculum Authority – was independence from Government. As Ofqual’s website states, ‘We’re independent of government and report directly to Parliament’.

Parliament, though, is nebulous. So, in practice, is Ofqual accountable to the Select Committee? Is Robert Halfon, the Committee Chair and MP for Harlow, Ofqual’s boss?

If so, how can Ofqual treat the Select Committee in such a cavalier manner?

Maybe GCSEs and A levels are not the only candidates for reform. Perhaps this might be something that finds its way into the in-tray of the newly appointed Secretary of State for Education, Nadhim Zahawi, who, although not Ofqual’s direct boss, certainly has both an interest and influence…

4 comments

  1. albert wright says:

    Thank you Mr Sherwood for this article.

    Using actual numbers rather than percentages seems to me to provide more appropriate information to understand what is happening in independent and state schools.

    Highlighting the way Ofqual have behaved in relation to the Select Committee is also interesting and provides yet another example of how bureaucracies can escape proper scrutiny and create their own agendas.

  2. My thanks to Dave Thomson of fft education data lab for drawing my attention to an error: wherever I refer to ‘candidates’ or ‘students’, I should have referred to ‘entries’.

    So, for example, my statement that ‘56,300 candidates at independent schools were awarded top grades’ should say ‘56,300 exam entries from independent schools were awarded top grades’.

    My apologies for this error, and my thanks – once more – to Dave Thomson for alerting me to it.

  3. …and thanks too to Michael Natzler for correcting the original.

  4. Huy Duong says:

    Hi Dennis,

    From what I have seen, Ofqual has absolute power. What it says, the exam boards follow and the heads follow, organisations such as ASCL follow, and the teachers have no choice but to follow. In theory, perhaps Ofqual should be accountable to Parliament, but it runs rings around the Education Select Committee or repeatedly ignores it, e.g. in the 2020 grading. There aren’t any effective checks and balances on Ofqual’s power. Whatever Ofqual chooses, that’s always the best. What it chose in 2020 was the best. What it chose in 2021 was the best. What it chooses in normal years is naturally the best.

    But do we get value for our tax money? Does it take that much of our tax money to keep grade inflation in check? How much talent and tax money does it take to draw the grade boundaries the way Ofqual does?
