Keith Geary is a former headteacher of two comprehensive schools and is currently a school governor and examiner.
Ofqual’s 2020 proposals for exam grades without exams stretched to 68 pages and generated a summer of results chaos and uncertainty for students and higher education admissions. 2021’s proposals run to 46 pages and – according to the Social Mobility Commission – risk ending in a ‘bigger disaster’ and ‘catastrophic unfairness’ for young people. Former UCAS Chief Executive Mary Curnock Cook warns the ‘sheer volume of appeals might overwhelm the system’. Ofqual’s 2021 proposals have generated 103,000 consultation responses, compared with 12,000 in 2020. Almost 50,000 came from students – 25 times more than responded in 2020. A fifteen-minute YouTube video showing a Physics teacher talking through a highlighted copy of the consultation document attracted almost 40,000 views.
That extraordinary number of consultation responses suggests wide concern that the proposals are ill-designed to secure fairness.
A Level students want to know their grades are merited by their work and that someone other than their own teachers has assessed its standard authoritatively against that of other candidates. Similarly, universities want to know the grades will be sufficiently robust and fair for the admissions process.
How do we ensure that the 2021 A Level grades are fair enough for both?
Everybody needs to understand how the process will work and to see that it is as fair as circumstances allow, so keep it simple.
- Apply what I will call the ‘evidence-of-best-performance’ approach.
- Start from students’ work – what they have actually been able to learn and do during their course.
- Select a small evidence base from each student’s work – their personal best, work showing what they are capable of. If that base is small and flexibly defined, the unfairness arising from the unequal impact of COVID-related disruption on individual students’ learning is reduced.
- Use expertise in schools and colleges to propose grades: teachers produce a rank order of candidates based on the evidence-of-best-performance selections, indicating grade boundaries and internally moderated where possible.
- Use expertise at the exam boards to validate grades by confirming or amending: examiners award grades as in any normal year through sampling, based on the rank order – seeking to confirm grades, but also sufficiently rigorous to ensure comparable standards across centres. This slight shift of focus for examiners can be readily achieved by adapting their standardisation training, drawing materials in the usual way from work submitted by the current candidate cohort.
Thus a centre’s proposed grades would be quality-assured against established grade standards by experienced, specialist subject examiners who would – in normal times – assess candidates’ performance. This would reassure both students and higher education institutions that grades have been appropriately awarded.
What would ‘evidence-of-best-performance’ consist of?
It would be a small sample representing a range of the subject content. That range should be limited and clearly defined, but flexible enough to accommodate different degrees of disruption to learning.
The sample would be drawn from a range of areas.
- Work completed under exam conditions, using tasks taken from questions or papers provided by the board. It is important that students know they are attempting the same tasks as other candidates, allowing fair comparison across centres. (Questions used could be those prepared for the 2020 or 2021 exam papers. There is no need to prepare fresh tasks – just give teachers flexibility in choosing which to use.)
- Completed non-exam assessment – this is often an independent study or investigation of some sort but varies by subject.
- Other illustrative independent work certified by the teacher as the candidate’s own.
The range – Ofqual’s ‘minimum proportion of overall subject content’ – should be set at a realistic level, but with allowance for special consideration requests for candidates whose COVID-related circumstances have prevented them reaching this minimum by the assessment deadline.
How would this work in practice?
I will illustrate this with A Level English Literature – a relatively subjective, and therefore tricky, subject, where two examiners can assess the same response, give different marks and neither be ‘wrong’.
1. The school or college determines an evidence-of-best-performance selection for each student:
- Two essays across different papers from the exam board’s questions list, written under exam conditions. Students should have the opportunity to do more than two to remove the only-one-chance aspect of an exam – which particularly concerns 2021 students – and to ensure a choice when selecting the best two to submit.
- One piece of non-examination assessment: this varies across exam boards; each would need to define this within common guidelines.
- A ‘wildcard’ example of work if it is not possible to provide both of the above. This would be a representative piece of independent work, authenticated as such by the centre – for example, a mock paper.
2. The centre constructs its rank order of candidates with proposed grades and submits it to the board.
3. Using the rank order, the board identifies a sample of candidates (particularly those at grade boundaries) and requests their evidence-of-best-performance selections.
4. The evidence-of-best-performance samples are reviewed by the examiners who would normally have been marking the exam papers and non-examination assessment. The review question is simply: ‘Does this sample of the candidate’s work suggest that he or she is performing at the level usually required to achieve the proposed grade?’ One of two possible responses is required: ‘Confirmed’ or ‘Not confirmed: Performing at Grade X’.
5. This process would identify centres where the board needs to engage further and request additional evidence.
In this way, an evidence-of-best-performance approach gives the greatest number of candidates the fairest chance to show their best, no matter how much Coronavirus has disrupted their learning and preparation.