This blog was kindly contributed by Rob Cuthbert, Emeritus Professor of Higher Education Management, University of the West of England. You can find Rob on Twitter @RobCuthbert.
As A-Level results day (Thursday, 13 August) looms, there is still widespread misunderstanding among students and parents about how grades have been determined.
If students are dissatisfied with their grades there is probably nothing they can do about it, except take exams in the autumn. Contrary to what many people think, most grades are not based on teacher assessments, and the prospects for appeals against an unfair grade are limited and uncertain. Autumn exams probably mean an enforced gap year, and 2020 university offers may not hold for 2021.
1. Most grades are not based on teacher assessments
The Secretary of State announced on 20 March that there would be no exams in 2020 and ‘The exam boards will be asking teachers … to submit their judgement about the grade that they believe the student would have received if exams had gone ahead.’ He added there would be controls ‘to ensure that the distribution of grades follows a similar pattern to that in other years, so that this year’s students do not face a systematic disadvantage as a consequence of these extraordinary circumstances.’ In fact, it is this year’s students who face disadvantage from these ‘controls’.
Examinations are conducted under the auspices of a number of examination boards, all regulated by the national agency Ofqual. Ofqual consulted schools, teachers and parents before outlining the approach it would require exam boards to follow. Its most radical innovation was the requirement for schools not only to propose grades for each student but also, within each grade band for each subject, to rank all students in order – with no ‘second equals’ or similar rankings.
The approach soon attracted fundamental questions about how it would work. The House of Commons Education Select Committee inquiry into the broad effects of COVID-19 on the education system received so many submissions about exams in 2020 that it published an interim report while it was still taking evidence.
The interim report asked Ofqual to publish details of its approach ‘immediately’. Ofqual refused: Chief Regulator Sally Collier was forced to admit that publishing its model would also enable schools to calculate awarded grades immediately. Under pressure, Ofqual ran a Summer Symposium on 21 July where it admitted for the first time that, having tested 12 different approaches, it would use what it called the ‘Direct Centre-level approach’.
For ‘small’ cohorts the Ofqual method will substitute Centre (school) Assessed Grades (CAGs) for the results which the model would otherwise require. Ofqual did not define ‘small’ in this context until 7 August, when the Times Educational Supplement reported that:
Where a subject has more than 15 entries in a school, teachers’ predicted grades will not be used as part of the final grade calculation … Where there are subjects with no more than five entries in a school … pupils will be awarded their teacher-assessed grades … For entries between five to 15 students, teacher-assessed grades will play a role in the calculation, alongside historic school data and pupils’ prior attainment
What this means is that, as has been suspected for some months, teacher assessments of grades are largely irrelevant. The only thing that counts is the ranking.
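As a purely illustrative sketch (not Ofqual’s published code), the reported thresholds route each subject cohort roughly as follows; the wording of each branch summarises the TES report quoted above, and the exact blending rule for the middle band has not been made public:

```python
def grade_source(cohort_size: int) -> str:
    """Illustrative routing of one school's subject cohort under the reported thresholds.

    The cut-offs of 5 and 15 come from the TES report quoted above; how the
    middle band is actually blended is not public, so this is only a sketch.
    """
    if cohort_size <= 5:
        return "centre-assessed grades (CAGs) awarded directly"
    elif cohort_size <= 15:
        return "blend of CAGs, historic school data and prior attainment"
    else:
        return "statistical model only: ranking plus the school's past results (CAGs ignored)"


for entries in (3, 10, 30):
    print(entries, "entries ->", grade_source(entries))
```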
School teachers, like academics in higher education, do not make judgments about whether a student is 12th or 13th best in a group of 20 all assessed as a B. Academic judgment is about whether student X deserves a certain grade or not – it focuses on the boundaries between grades, and draws on teachers’ understanding of what each grade achievement means. It does not predetermine the numbers in each grade; it determines which grade each student deserves.
That is why the Ofqual approach was so controversial; it asked teachers to do something they had never been asked to do before, and for the first time it set students in competition with their fellow-students for the limited number of grades allowed to each school. Even worse, it judges students not on their merits, but on the merits of previous cohorts from the same school.
Government and Ofqual have represented the approach as being based on ‘teachers’ assessments of grades’, when it is not. As independent analyst Dennis Sherwood wrote on 18 June: ‘It would have been both more honest, and far simpler, for each board to have written to each school saying “our statistical algorithm has determined that, for your cohort of 53 students for 2020 A level Geography, you are allocated 4 A*s, 10 As, 15 Bs … Please enter in each grade box the names of the students to be awarded each grade, ensuring that no grade exceeds its allocation”’.
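To make the mechanics concrete, here is a toy sketch of the ‘fill in the grade boxes’ exercise Sherwood describes; the allocation numbers and the cohort are invented, and the point is only that a student’s grade follows from their rank position against a fixed quota, not from anything they themselves did:

```python
# Toy illustration only: allocating a fixed grade distribution down a rank order.
# The quota and the cohort are invented; the real allocations were derived by
# Ofqual's model from each school's historical results.
allocation = [("A*", 2), ("A", 3), ("B", 4), ("C", 3)]        # hypothetical quota for 12 entries
ranked_students = [f"Student {i}" for i in range(1, 13)]      # rank 1 = judged strongest by the school

results = {}
position = 0
for grade, quota in allocation:
    for student in ranked_students[position:position + quota]:
        results[student] = grade                              # grade depends only on rank position
    position += quota

for student, grade in results.items():
    print(f"{student}: {grade}")
```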
2. Schools might delay telling you their predicted grades and rankings
Ofqual have said that schools may tell students and parents their proposed grades. As for rankings, they might or might not be revealed, depending on whether revealing the ranking for the student concerned might also reveal personal ranking data about other individuals. It is for each school to decide its policy, but individual students or parents may submit a Subject Access Request or a Freedom of Information Request (FoIs apply to public bodies like state schools) to get proposed gradings, and perhaps ranking, information. Such requests must receive responses within one month from the date the request is received – which presumably means the response might not be received until 13 September. In many cases this would be too late for university entry, especially for selective universities which may complete admission decisions within days of the results announcement.
3. The prospects for appeals against unfair grades are limited and uncertain
Government and Ofqual statements make it seem that the appeal process will protect students against grades they do not deserve. But the process as first defined was far too limited:
Centres will be able to appeal against the operation of the standardisation model where the wrong data was used to calculate results for learners, or where there was an administrative error in the issuing of results by an exam board.
This allowed for only narrow, technical appeals against the process, not the outcome, as Dennis Sherwood and I wrote in a blog for HEPI on 14 July:
To deny an appeal against an unfair outcome using the defence that a perverse process was conducted fully in accordance with its own flawed rules flies in the face of natural justice.
…
Far better would be a broader basis for appeals on the grounds that the awarded grade is believed to be unfair, regardless of the wealth, ethnicity or any other characteristic of the candidate. There needs to be convincing evidence, but the principle of such an appeal is fundamental. This would not be an appeal against academic judgment; on the contrary, it would be an appeal to restore the academic judgment of teachers overturned by a statistical algorithm. [emphasis added]
On 6 August 2020, Ofqual issued its ‘final guidance’ on appeals, adding some new ‘examples’ of when an appeal might be made, in particular:
where – because of the ability profile of the students – a centre was expecting results this year to show a very different pattern of grades to results in previous years. That could include where the grades of unusually high or low ability students [have] been affected by the model because they fall outside the pattern of results in that centre in recent years.
This may, in practice, prove too high a hurdle for most schools to overcome, and in any case it still does not provide for any individual appeals, especially where an individual has been unfairly treated within a cohort of similar ability to previous years. This minor change was presented in the media as a major concession, even though Ofqual itself has said it expects such appeals to be rare. At the same time Ofqual implied in its 6 August statement that there had been no change beyond minor modifications to its original decision about the appeal process. With no statement on appeal fees, it must be assumed that fees will remain in place, providing a further hurdle for appellant schools.
Other approaches are possible. For students in the UK taking the International Baccalaureate (IB) in 2020, the IB board received a deluge of criticism when its results were announced on 6 July, and instituted an apparently comprehensive appeals process. The Republic of Ireland has a three-stage process, of which the third is ‘If the student remains unhappy with the outcome after stages 1 and 2 he/she can seek a review by Independent Appeal Scrutineers’.
In Scotland, the SQA confirmed the appeals process will be free, and will allow for ‘further, evidence-based consideration of grades if schools and colleges do not think awarded grades fairly reflect learner performance’. However, appeals must be submitted by 14 August, putting school staff under even more pressure at a time when they are making arrangements for a socially-distanced return to school for next year’s students.
When Scottish results were announced on 4 August there was immediate disquiet about the outcomes, particularly for socioeconomically disadvantaged students. Among the 138,000 students there were 125,000 reductions in grading – about a quarter of all grades.
First Minister Nicola Sturgeon defended the SQA approach by claiming the appeals process would address any individual unfairness and that to accept schools’ proposed grades would have led to unacceptable grade inflation. Perhaps the SQA’s statistical model has indeed simply replicated existing disadvantage, as David Kernohan suggested in his analysis on 5 August 2020 for Wonkhe. A growing storm of criticism of the SQA approach included a devastating critique from Guy Nason, a statistics professor at Imperial College, which ended:
The problem at the heart of the statistical standardisation is that it can be simultaneously unfair to individuals, but also maintain the integrity of the system. However, if system integrity damages the life chances of individuals, then it is not much of a system.
The undue focus on collective fairness has led to neglect of fairness for individuals. While the examination system throws up many such issues every year, in normal years there is at least both an actual individual performance to assess, and an appeal process. This year there is neither. A previous HEPI blog said:
there is a danger that, in trying … to preserve the precious national grade distribution, Ofqual will merely tinker with their model to prevent the worst excesses of unfairness to high-profile groups such as socioeconomically disadvantaged and BAME students, with the price being paid by others – who will indeed be discriminated against but are considered more ‘expendable’. We need instead some modest relaxation of the policy of ‘no grade inflation’ to compensate for all the individual unfairnesses.
4. An autumn exam probably means a gap year; 2020 offers might not apply in 2021
It is understandable that, in the rush to introduce a completely new system, it seemed reasonable at first to tackle dissatisfaction with the offer of an autumn examination. Time and others’ experience has shown this to be mistaken. If initial results match the allowed national distribution and autumn exam candidates succeed in achieving higher grades, then grade inflation is bound to follow – unless (unthinkably) other candidates are downgraded. Are autumn exam candidates being set up to fail? Will August results be scaled down to allow some headroom in the national distribution?
Furthermore, students sitting autumn exams will almost inevitably face a gap year, because the exams will be too late for students or universities to manage a 2020/21 start. This in itself may be discriminatory, especially for disadvantaged students. The impact of autumn-awarded grades on admission prospects for 2021 is uncertain. Some universities are refusing deferred entry for 2021; others will honour offers but with added conditions. The competition for 2021 entry is likely to be much more intense as 2020 students reapply, a larger 2021 cohort apply for the first time, and international students from 2020 and 2021 return in much larger numbers.
Sally Collier of Ofqual has urged higher education providers to focus less on this year’s calculated results and instead place more weight on other evidence, such as speaking to the student’s school to assess their potential. It would be better for Ofqual to take some responsibility for the individual unfairnesses inherent in their model and give more weight to teachers’ proposed grades in the first place. The consequences for university entry would be minimal: most universities will be sympathetic, not least because of the trough in the 18-year-old demographic and the major loss of international students. For broader employment purposes, young people this year and for the next few years are very likely to be disadvantaged by the recession; the exam system need not make it even worse.
5. Won’t most students get the grades they deserve anyway?
At this stage, no one knows for sure, but the chances are slim. The key slide in Ofqual’s Summer Symposium has been heavily criticised for its inconsistencies and one analysis based on Ofqual data suggests that 39% of centre-assessed grades will be downgraded while 61% will remain the same. Higher education data guru Mark Corver has been calling, in the interests of fairness, for UCAS to publish information showing the spread of actual grades against UCAS predicted grades (not the same thing as this year’s CAGs), which would give a different way of assessing which A-Level grades students deserve. UCAS have refused on the grounds that this might enable identification of individuals in some cases.
At a national level, there will be sophisticated analysis of how far this year’s grades deviate from the norm established in recent years, but individual unfairnesses are likely to be legion. The Ofqual approach applies at the level of every grade in every subject in every school. Ofqual data show that the average year-on-year standard deviation in the percentage of grades at A and above, for large cohorts (defined for this purpose as 20 or more) in one subject in one school, is of the order of 12-14%. How much statistical confidence can we have about results from applying the algorithm to every group of students in one grade in one subject in one school?
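To get a feel for what variability on that scale implies, here is a rough, purely illustrative calculation; the 30 per cent historic average and the cohort of 25 are invented, and only the 12-14 point figure comes from the Ofqual data cited above:

```python
# Illustrative only: what a ~13-point year-on-year standard deviation means for
# one grade boundary in one subject in one school.
sd = 13.0             # assumed SD of the percentage achieving A or above (mid-point of the 12-14% figure)
historic_mean = 30.0  # hypothetical three-year average % at A or above for this centre and subject
cohort = 25           # hypothetical size of this year's cohort

low = max(0.0, historic_mean - 2 * sd)     # rough lower end of a 95% band
high = min(100.0, historic_mean + 2 * sd)  # rough upper end of a 95% band

print(f"Plausible range for % at A and above: {low:.0f}% to {high:.0f}%")
print(f"In a cohort of {cohort}, that is roughly {cohort * low / 100:.0f} to {cohort * high / 100:.0f} students")
```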
Attempting to spin the narrative, Ofqual blogged on 6 August about ‘Fairness in Awarding’.
Exam board technical teams have worked with us to develop and test various statistical models to standardise teachers’ judgements between centres, to make sure that the one we used is the fairest possible approach, and we’ve also had input from external statisticians and assessment experts. … Of course, a system of calculated grades and a statistical model can never know how an individual student might have performed on the day. Some students might have done better, or worse, if they’d had an opportunity to take their exam; we will never know and they will never know.
This appeal to ‘expertise’ and a ‘statistical model’ (which we have to take on trust, since it remains secret until results day) completely misses the point. The obsession with ‘fairness’ at national level means thousands of individual students will get grades they do not deserve, whether higher or lower.
For the first time ever, students are competing within the school with their schoolmates for the limited numbers of grades available at each level. These are not the grades this year’s students deserve; they are the grades which the Ofqual model says the school deserves on the basis of past students’ achievements.
The only way to restore individual fairness is to restore individual appeals which look at the student’s actual achievements, not the past record of the school. Every previous student has had, every future student will have, access to an individual appeal process. Not in 2020. So much for the Secretary of State’s pledge that this year’s students should not face ‘a systematic disadvantage as a consequence of these extraordinary circumstances’.
Ofqual seem to have arrived in the worst of all worlds, with secrecy, schools working out how best to use a system they don’t understand, limited appeals or checks on school submissions and a scramble to avoid penalising disadvantaged students. But in a zero-sum game this means penalising what the Select Committee called the ‘well-heeled’ middle classes with, quite possibly, a well-heeled slew of litigation to follow. If, as Ofqual admit, ‘most students will have one grade changed’, then there is virtually no room for manoeuvre for anyone with, say, an Oxbridge offer of A*A*A, or one HEPI blog commentator whose daughter has an Imperial College offer of A*A*A*A.
Stephen Bush wrote in The New Statesman on 5 August that:
The Scottish government has got it badly wrong over its exam results. England, Wales and Northern Ireland’s governments should learn from its mistake ahead of their own exam results on 13 August. … Some people have received worse results than they would have got had the exams taken place, while others have benefited. The aggregate results are more ‘fair’ yes, but our lives aren’t lived in the aggregate. The people who have had their results downgraded are experiencing a great unfairness.
In some parts of the media the narrative has changed, with commentators now blaming schools for overbidding. Ofqual have so far facilitated this without quite blaming schools, saying ‘it’s not surprising’ that schools want to do the best they can for their students. But if it’s not surprising, Ofqual should have anticipated it and changed their approach, at least by allowing appeals. As it is, students and parents can only wait with trepidation to discover whether what the national authorities regard as fair will be fair for them.
Thank you Rob. It’s a really clear explanation of the situation. So many people are convinced there’s a normal appeal system this year, even my friend Nigel who works for an exam board!
My own son requires AAA for Oxford, coming from a lower performing state school.
2020 A Level grading issues: support group (ALGI) are campaigning for a fair appeals system that is free to use. English students must not face greater unfairness than Scottish students.
Very informative piece from Rob. Would add:
1 – Parents/Students/Teachers: to check if something looks awry in your CAG, defer to your original predicted UCAS grades. They are more stable and more predictable than CAGs.
2 – Uni’s, for your misses then defer to predicted grades. They give a much more accurate account of a student’s potential. Use our simulation (find it on @andrewdatahe) but with the health warning that UCAS is refusing to release 2020 tables.
3 – Sector needs to ask some serious questions of UCAS, your admissions service. From a select committee question it is clear that there were no conversations between DFE, OFS & UCAS about using predicted grades during this crisis. WTH?
“Attempting to spin the narrative”… another example… on the Today show, at about 8:15 on 10 August, Chief Examiner Tony Breslin described how “standardisation is nothing new, we’ve been doing it for years”.
That’s true. But what he “just happened” not to say was that in the past, this took place at the level of the entire cohort (or perhaps cohort within a board), and so over large numbers of students across the whole country.
This year, it takes place within each school, bounded by that school’s subject cohort – a cohort which could be quite small locally, and certainly vastly smaller than the whole-country cohort.
I wonder why he missed your absolutely central point that “students are competing within the school with their schoolmates for the limited numbers of grades available at each level”?
Andrew:
> Uni’s, for your misses then defer to predicted grades. They give a much more accurate account of a student’s potential.
This is a surprising claim, since predicted grades are not very accurate. For example, a 2011 DfE report found that “41.7% of all predictions were over-predicted by at least one grade”, and it seems likely that things are even worse now that there are more grades to choose from.
Could you say more about what you mean by “more accurate” here?
Andrew, I agree that UCAS should have been involved at a much earlier stage. 80-90% of what I wrote in today’s piece has been obvious for months; as soon as the ranking system was announced we knew it would be a huge risk.
Dennis, yes, for some reason Tony Breslin chose to say “nothing to see here” when of course the standardisation process is completely different this year. And the interviewer let him get away with it, as for the most part did the chair/former head of St Paul’s, who was mostly talking more sense.
Thank you for this – a well-written piece that eloquently sets out many concerns I have with this proposal. I would add a couple of points if I may.
On the Autumn exams, there is, I believe, another inherently unfair factor, which is timing. With the “retake” exams scheduled for October (date as yet unspecified), A level students may have only 6-8 weeks to get back up to speed on the whole syllabus. This is hard enough, but it seems worse when compared to the time which has been available to Ofqual (which happily allowed itself many months to come up with its model). And I note that Ofqual didn’t look to pressure themselves by bringing results day forward, which could have been helpful to students (allowing more time for matters such as revising, clearing and finding accommodation).
As an adjunct to the point on timing, what educational support is it envisaged will be provided to those students who take exams in the Autumn? These are upper sixth (UVI) students who have technically “left” school. Some students have had pens down since March and some schools may not have completely finished teaching the syllabus. Is the provision of support to these students by their “old” school feasible, especially given the challenges already being faced by teachers, unions and schools? Alternatively, is the preparation for the exams to be completely self-guided and self-taught? If the latter, I fail to see how that can be fair, especially in the short time frame available. Those students who can afford private tuition will fare better.
And whilst we are on the topic of costs, as well as the costs of the appeals, there are also the costs of the exams themselves to consider. Is there another set of exam fees to pay?
The point on natural justice is a good one, and one that Ofqual would do well to note as it seeks to be perpetrator, judge and jury. Crowd funding for legal action against Ofqual has already raised a considerable amount of money.
Given we have a government that purports to be in favour of levelling up, one might have thought the last thing to do when attempting to standardize grades would be to ensure the replication of disadvantage.
When Ofqual talks of fairness, it’s in denial of existing unfairness.
The impact is worse for students who lose an Oxbridge place; the UCAS deadline is earlier, so even if they prove their ability in an October exam, they would not be eligible for a 2021 application. Two unnecessary gap years, competing with younger kids actively studying their A level subjects. And in some courses, Oxbridge don’t like gap years anyway, due to students needing to keep the momentum going.
This system will severely impact on some of our brightest young people. It’s going to be a field day for the legal business.
anyone else stressed for results day lmao
3 more days until my fate is decided fml
For years the Tory education policy has tried to turn individual students into faceless data to be crunched. And now they have achieved it.
These are individual children. That one grade difference that they talk about so smugly will cost many students university places that they have slogged for. With no appeal for any individual. There should be a presumption of attainment in favour of assessed grades.
Ranking an entire cohort of students for ability is about as absurd as trying to rank the members of the Cabinet for incompetence.
100% agreed. The regulators have had since March to get this right for the sake of all candidates, as well as the integrity of the system.
A word to the wise, however: HEPI says it is a UK-wide organisation, yet at no point does the author use the word England to show the geographic limitation of his critique. What is “national” to some is regional to others. Please name the UK country you are talking about.
Thank you for an excellent piece. Covers the main points clearly and succinctly. All that is needed now is for those in power to read it slowly, carefully and more than once if needed, then make the needed changes. If they don’t, then I, and many others, will be calling for their heads to roll come Thursday. Sally Collier and Gavin Williamson are the first two that come to mind, though it would be good to see the internal communications to determine the proportions of blame and perhaps who should pay with their position. Or we could apply an analogy to Ofqual’s own algorithm when faced with limited data about this year’s issue – count the mistakes each has made in the last 3 years and sack the one who scores most badly. After all, that is fair… on average.
Excellent, if grim, article. Yet another spanner in the works is that several different teachers may have been involved in the grading/ranking process within individual subjects in individual centres, influenced by their own biases, conscious or unconscious. The person ranking may not even have been the same as the person ‘grading’. Add to that whatever process SLTs may have applied to the predictions of their own staff, based on whether they anticipated that downgrading was inevitable to prevent the equally inevitable inflation in CAGs. My own hope is that Unis, in their unfortunate desperation to maximise numbers, will indeed treat this year’s grades as fundamentally unreliable, let alone unfair, in many cases. This generation of youngsters is also more resilient and adaptable than they are often given credit for.
I have a question about the process. How is the prior attainment of the current cohort taken into account? I.e. for A level grades, how are the students’ actual GCSE grades taken into account? As a cohort-level adjustment, or will there be individual adjustment (i.e. a student’s GCSE grades directly influencing their A level grade)?
Andrew: Lots to be concerned about with the approach taken, but I query your suggestion that predicted grades are a solid benchmark to fall back on. Isn’t it widely known they are always massively over optimistic, by as much as 40%, largely because they have to match ‘standard offers’ quoted by universities which are themselves often inflated for marketing purposes? Hopefully this year’s chaos will be the final nail in the coffin for the broken pre-results application system.
Thank you for this. Does anyone know what will happen to the results of students under Special Consideration? The Gov website has only the information that the statistics will be released between Dec and Jan 21 – provisionally.
On: “> Uni’s, for your misses then defer to predicted grades. They give a much more accurate account of a student’s potential.
This is a surprising claim, since predicted grades are not very accurate. For example, a 2011 DfE report found that “41.7% of all predictions were over-predicted by at least one grade”, and it seems likely that things are even worse now that there are more grades to choose from.
Could you say more about what you mean by “more accurate” here?”
Yes I can see why that is surprising. UCAS predicted grades look awful. The mean and the distribution look nothing like achieved grades. It’s easy to find stats that make them look ridiculous. But predicted grades are misunderstood. They are estimates of the likely upper bound of attainment, roughly top quintile. Which is what you need if you are a university making conditional offers on potential. Also, be wary of assuming that a difference between predicted and exam-awarded grades means an error in the prediction. Exam-awarded grades have high inherent randomness themselves. Error, if you prefer.
You can control this upper-quintile property out. Further, university admissions are about individuals, not exams. The UCAS *set* of grades contains much more information than a single predicted grade. And the course that someone holds an offer with is very important too. Put these together in proper statistical models, which we have done for some, and you get a very accurate estimate of the grades someone would have got in exams. And because the predicted grades were collected as normal, pre-pandemic, they are unbiased and you can test their equality properties against previous years. We say more about this approach here – https://wonkhe.com/blogs/we-can-make-admissions-work-without-a-levels/
Thanks for this very helpful overview Rob.
The huge scope of the SQA/Ofqual job – everyone, in every exam – did make it more or less impossible. Perhaps they should have put more weight on deviations from what would have been expected from individual-level (rather than aggregate) prior attainment. But that wouldn’t have worked in all cases. Highlighting that exam-awarded grades have high randomness even in good circumstances would have been helpful and pointed towards different solutions. But overall – for what they have been asked to do in difficult circumstances – they have come up with a solution that has some terrible problems, but perhaps not more so than other things that could have been done.
Where things could have been much better is in university entry. The awarding bodies tend to overlook this not-incidental use of the grades, but it is where the urgent and large-scale impact on life chances falls. Here, careful processing of the pre-pandemic UCAS predicted grades was the perfect answer for the large majority of university applicants. We could have placed everyone fairly by now. The Government decision not to allow early confirmations was unhelpful, and the lack of a central large-scale effort to provide universities and schools with estimates of normal attainment based on the UCAS predicted grades has been a serious omission.
UCAS’ decision not to help students by releasing the predicted x achieved matrices, essentially a tool to know if their calculated grades have gone wrong, is as mystifying as it is damaging. The claim that this can’t be done without disclosing personal information for large numbers of students is unsubstantiated and simply not very credible.
The part of the equation which isn’t being given enough attention is the hastily introduced number controls for English universities in particular. These are choking off the ability of universities to use their judgement to redress the wrongs introduced by not having exams. You could award everyone A*/D* but that doesn’t solve the problem of not enough places. This is the key problem and one the Government still has time to solve by relaxing number controls.
As an outsider, one thing that I have not seen commented on (of course, I may have missed it) is how this policy will predictably worsen “not the best schools” as parents flee. There is no guarantee of what will happen in 2021 and plenty of reason to worry.
Jeremy:
Hi. I’m pointing to the difference between the definitive accuracy of predicted grades (a point-to-point mapping) and them doing their job of predicting what range of grades a student has the potential to achieve.
On the latter they are actually very good, and can be modelled to allow for fluctuations to a very high resolution level at every uni in the land.
UCAS holds years of data on this, and could have acted on its duty to fair admissions by releasing this data and equipping the sector to understand how predicted grades work across multiple dimensions and what normal achievement looks like. It’s really simple to do.
They have not released this data (and yes I think that is wrong), nor have they even discussed the power of them with DfE/SQA/OfS/Ofqual.
So modelling predicted to achieved grades is a very accurate way of understanding what an individual would likely have achieved in normal exam circumstances, much more so than the CAG.
Looking at the process, it strikes me that there is a potential for disabled students to be individually, if not systematically, disadvantaged if their ‘prior attainment’ has been limited by the unavailability of reasonable adjustments that would have been in place for their final exams, or which would have been grounds for appeal if they were not. Equally, ‘prior attainment’ might have been limited by significant periods of disability-related absence that would not necessarily be reflected in final results. There’s even scope for teacher bias in assessment of capability, as plenty of my disabled friends report having encountered it during their time at school.
As OFQUAL has provided no scope for reasonable adjustment in either the assessment process or the appeals process, this appears to place it in breach of the Equality Act 2010 on multiple grounds, particularly as EA2010 states that requests for reasonable adjustment must be considered on an individual basis and that treating disabled people identically to non-disabled does not guarantee equality.
Mark, Andrew: Thanks for the clear explanations of the predicted grade alternative, and the advantages of using predicted grades over standardised CAGs. (Thanks, too, for the link to the P/Q/A blog post, with the interesting discussion of what exactly is being measured.)
I have another question, if you don’t mind. Andrew proposes “for your misses then defer to predicted grades”, but I don’t see how this would work in practice. Suppose a university makes 80 conditional offers for 60 places, expecting that not all candidates will meet their conditions. Won’t deferring to predicted grades result in giving places to all 80 applicants, leaving the university over capacity?
For most universities this year the constraint on UK recruitment will be the number controls not physical capacity.
They have to take in the people who meet conditional offers through calculated grades – even if the higher grades take them over capacity (medicine is a prime candidate for this). Then they will take in people below conditions. This is where they can use info from predicted grades to fix any problems they see in the calculated grades. It is also where the restriction from the number control can prevent them from doing what they want, or from taking in good applicants who have found themselves in Clearing.
Ofqual presumes that it is God. When there are discrepancies between its grades and the teachers’, naturally it is right and the teachers are wrong, “Because we haven’t had time to train you how to grade the students”.
When a student has consistently got As in tests and mocks, and the teachers predict A, Ofqual can say “B”. It says “Nobody can know how the student can perform on the day”, so we don’t need to know about those As in tests and mocks, or the prediction (for cohort sizes greater than 15): B must be right.
The key issue is how reliably Ofqual’s algorithm can identify the correct A-level and GCSE entries for downgrading. Ofqual does not show this information to the public, and it tells us that its algorithm is the best. Imagine the Department of Health and Social Care forcing a vaccination programme on school children without any published data on efficacy and toxicity.
Has anyone seen any clear guidance about whether universities will honour their 2020 offers for 2021 if the Autumn exams are taken and the grades achieved?
Mark, thanks for your comments, and having tracked your own analyses in recent months I do think that your assessment of what could have been done by UCAS and universities, if government had allowed it, would have been a significant improvement. I agree that at the start it must have seemed like an impossible task facing Ofqual and the exam boards, so I wouldn’t criticise their first attempt as strongly as I do now. Ofqual have throughout been focused on national ‘integrity’ at the expense of individual fairness and natural justice. They could have listened and changed tack, but they have instead doubled down and failed to trust teachers and schools enough, if at all. Much of what was in my blog today was already evident months ago – I speculated then that CAGs would be completely ignored, and indeed they are for all cohorts >15. So Ofqual could have published a model early on, invited schools to ‘fill in the boxes’ as far as they thought fit for each grade and subject – but required them to justify their judgments by reference to the previous achievements of this year’s students (i.e. not previous years). That would have allowed time for consideration of ‘deviant’ cases, perhaps even with Ofqual issuing a revised model with different numbers for a second round. There could have also been early confirmation, still with scope for individual appeals, probably by schools on behalf of individuals where appropriate. But instead the secrecy and attempts to ‘reassure’ everyone have induced most if not all schools to guess what the model would be and react accordingly, as FFT datalabs have shown.
I don’t have enough grasp of the number controls to know how they will play out; I’d assumed that the 5% tolerance, combined with the lack of international students, encouraged by Clearing Plus, would shift significant numbers into the more selective universities at the expense of many at the other end of the spectrum, causing severe problems for a considerable number. But you know much better than me.
Huy, the problem is not with teachers’ ability to grade, but with their ability to put students in rank order. As you yourself have pointed out earlier, treating the rank order as sacrosanct is a big part of the problem, because of the huge uncertainty around whether student A can reliably be ranked, for example, No 5 or No 15 in a group of 20 awarded a B. That in itself is a major challenge to the confidence level with which the statistical algorithm can be used, and is no doubt part of the reason the Royal Statistical Society is now calling for a public inquiry.
Why did results day need to be left so late? No time for appeals or even a calm, reasoned decision about how to deal with the inevitable disappointment for many. Results should have been available back in July before schools broke up.
This, as several respondents are making clear, is a ‘car crash’ waiting to happen. Reassuring statements from ministers won’t cut it, particularly now that Nicola Sturgeon has effectively both disowned the SQA process and highlighted the parallel outcome about to hit the rest of the UK – which can only be accentuated by the Swinney announcement to the Scottish Parliament this afternoon moving all Scottish-educated students to the CAG grading model (with a no-detriment rule for those who benefitted from application of the SQA moderation rules).
The latest ministerial statement looks more like an attempt to pass the buck to universities who will struggle to hold places open until 07/09 for (all) those who might conceivably appeal. The concession on the ‘numbers control’ model is quite strict given that it is set at the threshold of the terms of the original conditional offer rather than the marginal point of acceptance set by course/school/faculty within the institution or the attainment level acceptable from a Clearing entrant.
Expecting ‘after the event’ fixes either to work at all or to meet any standard of procedural fairness looks highly implausible: anyone with even the slightest awareness of the operation of the awarding body/UCAS results service must be able to see that the optimum ‘time window’ for institutions to make adjustments to Confirmation decisions is in the days prior to official publication of results, not after that date. My comments yesterday on the blog from the UCAS Head of Policy at the end of last week refer: release of the CAG data in parallel with the OFQUAL-sanctioned grade detail was canvassed there as what could have been the optimum practical solution to the scenarios now emerging, if only a little strategic foresight had been applied across DfE, OFQUAL & UCAS.
A further issue, as Rob C mentions above, is that the CAG result will apply in the case of small cohorts in individual subjects within centres; this will almost inevitably benefit entrants for ‘minority subjects’ at A Level (e.g. Classical or minority modern foreign languages), which are most likely to be found in independent school settings rather than in comprehensive sixth forms (where such subjects may not be part of the curriculum ‘offer’, either for budgetary reasons or because of a lack of suitably qualified teaching resource), thus further ‘baking in’ educational advantage to this year’s outcomes.
As Rob C and others have suggested, potential obvious solutions to this ‘car crash’, such as Early Confirmation and/or the establishment of a ‘pause’ between results publication and the onset of Clearing (tentatively suggested earlier this summer in a blog contribution by Mary Curnock-Cook), both of which would have provided some space for review of the impact of the standardisation model, both for its consequences for individual progression and for its equality impacts, have been summarily rejected.
What is clear is that the Sturgeon/Swinney intervention will put the consequences of the OFQUAL results model on the front pages for at least the rest of this week and probably beyond, and move these debates away from the HEPI/WonkHE blog space occupied by HE policy wonks and related interested parties into a much bigger public space, placing both 16-19+ examination and assessment arrangements and HE admissions arrangements under the most severe public scrutiny, with perhaps all sorts of unforeseen consequences for the future of these long-standing models.
To date, standardisation has been used to moderate actual exam scripts. With no scripts to moderate this year, it has been adapted for a completely new purpose – generating the grade decision in the first place. Under GDPR rights, if the exam bodies are making these decisions about candidates through purely automated means, via the algorithm and by-passing CAGs, students can appeal against those decisions. The exam bodies have to ensure that, at the very least, students can express their point of view and have a human provide a review of the decision (most likely their teachers, as they know them best). That would seem ultimately to lead to the same outcome as Swinney has just announced for Scotland.
My son just received his CIE A-level results today – despite having received an A in AS Chemistry and an original predicted grade of A* he has been awarded a D. Do you have any insight into how this calculation may have happened?
Simon: a quick detail if I may, please.
Your point about the “small cohort” rule benefitting independent schools. That was my belief too.
But – and to me surprisingly – it’s exactly the other way around.
At the risk of being awarded the nerd-of-the-week trophy, I was told of a database that contains subject cohort data, for every school in the country – https://www.compare-school-performance.service.gov.uk
I spent an afternoon (!) analysing a spreadsheet with more than 600,000 rows, and this is what I found, based on 2019 data…
For GCSE, about 0.6% of students were in cohorts of 5 or fewer; and about 96% in cohorts of 16 or greater.
For A-level, there are a greater number of smaller cohorts, so about 8% of grades are in the <=5 category; about 67%, 16 or greater.
There's also a state/independent school split: for 2019 GCSE, about 21,000 students in independent schools were in cohorts of <= 5; about 32,000 state; for A-level, about 16,000 independent, 43,000 state. So state schools seem to be gaining more benefit from the 'small cohort = CAG' rule.
My spreadsheet skills might have let me down, so if there any other nerds out there who would like to check my results, please do! And apologies for any mistakes!
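If anyone does want to check, a rough sketch of the sort of calculation involved might look like this; the file and column names are placeholders rather than the actual headers in the compare-school-performance download, which would first need reshaping into one row per school per subject:

```python
# Rough sketch only: count student entries by cohort-size band and school type.
# "subject_cohorts_2019.csv", "school_type" and "cohort_size" are placeholder
# names; the real download uses different headers and needs pre-processing.
import pandas as pd

df = pd.read_csv("subject_cohorts_2019.csv")   # one row per school per subject (hypothetical)

bands = pd.cut(df["cohort_size"], bins=[0, 5, 15, float("inf")],
               labels=["<=5", "6-15", "16+"])

# Sum the entries (not the number of cohorts) in each band, split by school type
summary = (df.assign(band=bands)
             .groupby(["school_type", "band"], observed=True)["cohort_size"]
             .sum()
             .unstack("band"))
print(summary)
```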
Hi Sally,
Cambridge has stated in emails to offer holders and on its C19 FAQ that if you make the grades in the autumn exams they will honour the offer for 2021 entry. But, as far as I know, each university has its own policy on that.
Mark, on this GDPR issue: there is perhaps some human involvement in that the rankings on which the grades are based have been determined by teachers and the school. Perhaps a lawyer can advise! Or maybe the courts will be asked to decide the question.
Rob, quite sure that this argument is the one the exam authorities, as proxies for Ofqual, would use in court. I’m sure a lawyer could opine (and another might opine the other way) but in the end, as you suggest, a court would decide.
Sophie, I don’t know anything about the CIE grade-awarding process but I found this on the CIE website https://www.cambridgeinternational.org/Images/592001-explaining-our-statistical-standardisation-process-in-more-detail.pdf.
It seems that their process is similar to the one adopted by Ofqual, using results from the school/centre for the last 3 years, but also using global results from just 2019. I guess most of the comments in my original blog above will apply.
You might also note that the International Baccalaureate results early in July were also greeted with a storm of protest, as a result of which the IB introduced a special review process. The IB approach had some similarities with Ofqual’s method but did not use a computer algorithm. As a result of the protests Ofqual put the IB under scrutiny.
Dennis, on small cohorts: you are right in terms of absolute numbers – state school students outnumber those from private schools in small cohorts. However, I believe about 7% of UK school students are in private schools, whereas your numbers show about 40% of private school students in small GCSE cohorts, and about 27% of private school students in small A-level cohorts. So private school students benefit disproportionately from the small cohort approach.
Hi Rob,
I think the human involvement needs to come after the automated decision and qualify/inform/sanction that decision. In the case of the rank order, this was carried out prior to the automated decision in the form of a stack ranking/profiling of the immediate cohort. The automated grades were simply attached to the rank order – again via automated decision. Had the exam bodies generated the grades first and then supplied the pool to teachers to allocate, I think your interpretation would be correct. As it stands, it was an automated decision subsequently applied – via automated process again – to a pseudo-anonymised cohort profile. Under GDPR, an automated process produces what is in effect a recommendation concerning a data subject. It is only when and if a human being reviews and takes account of other factors in making the *final* decision that the decision would not be ‘based solely’ on automated processing. I think the exam bodies will need to tread carefully should rights be exercised under GDPR, as the majority of them enshrine these rights in their privacy notices. AQA, for example, states “You have the right not to be subject to automated decision-making and can require that any such decisions are reviewed by a human” – which brings us back to teachers again, as they are surely the only “humans” capable of understanding whether the automated decision/recommendation is the right one for the specific data subject (as opposed to simply ensuring no grade inflation, which was the original purpose of the standardisation algorithm). In attempting to do what’s “right” for OFQUAL, they have inadvertently neglected the rights of their own data subjects. It’s a potential legal minefield!
Hi Rob – yes, thank you for your point about small cohorts – you are of course quite right.
And Mark! Wow! The ‘I married an automaton from outer space’ concept is amazing – there must be all sorts of implications as regards AI… But the implications as regards this year’s process are absolutely intriguing…
Dennis,
Two things come to mind. Either they didn’t consider the GDPR implications of re-purposing an algorithm and dataset previously used to moderate actual exam scripts to generate original automated decisions by itself. Or they did consider it but thought in the chaos of Covid and lockdown, nobody would notice. Not sure how Williamson now apparently allowing students to use their mock grades instead of the standardised grade inspires great faith in their data endeavours. A mock is surely just the final result minus 4 months of intensive study and therefore bound to be a poor outcome for the majority of students. It seems a political aversion to re-instating CAGs might be behind all this nonsense. Maintaining the narrative that teachers ruined the process seems to be paramount.
Dennis – thank you for the number crunching. I am no statistician and appreciate it. May I ask: in absolute terms there are more state school students in the cohort category <=5, but is it correct to say that at an independent school you have a higher % chance of being in that cohort?
Ironically, introducing mocks as a “safety net” has sent us in the direction that the small cohort rule was designed to avoid. We have offspring in a small set. It’s a bit of an exam factory and, presumably to incentivise students, the school ranked mock results using their own internal bell curve (small cohort of high achieving students). A very decent % (which would normally fall well in A grade territory) came out considerably lower. We didn’t question the wisdom of this as it was a mock (the clue is in the name). We will question it next year – just as schools are to be focussing on opening, not only are they digging up the 2019/2020 mock results (and, I hear, there are appeals for remarks on their way already), but students and parents in 2020/21 and later years will also be closely monitoring mock results and how they are marked.
Mark, a very persuasive argument and a legal minefield as you say. Raises the intriguing possibility that Ofqual may have required exam boards to break the law by refusing an individual appeal process.
Rob,
I have been speaking to the ICO and they confirmed that whatever appeals process an organisation has in place, the appeals process enshrined in Article 22 of the DPA 2018 has to be adhered to in the case of automated decision-making such as the standardised grades generation this year. I asked this explicitly – Me: “Can a data subject request a human review of the decision and rectification of that decision if it is found to be wrong in the opinion of the human?”
ICO: “You have the ability to request that, yes”. In my opinion they have misunderstood DPA rights under the assumption that the relevant relationship is between the individual data subject/student and their school as data controller. The moment they decided to by-pass CAGs for larger cohorts and generate grades from historical data, this became an automated decision process, covered by Article 22. It is there in plain sight: “…the data controller shall implement suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision.” The only way I see for exam bodies to meet their obligation under Article 22 would be to reference the CAG, as these are precisely the human input the process lacks at present.
Mark, interesting, but it may mean not that you can appeal against the grade but that you can appeal against your position in the ranking … with potential conflict with the obligation to protect other people’s data.
thanks to both Dennis & Rob for their comments re small cohorts and who stands to benefit.
Listening to the minister on the Today programme this morning, it’s clear that the real intention behind last night’s supposed concession is to double down on the standardisation algorithm and to place long-standing concerns about grade inflation above the futures of individual students in these unprecedented times. It also signals continuing resistance to a reliance on teacher-led assessment, despite the indication back in March that this would be at the centre of this year’s arrangements through the CAG as moderated by SLTs within individual centres.
The ‘car crash’ we can expect to see play out as the day progresses tomorrow will perhaps sharpen the comparison between the position in England and that in Wales, where AS levels within the broader A Level have been retained, providing a bedrock of 40% of assessment by the examining board to fall back on, backed by a largely modular approach to A Level which will again provide access to appropriate assessment tools. Perhaps Gove and Cummings might like to explain tomorrow why linear summative examinations as the sole basis of assessment are such a good idea in the context of supposed cross-government public health preparations for pandemics.
Institutions are now left needing to balance their Clearing operations with receipt of updated Scottish results – not available for another week (on 21/08) – and an appeals process for A Level on which, this afternoon, there is literally no supporting information or timetable and absolutely no idea of the number of students schools and colleges may be putting into the process, all in the context of the reintroduction of SNCs, which might yet need to be loosened to take account of this wholly predictable mess.
Rob,
I believe that it is the grades that have been generated by the algorithm, not the rank order. This profiling was created by the schools and used by the algorithm in a pseudo-anonymous way to distribute the grades it had generated. The grades themselves, however, came from the data used, and this standardisation data was repurposed from its original function of contextualised cohort referencing. But neglecting to incorporate the CAGs for >15 cohorts removed the qualitative judgement that could have given the human dimension required under Article 22. From the data subject’s point of view, if the algorithm comes up with a 3 and matches that 3 to their rank order, they will get a 3 without any human input into that grade. Appealing under GDPR rights would necessitate a human looking at the grade and considering “is this a fair decision for the data subject?” If not, what is? CAGs were submitted to the exam bodies as a fair reflection of students’ abilities and predicted outcomes were they to have sat exams in June. It is hard to think of a better qualitative judgement of a student than that of their own teacher, but this is exactly what has been removed from the automated standardisation process for large cohorts. Students won’t be challenging rank order; they will be challenging unmediated machine-generated grades. The ICO confirmed that this human mediation has to happen after the automated decision, which can only happen at appeal now. And they have to rectify the error once reported to them – presumably by referring to the only relevant qualitative judgement to hand: the CAGs.
To put it another way: ‘Is this a fair decision?’ needs to consider whether having the data subject’s grades extrapolated from historical cohort data is more appropriate for the individual than a qualitative judgement based on that individual’s personal learning trajectory. When the query is raised through complaint or appeal, it is the best outcome for the individual subject that needs to be resolved. This takes the focus away from preventing grade inflation (which is cumulative across the cohort) back to the individual data subject, who should surely have been the main priority from the outset.
I asked a question a couple of days back which got lost in the subsequent comments.
How does the A level algorithm take previous GCSE results into account? At individual student level, or as a cohort-level moderator?
Alison, GCSE results are part of the evidence schools were asked to consider, but they only sent a Centre-Assessed Grade and a ranking to the exam boards. The algorithm itself uses ranking, and sometimes CAG, plus the last 3 years of A-level results, to deliver its verdict. We will know more when Ofqual publishes the algorithm today. An appeal on the grounds that this year’s cohort is untypical of the last 3 years might use GCSE grades as part of its evidence. There are no individual appeals and GCSE results are not a cohort moderator, except in the sense of being part of a cohort appeal.
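To make the mechanics a little more concrete, here is a minimal sketch (in Python) of how a purely rank-based allocation from a historical grade distribution could work. To be clear, this is an illustration of the general idea only, not Ofqual’s published model: the real standardisation also adjusts for the cohort’s prior attainment and blends in CAGs for small entries, and every name and number below is invented.

# A toy illustration only -- NOT Ofqual's actual standardisation model.
# Assumption: the centre's last three years of results fix this year's
# grade quotas, which are then dealt out down the submitted rank order.

from collections import Counter

GRADES = ["A*", "A", "B", "C", "D", "E", "U"]  # best to worst

def historical_distribution(past_grades):
    """Proportion of each grade across the centre's recent results."""
    counts = Counter(past_grades)
    total = len(past_grades)
    return {g: counts.get(g, 0) / total for g in GRADES}

def allocate_by_rank(ranked_students, past_grades):
    """Assign grades to students (best-ranked first) from historical quotas."""
    dist = historical_distribution(past_grades)
    n = len(ranked_students)
    quotas = {g: round(dist[g] * n) for g in GRADES}
    # Push any rounding difference into the lowest grade so quotas sum to n.
    quotas["U"] += n - sum(quotas.values())
    # Lay the grades out best-to-worst and deal them down the rank order.
    grade_pool = [g for g in GRADES for _ in range(max(quotas[g], 0))]
    return dict(zip(ranked_students, grade_pool[:n]))

# Invented example: 5 ranked students, graded from 15 historical results.
past = ["A"] * 3 + ["B"] * 6 + ["C"] * 6
print(allocate_by_rank(["rank1", "rank2", "rank3", "rank4", "rank5"], past))
# -> {'rank1': 'A', 'rank2': 'B', 'rank3': 'B', 'rank4': 'C', 'rank5': 'C'}

The point of the toy example is simply that, once the quotas are fixed by history, a student’s grade is determined entirely by where their teachers placed them in the rank order.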
Thanks. It’s obviously misreporting then; I have repeatedly read in the papers in the last few days that prior attainment of the current students was one of the factors formally taken into account in the grading. I was curious because I hadn’t picked this up from the guidance previously. Blooming journalists!
My son got downgraded grades. He has offers from renowned universities. He was motivated enough to get into the university and meet the offer. No supercomputer can determine the motivation and effort put in by my son and his teachers, who knew he would get in. When there was a system in place to determine offers, why was that not used? The only thing that needed to be addressed was oversubscription: if a university became oversubscribed, it could then have decided to hold online exams or Skype interviews to select students. Job done! I am sorry, but this fiasco was not needed at all. The emotional stress to both students and parents was uncalled for.
Despite the horrific COVID-19 pandemic, my daughter still wants to become a doctor, even if it entails serious health risks from poor PPE etc. The fiasco of downgraded A levels has left her with a significant problem of whether to repeat Year 13.
Will medical schools consider policy changes for applications based on 2021 resits?
Most medical schools don’t accept resits. Institutions that might consider resits, i.e. repeating Year 13, will only accept a reapplication if the first sitting achieved quite high grades, e.g. AAB.
If a student was downgraded to ABC and cannot appeal, does this mean the student will have no option of reapplying for medicine?
Or does a student in this situation need to do another degree to gain a future opportunity to apply for medicine?
Rob, Thanks for your illuminating article.
Regarding using GCSE results to predict the A level CAG, I can’t find any official guidance except the ‘previous grades’ for re-sitting pupils. Perhaps this comes under the ‘holistic approach’. This is particularly pertinent if a pupil has not done the subject before A level. Do you have any advice for a student who has consistently done better than some classmates in homework, tests and mock exams, but then received a lower grade? My research has not found a reasonable explanation, but I hope you might.
Hi, I’m writing this in the middle of the night, sleepless because of the grades I have been given.
In Year 12 I received ABB in my mocks.
In Year 13, BBC, after taking the UKCAT and BMAT entrance exams for medicine.
My UCAS predicted grades at the end of Year 12 were A*AA. However, the CAGs forwarded to the exam boards were BBC, which were then downgraded to BCC!!
I emailed my teachers and requested:
1. The grades they individually forwarded for the CAGs
2. My rank order within my grade
I was denied this information.
I have filled out an appeal form, but I don’t know what I am appealing: did the centre-assessed grades lower the grades my teachers submitted? Did they use my name or my candidate number – is there bias or discrimination involved?
I don’t know what to do. My teachers would always affirm that I would get AAA in my exams, or at the very least AAB. My university has withdrawn its offer and now I have no university to attend.
I need to know whether the grades submitted by my subject teachers match my CAGs, or whether they were lowered before being sent off to the exam boards – can anyone help me, please?
I’ve tried emailing the exams officer to no avail. Should I fill in an FOI (freedom of information) request? Any help, please!
@Alison, I am sorry to hear of your disappointment. You clearly need to know your CAGs, and your school or college should provide this information. They should also have a process for supporting you with your results and progression options, and for helping you understand whether there are grounds for appeal (although at the moment this is unknowable, as the appeal criteria have been withdrawn). It is possible the centre is appealing its whole cohort’s results.
You should request your CAGs formally in writing, to whatever main office address you have for your school or college (or perhaps write directly to the principal or head). If you are refused, identify what formal complaints process exists and follow it precisely. Remember to keep all correspondence polite and specific, and follow any procedures carefully. Also recognise that centres will be busy and under considerable pressure, so there may be a wait. Good luck.
Hi Alison, I am the mum of a daughter in the same situation and have been trying to take stock of what has happened and the best way forward.
I believe the school should and will send you your teacher-calculated grades, but it might take one to two weeks.
You can make a freedom of information request; however, if they are inundated with school business they can decline the FOI request under the law, especially with the pandemic as a reason.
I do think the teachers who calculated the grades should take responsibility and explain why the grades have been calculated like this. Bear in mind that they had no choice but to rank students, as forced on them by the exam boards and government.
So you may have been ranked down by teachers against their professional judgement of the grade you deserved, and then downgraded a second time by the exam board based on historic data.
For example, they had to rank in order the students who would get an A, and if you were number 7 on the list you would probably be awarded a grade one or more levels lower; they could not put all seven students at ranking 1, which would have increased the likelihood of a top grade.
You cannot appeal if you don’t have the teacher-calculated grades and ranking to begin with, so I feel that this could be considered discriminatory against a fair position from which to appeal. I also feel the ranking forced on teachers by the exam boards is discriminatory, as essentially you are competing against your peers, which is not what should happen in a fair system. I would advise you to read about this.
With regard to medical school, putting aside the grades fiasco and the possibility of no appeal, or certainly a long delay, one worrying issue is that you need AAB to be allowed to resit another Year 13 for a medicine reapplication. Anything lower needs extenuating circumstances. I would hope the medical school boards will agree to change their resit policy on the basis of the pandemic being extenuating. So all may not be lost.
I would strongly recommend considering sitting your best subject in October as a starting point.
If your university is unable to offer you a medicine place this year, you may need to consider reapplying for medicine for 2021 and repeating the UCAT/BMAT, which I know is overwhelming. Doing this will show determination and resilience.
To avoid all this stress and mess, you could choose to accept a healthcare-based degree, of which there are many, and apply for medicine at a later date; then you don’t have to go through repeating A levels, as those results would be irrelevant. You could be an amazing doctor following this route.
As a nurse myself, I completed a biology degree after my nursing qualification and then started to apply for medicine as a mature student, as I was always thirsty for learning. So what I am saying is that you can have an exciting future by starting a bit differently.
Good luck. Just remember not to fall out with your school, as you will need references, you may decide to repeat Year 13, and mostly it wasn’t the teachers’ fault.
Hi Alison again.
As you are aware, the government has made a complete U-turn on its decision and is now awarding CAGs.
However, these are still unfair, as they adhered to Ofqual’s guidelines by using rank order to award grades to students. Furthermore, teachers often went against their own better judgement to align with previous historical data and minimise moderation by the exam board.
As such, I still don’t agree with the CAGs my teachers awarded me – BBC, which is exactly the same as my Year 13 mock exams. I am not able to get into my chosen university, but that’s beside the point now. I’m trying to get my CAGs modified or changed to something:
1. reflective of my hard work and perseverance throughout my two years; 2. as close as possible to my Year 12 mocks (ABB) or my predicted grades of A*AA.
Could you please let me know the steps I should take, if this is even possible? I have a phone call with the deputy head of school tomorrow morning.
What should I say? I want to be polite but adamant; it’s a fine line, so any advice is welcome.
I have spoken to the headteacher about my grades and he has said there is no way he can change them unless it’s for:
malpractice on the basis of discrimination, or
an administrative error, which he said the school did not make.
I requested that the grades submitted be double-checked, but he said no, because a rigorous process was followed.
I don’t know what to do – the system has forgotten about students like me who have been held back by their CAGs.
Alison, if neither your CAGs nor your algorithm-awarded grades are what you need for entry to the course you want, then there is probably nothing you can do except take the autumn exams if you want to change this year’s grades. Your school headmaster has told you correctly how things stand; the school has no grounds on which to make an appeal. You really need to get expert help and advice on how to move forward from someone who knows your situation; the school is usually the best source of advice. I hope you can find a good way forward. Best wishes
Thank you for your response – I just don’t understand how that is fair at all. I believe my teachers under-predicted me and my capabilities in order to meet Ofqual’s requirements, which I think is unfair.
Do you think that if I asked them to look at my work again and try to find fault, they would, so that I could go up a grade? Would that affect the rank of other students and thus their grades?
The school has refused to give me information on my rank order as well. So I was thinking: could I make a subject access request to see what evidence was used to grade me?
Hi Alison, my daughter is in a similar position. I don’t think schools will be allowed to appeal CAGs, in part because, unless bias or discriminatory marking is proven, it would lead to tens of thousands of students appealing CAGs while teachers are inundated. If anything changes with regard to these types of appeal, it won’t happen in the near future unless schools admit to forced ranking-down of students prior to submission. You could consider sitting at least one of your subjects in October to aim for an A* if you can, but that would be alongside repeating the UCAT and BMAT. I’m guessing most students wouldn’t manage sitting three A levels in October, and as for resits next summer, I think medical school places may be reduced, making it difficult to gain a place. I personally think you should have a serious think about doing a related healthcare degree, apprenticeship or science degree and reapplying as a graduate in future. It would add wealth to an amazing career in medicine, as you would be a more all-rounded doctor. I don’t think you will make progress in the current arena of appeals, so take a different route, but one that you would enjoy too.
Hello,
My daughter is very disappointed with her A level results. She feels that she was given grades based on her mock exams rather than on her overall work effort and the level of the work she was submitting. We have informed the school that we want to ask for a formal review and for details of the grades/marks she has received through the year.
So far, apart from an acknowledgement email, we have not been given any information.
The grades she was given by her teachers mean that she has lost out on her university place.
What can my daughter do in this situation? What are her options and steps for challenging her A level grades?
My daughter has specific learning difficulties and I am not sure if this was taken into account. Additionally, just before she was due to take her mock exams she suffered a physical injury and her mental health was really low.
Since then she has worked very hard on improving her mental health, and I feel the recent events are dragging her down again.
I would appreciate any information or advice
Kate, I’m not even sure what the appeal process is now, but in any case the school would have no grounds for appeal, unless there was an arithmetical or administrative error, or obvious evidence of bias or discrimination. Even if the school changed its view of the CAGs, which seems unlikely, the only prospect of changing your daughter’s grades would probably be to take the Autumn exams. But I don’t claim to be expert on these issues, you really need help from the school or some other professional adviser who knows your daughter’s situation.
Thanks Rob, a very informative article.
Unfortunately our son was a victim of the algorithm: incorrect data was supplied to WJEC by his large FE college, and then there was the regional variation in Wales, where the AS grade was the safety net for A level grades, not mock results as in England.
One question which the college has failed to answer for us is this: ‘Why was he not awarded a new AS grade, as he was due to resit one paper which he messed up last year?’ We would have thought he should have been re-graded for the AS overall. Shouldn’t the school and exam board have been obliged to award him a new June 2020 AS grade, independent of his overall A level grade?
Steve, apologies for the delayed response, but this kind of detail is outside my experience, I’m afraid. You’d need to look at the small print, probably in the guidance (which should be publicly available) from Ofqual/exam boards to the FE college and the school (you mention school and college – perhaps your son left school after AS level and went to the FE college?). If incorrect data were supplied then there are presumably grounds for the college to submit a correction. And presumably it would be for the college to appeal – if it can – on the grounds that the AS paper was to be retaken.
However, since Wales eventually went with CAGs, is the AS level outcome still relevant? It presumably depends on which subjects your son is doing now and whether he has gone on to do the relevant A-level following the AS. Presumably it might also be relevant if his preferred option after A-level became unavailable this year after results day but before the U-turn to CAGs (as it did, sadly, for numbers of students e.g. in medicine). In any case, I hope things work out for him now. Best wishes.
Hello,
I hope someone could guide and advise me. My son is currently on an EHCP (education, health and care plan) and doing A levels. He did his GCSE exams last year and received a grade 3 for English Language, but his actual marks were 1 mark below the pass grade for paper 1 and 4 marks below the pass grade for paper 2. Since starting Year 12 he hasn’t received any of the support laid out in his EHCP; his needs are a delay in speech and language, and he should get more support for literacy. The school did not provide any support and did not give any lessons after the resit in November, just six weeks of lessons.
The school gave him a grade 3 again. The other resit boys, on the Level 2 course, received lessons and were given a pass grade, but my son wasn’t given the same opportunity to be assessed. He has been unfairly discriminated against, because the teachers didn’t have any evidence to support his grade. I have complained to the school and the council, but the school said they will provide the support from now on and are asking him to sit the exam in the autumn. This is ridiculous: he hasn’t had any lessons for nearly a year and they expect him to pass the test. I have been advised by other EHCP advisers that I have a case for malpractice under bias and discrimination; if my son had been offered the support and lessons he is entitled to, he would definitely have passed the exam, because he only missed it by a few marks. They completely ignored him.
I also had a meeting with the deputy head to clarify the misinformation given by the school that he was offered lessons in a small group and that support was given. I have cleared up all these misunderstandings, but they are still not going to appeal, because that would go against the teachers’ professional judgement. But this is about unfairness, and the work he did before his resit was reasonable; I have all the evidence: his previous exam results, emails from last September chasing support and lessons, and his English exercise book, where there is work marked at grades 4 and 5. I am in the process of complaining to the exam board, and I would really welcome some expert advice on how to pursue this as malpractice. He wasn’t given extra time to finish all his work in class, so the school did not make reasonable adjustments and did not take into consideration all the issues I have raised. This is clearly educational negligence; the teachers thought that just because he has language issues he wouldn’t progress, even though he has improved and missed the last exam by only a few marks. He needs this grade in order to progress to university next year.
Thank you.