A British fiasco derived from an algorithm

UK exam U-turn exposes algorithm's deep flaws (Reuters)

It was a British fiasco, but Prime Minister Boris Johnson termed it a "mutant algorithm". The fiasco concerns the GCSE and A-level exam results of millions of pupils. Though both exams are run and managed by British authorities, they affect thousands of students around the world, including in Bangladesh.

According to Dhaka Tribune, about 8,000 Bangladeshi students received their A-level grades this year, and a similar number got International GCSE results. As the COVID-19 pandemic made holding any examination impossible this year, the results were based on a mathematical calculation, known as an algorithm, which caused a national scandal. Hence, PM Johnson, after almost two weeks of silence, told pupils at a school, "I am afraid your grades were almost derailed by a mutant algorithm and I know how stressful that must have been."

Initially, an algorithm was used to determine A-level grades for about 700,000 students this year, but it was scrapped after a nationwide outcry once serious problems were detected. Government ministers initially defended the grades produced by the algorithm, saying it was a world-class procedure. But following widespread anger over the major flaws in algorithm-based grading, the government made a U-turn and decided to use teachers' predicted grades instead.

The GCSE results were delayed so that algorithm-awarded grades could be replaced following the A-level fiasco. The sudden, last-minute change in the GCSE results of more than four million school-leavers means more generous grading, which has been described as grade inflation.

In England, the official exam regulator Ofqual is responsible for awarding grades, and this year it asked teachers to supply, for each pupil in every subject, an estimated grade and a ranking compared with every other pupil at the school given that same estimated grade. These were put through an algorithm, which also factored in the school's performance in each subject over the previous three years.

The idea was that the grades this year - even without exams - would be consistent with how schools had done in the past. Ofqual said this was a more accurate way of awarding grades than simply relying on teachers' assessments. The rationale behind Ofqual's preference for the algorithm was that teachers were likely to be more generous in assigning estimated marks, which might have led to a much higher number of pupils getting top grades.
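To see why tying this year's grades to a school's past results disadvantaged strong pupils at historically weak schools, here is a minimal sketch in Python. It is not Ofqual's actual model, only a toy illustration under simplified assumptions: each school's past grade distribution becomes a fixed quota, and pupils are slotted into those quotas by teacher ranking. The pupil names, grade shares and quota logic below are all made up for illustration.

```python
# Toy illustration of rank-based grade standardisation -- NOT Ofqual's model.
# Assumption: a school's grade distribution this year must match the share
# of each grade it achieved over the previous three years.

GRADES = ["A*", "A", "B", "C", "D", "E", "U"]  # best to worst

def allocate_grades(ranked_pupils, historical_shares):
    """Assign grades to pupils (best-ranked first) so the overall
    distribution matches the school's historical grade shares."""
    n = len(ranked_pupils)
    # How many pupils each grade is "allowed", based on past results.
    quotas = [round(share * n) for share in historical_shares]
    results = {}
    grade_idx = 0
    for i, pupil in enumerate(ranked_pupils):
        # Move down to the next grade once this grade's quota is exhausted.
        while grade_idx < len(GRADES) - 1 and sum(quotas[:grade_idx + 1]) <= i:
            grade_idx += 1
        results[pupil] = GRADES[grade_idx]
    return results

# A historically weak school: no pupil has achieved an A* in recent years.
history = [0.0, 0.1, 0.2, 0.3, 0.2, 0.1, 0.1]   # shares of A* .. U

# Teachers ranked Ayesha first and predicted her an A*.
pupils = ["Ayesha", "Ben", "Chloe", "Dan", "Elif",
          "Farah", "George", "Hana", "Imran", "Jo"]

print(allocate_grades(pupils, history))
# Ayesha gets an A at best: the school's history leaves no room for an A*.
```

In this toy run, the top-ranked pupil is capped at an A because the school's recent history contains no A* grades, which mirrors how a bright student at an underperforming school could be pulled down regardless of their own ability.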

Once the A-level grades were announced, it emerged that nearly 40% of students had received lower grades than their teachers' assessments. More shockingly, the downgrading affected state schools far more than privately run schools. By basing the grades so heavily on previous school performance, a bright student from an underperforming school was likely to have their results downgraded through no fault of their own. Likewise, a school in the process of rapid improvement would not have seen its progress reflected in the results. Because private schools are selective and better funded, they perform well in exam results in most years, so an algorithm based on past performance puts their students at an advantage over their state-educated peers.

The English fiasco happened within two weeks of the Scottish experience, in which algorithm-based results for the Scottish Highers, a qualification comparable to the A-level, were overturned by the government as soon as the fault was detected. But the government in London, responsible for exam results in England, Wales and Northern Ireland, seemed reluctant to take lessons from Scotland and insisted that its algorithm was a robust one.

Prime Minister Johnson was on summer holiday, and his silence caused widespread anger. One tabloid not known for political journalism splashed a single-word headline, 'Missing', with a manipulated caricature of Mr Johnson, asking its readers, 'Have you seen him?' The fallout of the scandal continues, and the National Education Union (NEU) called Mr Johnson's "mutant algorithm" comments "brazen" and accused him of trying to "idly shrug away a disaster that his own government created".

The results fiasco also caused considerable logistical problems for universities. Some students who had lost out on their first-choice course and university because of lower grades rushed back once the grades were revised, causing oversubscription at many universities. This forced the government to lift its cap on the number of students each institution can admit. But admitting more students means tackling other challenges, such as capacity, staffing and facilities.

Though the admissions cap and universities' advance course offers have no direct impact on international students, including those from Bangladesh, the grading fiasco had unsettling effects on many Bangladeshi families. Many of our friends and relatives made their child's results known only when the revised grades were announced. The obvious reason was that the initial results were not what they had expected.

The fiasco raises questions about the oversight of algorithms used at all levels of society, ranging from very basic ones to complex examples that utilise artificial intelligence. Tech giants like Facebook, Twitter and Google all use algorithms, and whatever we see in our newsfeeds on social media platforms is chosen by such mathematical tools.

The results produced by the algorithm left everyone unhappy, and the Office for Statistics Regulation (OSR) has said it will conduct an urgent review of the approach taken by Ofqual.

The episode also underlines the sense of powerlessness felt by students disappointed by their results. Many experts now want to find a human way back, instead of letting computers decide such crucial things for us. One may wonder whether that desire will extend to other things too.

Kamal Ahmed is a senior journalist and columnist