Risk Subdomain: 1.3. Unequal performance across groups
Risk Domain: Discrimination and Toxicity
Entity: AI
Timing: Post-deployment
Intent: Unintentional
Incident Reports

Nearly 5 million GCSEs will this week be awarded using a controversial model which education experts fear could lead to even more results being downgraded than in last week’s A-levels fiasco.
According to analysis shared with the Observer, …

GCSE students in England, Northern Ireland and Wales are receiving results based on teacher assessments, after a last-minute change to the system.
They were originally due to receive marks worked out in a mathematical model, or algorithm, b…

The UK has said that students in England and Wales will no longer receive exam results based on a controversial algorithm after accusations that the system was biased against students from poorer backgrounds, Reuters and BBC News report. T…

Ofqual was warned at least a month ago of flaws in the exams algorithm that left thousands of students devastated, but the regulator pressed ahead amid longstanding ministerial pressure to prevent grade inflation, the Guardian understands.
…
Ofqual’s chief executive, Sally Collier, is expected to be hauled before MPs early next month to face questions about the exams fiasco, it has emerged.
Collier has made no public appearance or statement since the exams regulator announced i…

For such a short string of algebraic symbols, there is a lot we can learn from Ofqual’s grading algorithm (though really it is an equation) – and a lot we can learn about what went wrong.
First and most obviously, the size of the algorithm …
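For reference, the centre-level standardisation equation from Ofqual's 2020 technical report, as widely reproduced in coverage of the incident, is usually written as

P_{kj} = (1 - r_j) C_{kj} + r_j (C_{kj} + q_{kj} - p_{kj})

where P_{kj} is the predicted proportion of pupils at centre j achieving grade k, C_{kj} is the centre's historical grade distribution, q_{kj} and p_{kj} are predicted distributions based on the prior attainment of the current and previous cohorts, and r_j is the proportion of the current cohort with matched prior-attainment data. The notation here is an editorial gloss on that published equation, not text from the excerpt above.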

Gavin Williamson, U.K. defence secretary, arrives for a weekly meeting of cabinet ministers at number 10 Downing Street in London, U.K., on Tuesday, April 23, 2019.
Britain is in the throes of a nationwide grading debacle after an automated…

When algorithmic harms emerge, a reasonable response is to stop using the algorithm to resolve concerns related to fairness, accountability, transparency, and ethics (FATE). However, just because an algorithm is removed does not imply its F…
Variants
Similar Incidents
Racist AI behaviour is not a new problem

Biased Sentiment Analysis

Northpointe Risk Models