The ASC of Umalusi takes flawed examinations very seriously and holds the assessment bodies accountable for any blemishes that might undermine the integrity of the Grade 12 examinations.
EVERY year, just over 900,000 Grade 12 pupils sit for the National Senior Certificate (NSC) exams in South Africa. These school exit exams are high-stakes: they compress years of learning and combine academic progress, economic opportunity, and social judgement into a single outcome. In simple terms, they can become less about what a candidate knows, and more about what that result allows or blocks the candidate from doing next.
They determine a candidate’s admission points score, university admission, course selection (medicine, engineering), bursary funding opportunities, and employment prospects for those not pursuing tertiary studies. In South Africa, the NSC exams are administered by three examining bodies: the Department of Basic Education for public schools (about 900,000 pupils), the Independent Examinations Board for private schools (just over 11,000 pupils), and the South African Comprehensive Assessment Institute, which serves adult learners and home schoolers (under 5,000 candidates).
Although the scale and capacity of these exam systems differ greatly, all three assessment bodies have some history of fielding Grade 12 exam papers that contain errors. All Grade 12 exams (and their marking guidelines) are based on the national Curriculum and Assessment Policy Statement (CAPS), and are guided by specific Subject Assessment Guidelines (SAG).
Each assessment body appoints a panel of examiners for each Grade 12 subject, and an internal moderator evaluates the quality and standard of the exam paper. Each draft exam paper is then sent to Umalusi, the standards-setting and quality assurance watchdog for all school exams in South Africa. Umalusi appoints independent moderators to evaluate draft exam papers and provide feedback on their standard and quality, including each paper's compliance with the SAG and its comparability with previous exams.
These moderation and accountability protocols are meant to ensure that the final paper is pitched at the appropriate cognitive level and is error-free. The ultimate responsibility for ensuring that the final exam paper is flawless lies with the assessment bodies, which undertake the mass printing of the master copy of each paper. Despite these rigorous and robust processes, all three assessment bodies have, to varying degrees, failed to ensure that the long-awaited exam paper a Grade 12 candidate encounters is error-free.
These errors can range from minor typographical mistakes and ambiguous phrasing to major conceptual errors, incorrect data, missing or unclear diagrams, or questions that are simply unsolvable. When pupils encounter an error, it disrupts concentration and time management: they waste time trying to “fix” the paper instead of answering it. Cognitive load, or thinking under stress, increases, and a pupil's working memory is split between solving questions and interpreting flawed ones. Strong candidates may overthink; weaker candidates may panic and disengage.
Unfortunately, performance becomes less about knowledge and more about how well a pupil handles confusion under pressure. The psychological impact is that errors amplify the pressure inherent in such high-stakes exams: they cause unnecessary trauma and anxiety, and can lead to a loss of confidence and even poorer performance on later questions. Errors do not affect all pupils equally. Better-prepared pupils (usually in well-resourced schools) may have been trained to work around them.
The problem is likely to be compounded for pupils who experience language barriers – what is “slightly unclear” for one pupil, may be incomprehensible for another. Errors on exam papers compromise the validity and reliability of the exam, as the exam no longer measures what it intends to measure, and results become inconsistent, especially when performance depends partly on how pupils interpret flawed questions. For borderline candidates (for example, university admission thresholds), even small disruptions can have life-changing consequences.
To address these recurring errors, Umalusi and the assessment bodies follow strict policies and procedures. At the marking stage, identified errors are interrogated, and the marking memorandum is adjusted according to their nature. If a question was ambiguous, the option might be to accept multiple answers or to award marks generously. While this may sound like a fair approach, it assumes that all pupils attempted the question; those who did not attempt it because of the unintended difficulty caused by the error lose out. If a question is fundamentally flawed, its marks may be awarded to all pupils regardless of attempt, or the question is eliminated and the paper's total is reduced by the marks of the offending question.
Again, this may appear fair, but it artificially inflates marks. The Assessment Standards Committee (ASC) of Umalusi, a committee I sit on, assesses all available information, including examiners’ reports, moderators’ reports, internal post-exam analysis reports, and aggregate cohort data, to determine the effect of such errors. Each subject and its associated problem areas are addressed on their own merits. After rigorous interrogation of all the data, Umalusi makes a standardising decision: to leave the raw marks as they are, to adjust them upward, or to adjust them downward, in accordance with its standard norm-referenced mark adjustment principles.
The fundamental Umalusi principle is that pupils ought not to be advantaged or disadvantaged by the exam paper they encounter in a particular year. Corrections happen at a systems level, but cannot compensate for in-exam stress and anxiety. In essence, fixes like moderation and standardisation are damage control, not justice. They stabilise the system, but an exam error does not just cost a mark or two: it can shift the assessment from a test of learning to a test of resilience under flawed conditions. That is not what a high-stakes national exam is supposed to do.
The ASC of Umalusi takes flawed exams very seriously and holds the assessment bodies accountable for any blemishes that might undermine the integrity of the Grade 12 exams. The South African public can take confidence, though, in the fact that the Grade 12 exam system, and the quality and standards of the papers, have improved significantly in recent years, and that errors on individual papers, in the main, have been relatively minor.
** The views expressed do not necessarily reflect the views of IOL or Independent Media.