8 myths about paper assessments debunked

on Wednesday, 26 February 2025. Posted in Paper based exams

Explore eight common myths around paper assessments and learn how automated marking supports quicker results turnaround and delivers significant cost savings.

Despite some awarding bodies moving towards on-screen, computer-based assessments, traditional paper assessments remain an important method of evaluating student knowledge, especially for high-stakes testing.

In this blog, we examine eight common myths about paper-based assessments and discuss how the automated marking of paper assessments can support the quick turnaround of exam results and deliver significant cost savings.

Quick links

Myth #1: Paper is inefficient for large-scale assessments
Myth #2: Paper exams are not secure
Myth #3: Paper tests are prone to cheating
Myth #4: Paper assessments disadvantage students
Myth #5: Paper exams are time-consuming to mark
Myth #6: Paper testing is not environmentally friendly
Myth #7: Paper-based exams are more stressful for students
Myth #8: Paper-based assessments are more error-prone
Conclusion

Myth #1: Paper is inefficient for large-scale assessments

Paper-based exams come with extra administrative burden.

Many awarding bodies suffer from extended lead-in times. They must obtain candidates' names to personalise answer booklets and OMRs before collating and transporting materials to test venues.

Another commonly cited problem is the laborious and inefficient process of script marking: completed answer sheets are shipped from test centres to scanning bureaus for digitisation, and the scanned scripts are later marked electronically. Inevitably, the logistical challenges of transporting physical scripts contribute to lengthy lead-out times.

Extended lead-in and lead-out times can reduce market share for awarding organisations.

Reality

Despite these challenges, paper based tests have been successfully administered on a large scale for decades with candidates filling in forms and exam officers collecting completed scripts and arranging transportation for marking. However, a more progressive approach to script capture, processing, and marking could increase efficiency.

With distributed scanning, test centres can download personalised or blank test papers from a web portal and arrange printing on laser printers or with a local commercial printer. This approach reduces the lead-in time to a few days and enables centres to accept last-minute candidates.

At the end of an exam, the answer sheet is scanned at the exam centre rather than transported for bureau scanning or manual marking. This means marking can commence sooner – with exam results issued in days rather than weeks or months!

Alternatively, OMR technology can, for example, read candidate responses to multiple choice questions (MCQs) and export a value for each question (i.e. A, B, C, D). If pre-loaded with mark schemes, OMR can mark each question and export a final score (rather than raw responses), leading to the release of results in only a few days.
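In essence, once OMR has captured each candidate's responses, marking an MCQ paper reduces to comparing those responses against a pre-loaded mark scheme and totalling the score. The sketch below illustrates that final step only; the field names and mark scheme are hypothetical and do not reflect any particular OMR system's export format.

```python
# Illustrative sketch of MCQ marking: compare OMR-captured responses
# against a pre-loaded answer key and total the score.
# Question labels and data shapes are hypothetical.

def mark_mcq(responses, answer_key, marks_per_question=1):
    """Score one candidate's exported responses (e.g. {"Q1": "A", ...})
    against the answer key and return a final mark."""
    score = 0
    for question, correct in answer_key.items():
        # Unanswered questions score zero; .get() returns None for them.
        if responses.get(question) == correct:
            score += marks_per_question
    return score

answer_key = {"Q1": "B", "Q2": "D", "Q3": "A"}
candidate = {"Q1": "B", "Q2": "C", "Q3": "A"}  # Q2 incorrect
print(mark_mcq(candidate, answer_key))  # → 2
```

Exporting a final score rather than raw A/B/C/D values means the awarding system receives results that are ready to release, which is what enables turnaround in days rather than weeks.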

Myth #2: Paper exams are not secure

Critics of pen-and-paper testing argue that digital exams are more secure due to data encryption and automated proctoring.

Undoubtedly, there is a risk in transporting exam papers to and from test venues. For instance, an AQA A-level chemistry paper was leaked after being stolen from a Parcelforce van in 2022. Meanwhile, the Singapore Examinations and Assessment Board (SEAB) revealed that a parcel containing 238 A-level chemistry answer scripts was stolen in 2018 from a courier van transporting it to an examiner in the UK for marking. The scripts have not been recovered.

Rare occurrences like these have led to the misconception that paper-based exams are not secure and more prone to leaks.

Reality

Physical exam papers can be securely transported, stored, and distributed under strict conditions.

However, one of the main benefits of distributed scanning is that answer sheets are scanned at the test venue. This provides awarding bodies with an additional layer of security: because no physical scripts are shipped back, the risk of lost or stolen papers is removed. Instead, scanned scripts are uploaded to third-party on-screen marking (OSM) systems for electronic marking.

Additionally, awarding bodies and exam boards can log on to a web interface to view upcoming sessions and monitor scanning progress for any session or centre. With physical script return, it is only when the scripts are late that a problem is apparent.

Myth #3: Paper tests are prone to cheating

One of the common myths surrounding paper exams is that they are prone to academic misconduct. Examples of academic cheating in an examination include scribbling notes on your hand or using a smartphone.

Reality

Paper exams minimise the risk of academic misconduct because students sit them in tightly controlled examination rooms, with in-person invigilation and managed seating arrangements.

Conversely, online exams present unique challenges to academic integrity. The lack of human invigilation, uncontrolled environments, and the availability of technology provide students with an opportunity to cheat and engage in academic dishonesty. For instance, Dyer et al. (2020) reported that cheating was found in 62% of online exams.

During the pandemic, assessment methods shifted from paper to online delivery as academic institutions moved away from face-to-face, invigilated examinations. However, universities are increasingly citing academic integrity as a factor in their decision to move back to paper assessments for high-stakes exams.

Myth #4: Paper assessments disadvantage students

Some argue that traditional paper-based exams disadvantage students, with AQA saying digital exams are "truer to the digital world" students grow up and work in.

Reality

While digital exams reflect the modern workplace, they risk leading to a decline in basic skills. Conversely, a well-designed exam paper can challenge critical thinking and creativity while supporting students with concise language, accessible formats, and clear questions. Sweden, for instance, has reversed a decision to make digital devices mandatory in preschools, emphasising printed books, quiet reading time, and handwriting practice after the Progress in International Reading Literacy Study (PIRLS) highlighted the declining reading levels of Swedish children between 2016 and 2021.

At the same time, there is growing evidence to suggest that handwriting is better than typing when learning to read and write. In one study published in Frontiers in Psychology, students showed increased connectivity across the brain when asked to handwrite words, compared with when they typed those words.

There is also evidence to suggest that paper assessments can motivate students to study and prepare more diligently, leading to better learning outcomes. Paper testing can also improve memory recall and information retention while most people are accustomed to using pen and paper, making it a comfortable and familiar format.

Myth #5: Paper exams are time-consuming to mark

Another common myth is that paper-based exams take significantly longer to mark than digital exams. There is some basis to this when awarding bodies transport physical answer sheets for scanning and marking: returning completed scripts from some parts of the world commonly takes several weeks.

Reality

With distributed scanning, awarding bodies can accelerate the marking cycle and publish exam results faster. MCQs and short-written responses are automatically marked and exported to awarding systems or standard file formats.

At the same time, you can export response areas for essay-style questions to OSM systems for electronic marking.

Myth #6: Paper testing is not environmentally friendly

There is no denying paper has an environmental footprint.

Awarding organisations routinely print millions of question papers and answer sheets and ship them to/from test centres. This approach has a significant carbon footprint, contributes to environmental concerns, and is one of the driving factors behind the move to digital exams.

For instance, Ofqual has revealed that it takes 5.6kg of CO2 to prepare, print, sit, and mark one English Language GCSE exam. Similarly, an AQA report found an estimated 9% reduction in carbon emitted per exam (measured in kg of CO2 per exam event) by moving from paper to digital.

Reality

Automating paper-based exams is a more sustainable approach.

The ability to print directly at the centres and scan completed answer sheets immediately after the exam reduces carbon emissions as the need to transport exam materials to/from venues is removed.

Myth #7: Paper-based exams are more stressful for students

A common argument against paper-based exams is that they are more stressful than online assessments for students.

Reality

It is likely that anyone preparing for and taking exams will experience some stress and anxiety.

With paper testing, students sit the exam in silence; for candidates distracted by notifications or the sound of typing, a paper-based exam is likely to be preferable to an on-screen assessment.

Paper exams can be adapted for students with special educational needs (SEN), for example with large fonts, alternative formats, or coloured paper. This can help to create a more inclusive environment and reduce anxiety.

Another benefit is that the risk of screen fatigue from prolonged exposure to a computer is eliminated.

These factors combined can reduce exam anxiety and improve mental well-being.

Myth #8: Paper-based assessments are more error-prone

Many supporters of online examinations believe that paper-based exams are more likely to contain errors such as missing or incorrect papers, lost answer sheets, or marking mistakes, leading to extra administrative burden on exam officers.

Reality

While human error cannot be ruled out, stringent quality control measures mitigate the risks: test papers are reviewed before printing, and completed answer scripts are securely delivered and collected. Automated grading of MCQs and distributed scanning reduce these risks further.

Furthermore, online exams are susceptible to technical errors. In 2023, the University of Oxford had to offer students a supplementary test after online exams for the Geography Admissions Test (GAT) and English Literature Admissions Test (ELAT) contained technical errors in the test rubric. The ELAT, for example, consists of six themed passages of text. Students reported sitting in exam conditions for up to an hour waiting for the passages to load, while the test asked them to consider passages from the previous year's paper.

Conclusion

Digital exams are suitable for low-stakes testing such as quizzes and offer educators the potential to improve the candidate experience in subjects like computer science.

However, in our view, paper will remain the norm for high-stakes exams for the foreseeable future.

Automated marking and distributed scanning give paper-based exams many of the advantages claimed for online assessment: maintaining academic integrity, accelerating the marking cycle, and delivering fairer assessments for all.

Crucially, this approach is perfect for organisations that do not currently have the necessary infrastructure to support computer-based assessment for large cohorts.

Are you considering automating the marking of paper-based exams? Call us on 03300 100 000 or complete this form to discuss your requirements.