The camera pans across a sterile gymnasium, where dozens of tables are arranged six feet apart. Students, anxiously biting nails and tapping pencils, take their seats. Exams are passed out. Moments later, camera bots descend from the ceiling. Lenses hyper-focus on the test takers, capturing every blink, twitch, and cough. A deep voice reads out: “Big Education is watching you.”
This scene, in all of its Orwellian glory, could be the opening sequence of a (bad) dystopian film. Intense, heart-thumping music would play as students raced to select A, B, C, or D. If students waved to each other, the anthropomorphic camera bots would assemble into an army and swallow them. Spooky!
Of course, descending robots and omnipresent narrators are unlikely to show up at your next organic chemistry exam. Yet this dystopian premise of hyper-surveillance is not as far off as it may seem. For the thousands of students pushed into online education by the COVID-19 pandemic: yes, big education is watching.
When in-person testing became difficult due to the ongoing public health crisis, universities and other educational organizations were left to determine how to hold examinations securely in a fully virtual or hybrid environment. Some professors mandated that students take exams on recorded Zoom calls. Others invested in specially designed testing software. High-stakes exams, like the ACT, turned to blended forms of proctoring technology.
These proctoring systems are creating a digital panopticon that threatens the privacy of a wide range of students, many of whom have been stripped of the choice to participate in the first place—after all, if your standardized tests and degree progression depend on these proctoring tools, you’re rarely in a position to decline to take the exam. While we acknowledge the need for education to adapt to the virtual environment, we should not allow mass biometric surveillance to become the status quo.
In our pre-COVID world, if students were taking an exam and a professor sat in front of them with a video camera, we would rush to declare the action as a violation of those students’ privacy. The same should hold true when an exam records the sounds in your house, detects and tracks your eye movement, or uses facial recognition software to monitor your behavior.
One example of proctoring software is Proctorio, a “learning integrity platform” that, according to its website, is partnered with four hundred universities—including Duke. Proctorio enables “automated proctoring,” “advanced facial detection technology,” and “automated and live ID verification.” Capabilities include “room scanning,” where students are asked to provide a 360-degree view of where they are taking the exam; “screen recording,” where instructors can see what the student sees on their monitor; and “head and eye movement” tracking, where the program flags instances of “abnormally high” or “abnormally low” eye movement.
This recording and analysis of an individual’s personally identifiable physical and behavioral characteristics, known as biometric data, is a gross invasion of their privacy—and for most test-takers, the privacy violations don’t end when the test is over.
That camera shot of you panicking about a logic game on the LSAT? Still hanging out on the virtual proctor company’s servers, maybe even after you graduate from law school.
Other data markers such as your home address, government-issued identification, education records, and device identification numbers are stored there, too. At that point, your private information is in the hands of the virtual proctoring service, and all you can do is hope that the company will be able to fend off hackers targeting your personal data.
If a data breach occurs, your information can be sold to commercial or nefarious actors, and once it’s out there, there’s no guarantee you’ll have any recourse. In a recent data breach of ProctorU, hackers obtained approximately 440,000 records, including users’ full names, universities, addresses, and phone numbers. Yikes!
Given the glaring problems with how virtual proctoring tools collect and store data, students are presented with an impossible choice: surrender your information to these third-party examination platforms or skip the test and suffer the consequences. In the current system, there’s no mechanism for students to opt-out or even to request the correction and deletion of their data after their exam is over. We should not be required to consent to this reality with no other options.
Institutions say they are using exam software to ensure “fairness.” Yet requiring such intensive data-gathering is antithetical to that mission. Proctoring software only creates additional barriers for students who are already facing challenges during remote learning, and it compounds the effects of algorithmic discrimination.
Much of this software operates under the assumption that students are able to find quiet, distraction-free spaces to complete tests—an assumption that is wildly classist during a time in which a diverse array of students must work remotely. Students caring for a screaming child, or those who do not have access to a good webcam, will be more likely to be “flagged” for academic dishonesty. People with medical conditions that trigger certain behaviors may likewise be deemed “abnormal” by the algorithms.
According to a recent report published by the Electronic Frontier Foundation, these surveillance features will create a culture of success that is “determined not by correct answers, but by algorithms that decide whether or not their ‘suspicion’ score is too high.” The “flagging” system adds another layer of stress to examinations that are already nerve-wracking for students.
Additionally, proctoring software requires a strong internet connection. If your computer is already buckling under Zoom and Sakai, the added bandwidth demands of virtual proctoring may be too much for your machine or your network. And unlike in class, turning off your camera is simply not an option. In the case of the California Bar Exam, one failed identity authentication step, even due to a weak connection, will kick you out of the exam.
You could also be bumped from an exam for factors entirely out of your control. These virtual proctoring tools rely on facial recognition technology—of course, how else would they be able to track how many times you blink or move your gaze back and forth over the same question? We’ve seen on Twitter how poorly tested facial recognition can perpetuate algorithmic discrimination, and these new proctoring tools are not immune.
States across the country are using ExamSoft to proctor bar exams—the company offers “A.I.-driven remote proctoring” that fundamentally depends on continuously recording the test taker, to offer a “complete and permanent record of all of the exam taker’s actions from the first moment of the exam through the very end.” The virtual proctoring system will automatically flag potential “abnormalities” in the recordings that test-takers upload.
However, people of color have been unfairly penalized by flaws in the software’s facial recognition algorithm and have faced significant barriers to test-taking. The software failed to detect Alivardi Khan’s face, citing “poor lighting concerns.”
ExamSoft doesn’t seem to think there’s a fundamental flaw in its software, instead suggesting that Khan sit in front of a bright lamp. Kiana Caton, another participant in the remotely proctored bar exam, decided to shine a bright light on her face throughout the two-day process, which gave her headaches.
Because many institutions are using these virtual proctoring tools as a back-up during the COVID-19 pandemic, students may be locked out of their exam, with no option for recourse. There’s no in-person substitute for an exam that’s already been moved online. And if the virtual proctoring software says they’re cheating on the exam, how can they challenge that claim?
These virtual proctors aren’t just a concern for this summer’s bar exams or standardized tests this fall. If we accept and normalize this level of mass biometric surveillance now, we can expect that this technology will remain even when it’s eventually possible for us to meet and take exams in person. At that point, we may even see additional mission creep, as software designed for standardized, proctored tests enters classrooms.
As always, not everyone is going to be concerned that their data is being stored indefinitely. You might not be worried about a third-party company storing hours of video footage documenting every move you make—after all, your data is yours, whether you would like to see it wiped from the web or shared across services.
Forget about your personally-identifiable information and consider this: the system that you might find tolerable will create significant barriers in higher education for people of color, people lacking Internet access, and anyone lacking the pristine conditions that these virtual proctoring services demand. When it comes down to it, defending our privacy requires all of us to stand together: students and universities.
A global pandemic is not an excuse to build systems of mass biometric surveillance. If you’re a professor thinking about using one of these virtual proctoring services, consider giving an open-note, open-book exam where you’re not concerned about cheating and are instead focused on students’ grasp of the material.
If you’re an admissions officer for a graduate or professional school, recognize that standardized testing companies using these virtual proctoring services are coercing students into providing highly identifiable data to a private company, despite mounting cybersecurity and privacy risks. Help organize an in-person examination option for individuals who want to opt out (the SAT has had a few of these), or make your offers of admission conditional on students taking a test once in-person testing is finally possible.
Fundamentally, AI-enabled virtual proctoring requires students to trade away the security of their biometric data in exchange for furthering their education. Higher education needs to be focused on the true concerns of accessibility and continuity of education for students facing barriers in the remote learning environment. If we are, instead, focused on how many times someone blinks when solving an integral, we are no better than a bunch of bots.
Big Ed, please stop watching us.
Jessica Edelson and Niharika Vattikonda are Trinity juniors. Their column, “on tech,” runs on alternate Thursdays.