This online exam monitoring may well be unconstitutional
The problem of surveillance in what has been called "the constant and expanding classroom" predates Covid-19 and is as serious in elementary schools as in colleges and universities. But unlike younger pupils, college students are adults, clothed in the full protection of the Constitution. And the problem is not going away anytime soon: the global exam-proctoring software market is expected to reach $1.5 billion by 2028, and the United States is its largest user and developer. So although the ruling of course applies only to public universities, the outcome matters.
The plaintiff, a student named Aaron Ogletree, alleged that Cleveland State University violated the Fourth Amendment when, before a test, he was required to allow remote proctoring software to scan his surroundings for "potentially prohibited study aids or notes." According to the complaint, the request arrived by email just before the test in question, and Ogletree had confidential tax documents in view that he had no time to secure. The scan was recorded; a copy was retained by the vendor, and the scan was also visible to his fellow students. This process, he argued, violated his Fourth Amendment right to be free from unreasonable searches and seizures.
The court agreed. In its ruling, it rejected the university's analogies to cases that carve out exceptions for items in plain view from places the public regularly visits. Webcams, the court wrote, "go where people otherwise would not, at least not without a warrant or an invitation."
The school further argued that, in effect, everyone now uses remote proctoring. The judge was unmoved: "the ubiquity of a particular technology or its applications does not directly bear on this analysis." In the court's view, the "very core" of the Fourth Amendment is the right to be free from governmental intrusion into the home; the room scan "took place in the plaintiff's house, in his bedroom, in fact."
One could answer all of this by saying that if Ogletree didn't like the surveillance policies, he should have enrolled elsewhere, or perhaps taken a course that didn't require the scan. But the information needed to make that decision was presented in a way the court described as "opaque." And, of course, the pandemic left little choice in any case.
The court accepted that the school had a legitimate interest in preventing cheating, but found that, on balance, the room scan was so intrusive and unreasonable that it violated the Fourth Amendment.
The result warms my libertarian heart. But even those who might disagree with the court’s constitutional finding have reason to be concerned about software surveillance.
Consider the most obvious issue: apart from all the other things the AI monitors, such as atypical facial movements or speech (woe to many students with disabilities!) or people passing by in the background (woe to students with children at home!), its first task is to make sure that the person taking the test is not an impostor.
It’s harder than it looks.
For example, a long-standing criticism of AI software used for identification is that it often misperceives the gender of transgender people. Why does this matter in practice? If the remote proctoring software identifies the student seated at the keyboard as one gender while the school's records list that student as another (or as nonbinary), the mismatch might lead the software to flag the test for possible cheating, or even to deny access altogether.
The problems go deeper. Commercially available facial recognition products tend to be much better at identifying male subjects than female ones, and white subjects than Black ones. And, as one might suspect, there is a deeply troubling intersection: a much-discussed 2018 study published in Proceedings of Machine Learning Research found that the error rate in identifying "darker-skinned females" was as high as 34.7%.(1) Darker-skinned women might even be prompted by the software to shine more light on their faces to facilitate remote identification.
None of this is new; the biases of facial recognition software have been known for two decades. One could reasonably answer that what is needed are better algorithms. Fair enough. But consider the possibility that as AI becomes more accurate, the ethical questions become more complex.
That's a topic for another day. For now, the Ogletree case is a reminder that the rush to online learning may be making education worse. Certainly it makes testing worse. There are crude and ugly responses, such as requiring students, as a condition of enrollment, to waive their Fourth Amendment rights in the event of future campus closures.
Or perhaps schools should instead reject online surveillance as intrusive and unfair. Put students on their honor. And if the fear is that an epidemic of cheating will break out with no one watching, then the problem is much bigger than what the camera can and cannot see.
(1) To be sure, facial recognition software is more likely to generate false positives than false negatives.
This column does not necessarily reflect the opinion of the Editorial Board or of Bloomberg LP and its owners.
Stephen L. Carter is a Bloomberg Opinion columnist. A law professor at Yale University, he is the author, most recently, of "Invisible: The Forgotten Story of the Black Woman Lawyer Who Took Down America's Most Powerful Mobster."