New facial recognition AI classroom management tool prompts concerns

Numerous legal, ethical, and civil liberties questions raised

A new AI-infused classroom management tool with facial recognition capabilities has garnered attention recently with promises to take attendance, assess the emotional states of students, and monitor classes for distraction, boredom and confusion.

But the technology and similar developments have raised numerous legal, ethical, and civil liberties questions, echoing those surrounding campus safety surveillance programs and the test proctoring software widely adopted during the COVID pandemic.

The AI-infused classroom management tool comes from a team at Guilford College led by Chafic Bou-Saba, associate professor of computing technology and information systems.

According to articles covering his new program, Bou-Saba generally characterizes the classroom management tool as a boon to educators who are reluctant or unable to manage their students more directly, one that can simultaneously improve the academic experience of the students in their classes.

“When you’re in a classroom in (real time) it’s not easy picking up on every student and understanding if they are getting the concepts,” Bou-Saba is quoted as saying in an October news story published on Guilford’s website.

“We want to see if there’s a way to track students’ (facial) responses with how they are learning in class.”

In classrooms equipped with the program, the tool “will document student behavior, if needed, by taking five- to 10-second videos,” Bou-Saba later told Inside Higher Ed. From there, instructors can initiate conversations with students – or reprimand them – for not looking in the proper direction or for interacting with other students when they shouldn’t.
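
The article does not detail how Bou-Saba's system is implemented, but face-recognition attendance in general works by comparing faces detected in a classroom frame against enrolled reference photos. Below is a minimal sketch using the open-source face_recognition Python library; the file names, roster, and match tolerance are illustrative assumptions, not details of the Guilford system.

    # A minimal, hypothetical sketch of face-recognition attendance.
    # This is NOT Bou-Saba's system; file names and tolerance are assumptions.
    import face_recognition

    # One reference photo per enrolled student (hypothetical files).
    roster = {
        "student_a": face_recognition.face_encodings(
            face_recognition.load_image_file("student_a.jpg"))[0],
        "student_b": face_recognition.face_encodings(
            face_recognition.load_image_file("student_b.jpg"))[0],
    }

    # Encode every face detected in a single classroom frame.
    frame = face_recognition.load_image_file("classroom_frame.jpg")
    detected = face_recognition.face_encodings(frame)

    # Mark a student present if any detected face matches their reference.
    present = set()
    names = list(roster)
    references = [roster[name] for name in names]
    for encoding in detected:
        matches = face_recognition.compare_faces(references, encoding, tolerance=0.6)
        present.update(name for name, hit in zip(names, matches) if hit)

    print("Present:", sorted(present))
    print("Absent:", sorted(set(roster) - present))

The tolerance parameter is the kind of design choice critics point to: any fixed threshold trades false matches against missed ones, and, as the criticism below notes, those error rates are not uniform across students.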

Among those with concerns about the emerging technology is Chad Marlow, senior policy counsel for the ACLU and principal author of a detailed report on what he refers to as the “EdTech Surveillance Industry.”

He told The College Fix that AI classroom management tools using facial recognition can be problematic for those who differ from “the average” student.

“For example, persons with disabilities, including neurodivergent persons, may act or appear different from what is deemed typical, but that does not mean they are bored, confused, or distracted,” he said via email.

“In fact, a student with ADHD is likely to appear distracted [at] times because they may be fidgeting or looking around, but such a conclusion would be wrong.”

He said cultural differences could also present challenges for AI, because “students from cultural backgrounds different from where the program is developed may be flagged based on misinterpretations of their behaviors or looks.”

He also pointed out facial recognition programs have been shown to be less accurate in identifying the faces of persons of color, females, and young people, which could negatively affect AI-based attendance tracking.
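
Disparities like the ones Marlow cites are typically quantified by computing error rates separately for each demographic group, the approach used in NIST's demographic evaluations of face recognition. A hedged sketch of that bookkeeping follows; the groups and match records are invented purely for illustration.

    # Sketch of per-group error-rate bookkeeping for a face matcher.
    # The records below are invented for illustration, not real benchmark data.
    from collections import defaultdict

    # Each record: (demographic_group, matcher_said_match, truly_same_person)
    records = [
        ("group_a", True, True), ("group_a", True, True), ("group_a", False, True),
        ("group_b", True, True), ("group_b", False, True), ("group_b", False, True),
    ]

    genuine = defaultdict(int)  # comparisons of the same person
    misses = defaultdict(int)   # false non-matches: same person, matcher said no
    for group, said_match, same_person in records:
        if same_person:
            genuine[group] += 1
            if not said_match:
                misses[group] += 1

    for group in sorted(genuine):
        print(f"{group}: false non-match rate = {misses[group] / genuine[group]:.0%}")

In an attendance setting, a higher false non-match rate for one group means students in that group are wrongly marked absent more often.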

Furthermore, Marlow questioned how well such programs actually work.

He said it “is very difficult for any person, let alone a computer, to determine how a random person is feeling based on their expression and behavior. For example, some people have a resting face that looks happy and others upset, but that doesn’t mean they are either.”

“[I]t is pretty irresponsible for someone to suggest these programs, especially the ‘affect detection’ ones, work,” Marlow said.

After being contacted by The College Fix in late March via email, Professor Bou-Saba invited The Fix to send him questions regarding his program. The Fix emailed Bou-Saba several questions, including some about the criticisms of programs like his raised by Marlow; however, Bou-Saba had not responded by the time of publication.

Erik Learned-Miller, a computer vision researcher and professor of information and computer sciences at the University of Massachusetts, Amherst, said that he shares some of Marlow’s concerns, noting that many emotions “have a high degree of ambiguity.”

He said individuals with neurological problems may be assessed incorrectly by such tools, but he did not want to discount all uses of classroom management tools that rely on AI and facial recognition.

After putting forth a “general disclaimer” that he did not know the exact details “of how face recognition is integrated into the workflow of teachers in [Bou-Saba’s] research,” Learned-Miller said, “This can make all of the difference.”

“If the tool is used assuming it is 100% reliable and dependable, that is a problem,” Learned-Miller said via email to The Fix. “However, if it is used as a way to alert a teacher to a ‘possible problem’, then that could be helpful.”

“It is helpful for [AI] systems to have ‘confidence levels’ in the output. However, it is notoriously difficult in AI systems in general to produce good measures of confidence,” he wrote, arguing many AI systems will claim to be 99 percent confident, but still make errors 20 percent of the time, “which is clearly a problem.”
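
Learned-Miller's 99-percent-confident-but-wrong-20-percent-of-the-time scenario is what calibration checks measure: within each band of claimed confidence, how often is the model actually correct? A minimal sketch with made-up numbers follows; nothing here is drawn from any real classroom system.

    # Sketch of a calibration check: does claimed confidence match accuracy?
    # The predictions below are made up for illustration only.
    import numpy as np

    confidences = np.array([0.99, 0.99, 0.99, 0.99, 0.99, 0.95, 0.60, 0.55])
    correct = np.array([1, 1, 1, 0, 0, 1, 1, 0])  # 1 = prediction was right

    # Compare claimed confidence with observed accuracy in each bin.
    for lo, hi in [(0.5, 0.9), (0.9, 1.0)]:
        mask = (confidences >= lo) & (confidences <= hi)
        if mask.any():
            print(f"claimed ~{confidences[mask].mean():.0%}, "
                  f"observed {correct[mask].mean():.0%} "
                  f"on {mask.sum()} predictions in [{lo}, {hi}]")

In a well-calibrated system the claimed and observed numbers would be nearly equal in every bin; the gap between them is exactly the mismatch Learned-Miller describes.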

Discussing the use of such programs at the K-12 level, Learned-Miller wrote, “My view is that such a system is best used to alert a teacher that there *might* be an issue that needs attention. After such an alert, ideally, the teacher would assess the student themselves.”

“Of course,” he added, “this is often difficult if a teacher is burdened with too many students.”

Marlow, when asked whether the widespread use of classroom management systems utilizing AI and facial recognition was inevitable, wrote, “Absolutely not, but it will take far better public education about their lack of efficacy and unintended harms of these products. Right now, the conversation is being dominated by the makers of these products (the ‘EdTech Surveillance industry’) who spend millions deceptively marketing them to schools all so they can make a lot of money.”

According to Inside Higher Ed, Bou-Saba is looking to test his program by the end of the semester.

Marlow said students who object to the use of such technologies should work to promote greater awareness of the programs’ shortcomings and lack of proven efficacy, as well as “the negative impact of surveillance on their student experience.”

He said such programs should not be used unless independent, data-driven, and verifiable testing demonstrates they consistently and accurately identify the problems they were allegedly built to address, with no false positives or negatives. He added they should be used only if there are no better alternatives for addressing the problem.

About the Author
College Fix contributor Daniel Nuccio holds master's degrees in both psychology and biology. He is currently pursuing his doctorate in biology at Northern Illinois University, where he is studying the impact of social isolation on host-microbe interactions and learning new coding techniques to integrate into his research.