
ASU’s new AI tool turns faculty lectures into paid courses without permission


Key Takeaways

  • ASU has launched an AI platform, Atomic, that transforms faculty lectures into personalized courses without faculty consent, raising concerns about intellectual property and potential misuse.
  • Faculty members like Philosophy Professor Jeffrey Watson express worries that the AI may misrepresent their teachings by splicing videos without context, impacting their academic integrity and opening them to potential harassment.
  • Experts advocate for transparency and collaboration with faculty in projects involving AI, cautioning that misuse of educational content could discourage instructors from sharing their materials.

Arizona State University just launched a new artificial intelligence platform that turns faculty lectures and other materials into personalized courses, reportedly without the professors’ knowledge or consent. 

The new technology is raising concerns among experts and faculty members about intellectual property and the potential for AI misuse.

ASU’s media team told The College Fix that the current tool, known as Atomic, is not intended to be the final version of the project.

“We will test things, and improve things, and it will evolve along the way. That’s part of the innovation process,” the media team said. 

“The pilot explores how ASU can use existing digital content in new ways to reach learners beyond those enrolled in degree programs,” it said. 

ASU did not respond to questions about prior faculty consultation or whether ASU claims full ownership of faculty-created materials.

The platform’s launch has caused widespread concern on campus, according to The Chronicle of Higher Education.

The pilot version of the tool, which charges users $5 a month, reportedly simplifies complex topics into short, “decontextualized clips” and often contains errors, the outlet reported.

Elisa Kawam, president of the University Senate, told The Chronicle that faculty were not consulted before the platform was rolled out and that they are still learning the full scope of the project.

Philosophy Professor Jeffrey Watson told The College Fix that a fellow faculty member notified him via email that his videos were on ASU Atomic.

Watson said that while he supports anything that makes philosophy more widely accessible, he was concerned when he heard that his videos “were being edited and spliced by AI.”

He told The Fix the AI tool could use his voice and image to communicate ideas that he doesn’t endorse. 

“Philosophy is dialectical,” the professor said. “In these videos, I might briefly argue that what we call the physical world is a massive illusion produced by an evil demon of the utmost power and cunning, intending to deceive us, before then arguing against this position.”

“An AI splicing up my videos without expert review can’t be trusted to communicate this context accurately,” he said. 

He added that this should especially concern professors who teach about controversial social issues.

Further, Watson said the university should not expose faculty to “potential harassment” by using AI to modify their “voice and image without even informing its faculty that they are doing so.”

“Projects like this should involve openness, transparency, and informing and collaborating with faculty, with appropriate expert vetting of materials and appropriate recognition of the public impact of our teaching if it is re-used in this way,” he told The Fix. 

AI experts are raising similar concerns.

Marc Watkins, director of the University of Mississippi’s AI Institute for Teachers, told The Fix via email he is concerned faculty will stop uploading teaching materials for their students if they fear the university will scrape their content for use in AI platforms.

“Text is one thing,” Watkins said, “but what many faculty find most egregious is the scraping and remixing of instructor video content.”

He added that higher education is not prepared for the rapid changes brought by generative AI, as most campuses have had little time to fully process its implications.

Asked about recommendations he would make to universities, Watkins said they should “dedicate the next year to institution-wide conversations about what role, if any, AI should play in the classroom.”

“Doing so will likely highlight the urgent need to adopt principles of transparency, disclosure, and accountability not just regarding student use, but also faculty and administration use of AI tools,” he said.

But not everyone views the platform negatively. Educational Freedom Institute Board Chair Matthew Nielsen told The Fix the tool could greatly benefit students. 

“Students deserve more flexible, personalized options that are not locked into rigid schedules or one-size-fits-all courses. AI tools like Atomic have genuine potential to expand educational freedom by giving learners greater choice and control,” he said. 

Asked about professors’ intellectual property and academic freedom, he said faculty materials are typically considered the property of the university.

However, Nielsen acknowledged that “commoditizing” faculty materials can compromise trust. 

“Professors may become reluctant to record nuanced material or develop distinctive courses if they fear their work will be chopped up and sold without any input. Over time, this damages the trust essential to a healthy university community,” Nielsen said.