English professor: ‘The abilities of chatbots like ChatGPT are impressive, but they aren’t yet advanced enough to serve as a total shortcut for students — they will be soon, though’
A College of Staten Island student recently used ChatGPT on his final exams. He got As on both.
“I used it for my multiple choice finals, two of them, and got a 95 on one of them and the other one, a 100,” said the student in an interview with The College Fix.
“Half the kids in my class used it,” added the student, who asked to remain anonymous.
ChatGPT is OpenAI’s new artificial-intelligence chatbot, and apparently it’s becoming more common for both high school and college students to use it for homework and tests.
“ChatGPT Wrote My AP English Essay—and I Passed,” declared a Dec. 21 headline in The Wall Street Journal.
The College of Staten Island student told The College Fix it’s pretty easy to use.
“All you have to do is copy and paste the multiple choice questions, or take a picture of it so it converts from image to text, and then paste it into ChatGPT, and out of the multiple answers it gives you the right one and explains why,” he said.
He said he did not use it for his final that required an essay, but that his friend at Baruch College “used it for his final paper, and got a near perfect score.”
“All he had to do was add some final citations, scattered randomly throughout the essay,” he said. “Our other mutual friend who goes to Wagner [College] used it on her final 7-page paper and got an A+, a near perfect score.”
Adam Ellwanger, a professor of English and rhetoric at the University of Houston-Downtown, tried out the service by entering an essay prompt similar to one he would give his students.
He used OpenAI’s “text-davinci-003” model, and it “produced the entire essay in about five seconds,” he wrote in a piece for Campus Reform. Ellwanger then “graded” the essay as he would if it had been produced by one of his students, evaluating “the quality of the writing and the strength of the argument.”
“Where the writing itself is concerned, Davinci exhibits a total mastery of English grammar and syntax. There are no sentence-level errors in the entire essay,” Ellwanger wrote.
As for the quality of argumentation, Ellwanger wrote, “Davinci doesn’t advance any arguments of its own – it merely recounts claims that it encountered in its research. What Davinci has really produced is a book report – not an essay that shows some evidence of critical thinking.”
“The abilities of chatbots like ChatGPT are impressive, but they aren’t yet advanced enough to serve as a total shortcut for students. They will be soon, though.”
In an interview with The College Fix, Ellwanger said that after studying ChatGPT for some time, he is confident he could recognize if one of his students used ChatGPT. However, he added, it would be difficult to prove, because he would need to plug in the same prompt the student used and get the same output.
To combat this, Ellwanger said, “professors must anticipate innovative ways to ensure that students do not come to rely on AI for ‘their’ writing.”
Additionally, Ellwanger said other professors need to be aware that their students can use the program to answer multiple choice questions.
“There’s ways to write multiple choice tests to disable [this],” he told The Fix.
Ellwanger stressed that while the program may help students get good grades, it will not actually help them become better writers.
“The only way to become a better writer is to write,” he said. “Ultimately, good writing reflects the soul of the writer … and Davinci doesn’t have one. Depending on AI won’t just deprive you of refining a critical skill in college – it will ensure that ‘your’ writing is soulless and forgettable.”
IMAGE: Alex Millos / Shutterstock