An academic journal has published an article about the risks and opportunities ChatGPT poses for education. And yes, it was written with the help of ChatGPT.
The launch of ChatGPT has resulted in a constant flow of columns and articles written using the chat programme, often highlighting its dangers. Early this year, for instance, Delta questioned teachers about the use of the artificial intelligence (AI) programme ChatGPT in writing and programming assignments.
Meanwhile, the much-discussed AI application has its first scientific publication to its name.
The authors of ‘Chatting and Cheating: Ensuring Academic Integrity in the Era of ChatGPT’ have demonstrated this. With the exception of the concluding Discussion section, the references and the subheadings, almost all of their article was written by ChatGPT.
Made-up references
ChatGPT made up all sorts of references, which the authors replaced with real references as appropriate. Other than that, however, they did little more than provide prompts (i.e. subjects to write about). ChatGPT even came up with the title.
British newspaper The Guardian says that the reviewers tasked with assessing the quality of an article before it is published were fooled on this occasion. This is not correct, as the article clearly states it was written with the help of ChatGPT.
The article mainly shows what the consequences may be for higher education and academic research. “This is an arms race”, says Professor Debby Cotton, one of the authors. “The technology is improving very fast and it’s going to be difficult for universities to outrun it.”
Equal opportunity
One of the other authors, biologist Reuben Shipway, sees a silver lining. ChatGPT may be conducive to equal opportunity, he suggests on Twitter. It may help non-native English speakers, or students with certain learning disabilities or impairments.
Nature had already reported that ChatGPT was listed as a co-author on four articles. A number of researchers think the programme could serve well as an aid, while some academic journals want to discourage its use.
The programme works on the basis of probabilities. It does not search for the truth, but produces plausible-sounding responses to the questions and assignments it receives from users.
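To make that principle concrete, here is a minimal, hypothetical sketch in Python of next-word sampling. The phrases and probabilities are invented purely for illustration and say nothing about how ChatGPT is actually built; they only show why output is chosen for plausibility rather than checked against facts.

```python
import random

# Toy next-word probabilities. The numbers and phrases are made up for
# illustration only; they are not taken from ChatGPT or any real model.
next_word_probs = {
    ("the", "article"): {"was": 0.6, "claims": 0.3, "explains": 0.1},
    ("article", "was"): {"written": 0.7, "published": 0.3},
    ("article", "claims"): {"that": 1.0},
    ("article", "explains"): {"that": 1.0},
}

def sample_next(context):
    """Pick a next word at random, weighted by its (made-up) probability."""
    candidates = next_word_probs[context]
    words = list(candidates)
    weights = list(candidates.values())
    return random.choices(words, weights=weights, k=1)[0]

# Generate a short continuation word by word: each choice is merely a
# plausible-sounding option, with no check against reality.
output = ["the", "article"]
for _ in range(2):
    output.append(sample_next(tuple(output[-2:])))
print(" ".join(output))
```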
Eighty cases of fraud
In Maastricht, eighty students were caught committing fraud, university magazine Observant reports. The students in question were second-years asked to write program code for the Computer Science Skills course. Of the just over one hundred students, eighty handed in the exact same code, most probably generated by ChatGPT.
Educational institutions are looking for ways to prevent such fraud. Teachers of some courses, for example, are planning to administer handwritten exams or to have students give a spoken explanation of their work.
HOP, Bas Belleman | Translation: Taalcentrum-VU
Do you have a question or comment about this article?
redactie@hogeronderwijspersbureau.nl