The extent of AI cheating at universities is significant and increasingly going undetected, a survey by The Guardian has found.
Academic misconduct among university students is nothing new. But with the rise of generative AI it is getting easier to do – and easier to get away with too.
A recent survey by The Guardian shows how the rise of the generative AI tool ChatGPT has coincided with a sharp increase in AI-related cheating among university students.
ChatGPT was released for public use in late 2022. The chatbot quickly garnered attention for its detailed responses and articulate answers across many domains of knowledge. By 2023 its use had exploded across education, business and tech sectors.
To gauge the extent of academic integrity violations, The Guardian contacted 155 universities under the Freedom of Information Act, requesting figures for proven cases of academic misconduct, plagiarism and AI misconduct over the last five years.
It found that as AI tools have become more sophisticated and accessible, the nature of cheating has changed, with plagiarism now in a marked decline.
The results showed that plagiarism accounted for nearly two-thirds of all academic misconduct in 2019-20. It fell from 19 cases per 1,000 students in 2022-23 to 15.2 in 2023-24, and is expected to fall again to about 8.5 per 1,000.
On the other hand, cheating using AI tools is on an upward trajectory. In 2023-24 there were 7,000 proven cases of AI cheating – equivalent to 5.1 for every 1,000 students. That was up from 1.6 cases per 1,000 in 2022-23.
It is expected that these figures will increase again this year to about 7.5 proven cases per 1,000 students.
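As a rough check, and assuming the case count and the per-1,000 rate cover the same set of responding universities, the two headline figures are consistent: a rate per 1,000 students is simply proven cases divided by student numbers, multiplied by 1,000, so 7,000 cases at 5.1 per 1,000 implies roughly 7,000 ÷ 5.1 × 1,000 ≈ 1.4 million students across the institutions that supplied data.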
But these figures do not reveal the full picture, as only 131 universities provided data to The Guardian. Even that data was incomplete: not every university held records for each year or for every category of misconduct, particularly AI misuse.
This suggests that AI cheating may be an even bigger problem than many think, especially as the use of AI often flies under the radar of current university AI detection methods.
For instance, a University of Reading study found last year that despite the assessment systems in place, students were able to submit AI-generated work without being detected 94% of the time.
Dr Peter Scarfe, an associate professor of psychology at the University of Reading and co-author of that study, told The Guardian: “I would imagine those caught represent the tip of the iceberg.
“In a situation where you suspect the use of AI, it is near impossible to prove, regardless of the percentage AI that your AI detector says (if you use one).
“It is unfeasible to simply move every single assessment a student takes to in-person. Yet at the same time the sector has to acknowledge that students will be using AI even if asked not to and go undetected.”
The issue is compounded by other online tools that help students bypass common university AI detectors. Videos posted on the social media platform TikTok show tools that “humanise” text generated by ChatGPT so that it reads as though a person wrote it.
Dr Thomas Lancaster, an academic integrity researcher at Imperial College London, told The Guardian: “When used well and by a student who knows how to edit the output, AI misuse is very hard to prove. My hope is that students are still learning through this process.”
For instance, students might use AI to generate ideas and structure for assignments and to suggest references, and then do the work themselves.
He continued: “I think it’s important that we focus on skills that can’t easily be replaced by AI, such as communication skills, people skills and giving students the confidence to engage with emerging technology and to succeed in the workplace.”
IET research conducted last year revealed how few people are aware of the environmental impacts of using ChatGPT. For instance, data centres, which are used to train and operate the models, consume huge amounts of energy – as well as a significant amount of water – to cool the servers and keep systems at optimal temperatures.