Hundreds of students are using AI to cheat on their assignments – but universities have worked out how to catch them and it’s simpler than you might think

Waves of students are turning to artificial intelligence to write their assignments, but universities are stepping up their methods to catch them.

The University of Sydney revealed 330 assignments had been completed using AI in 2023, and the University of NSW recently said it had similarly discovered a ‘new wave’ of cheaters.

OpenAI’s ChatGPT, which fuelled the AI boom, has become the tool of choice for lazy students, accounting for 60.2 per cent of website visits among AI tools, according to Visual Capitalist.

The tool generates new text based on patterns learned from vast amounts of existing writing on the internet, combining, rephrasing and paraphrasing ideas to answer the questions asked – with varying levels of accuracy and impressiveness.

Marking systems have struggled to keep up with the use of AI in the classroom, but recent developments combined with a crackdown on lax grading by the university watchdog are making it harder for those who would rather have a robot write for them.

Although UNSW did not reveal how many AI-assisted assignments were picked up, its academic misconduct report records a significant increase in violations in 2023, the Sydney Morning Herald reports.

In 2023, universities have discovered a ‘new wave’ of fraudulent assignments using AI

A widespread belief among students was that the use of AI was undetectable, but Deakin University fraud detection expert Professor Phillip Dawson revealed that close reading of altered work proved otherwise.

Professor Dawson said Turnitin, an AI-flagging software tool, is only really good at finding plagiarised work if the student is ‘an idiot’.

“Most research showing good detection rates is based on the assumption that someone is just copying and pasting, and not asking ChatGPT to rephrase or paraphrase,” he said.

Students are increasingly turning to AI tools to help them complete their work

A University of Sydney spokeswoman told the newspaper it was often much easier to detect fraud simply through closer inspection by human markers.

‘If (an assignment) contains derogatory language, is not relevant to the question, has false references or does not answer the question asked, we will investigate and use the Turnitin AI tool as part of this process alongside a number of indicators of misconduct,’ she said.

Turnitin regional vice president James Thorley agreed that the tool was intended as one part of the evidence-gathering process, rather than an end-to-end AI detection solution.

The university sector watchdog, the Tertiary Education Quality and Standards Agency (TEQSA), in June demanded all higher education providers draw up action plans on how they will stamp out AI-assisted cheating.

These plans should include thorough consideration of how each institution will ensure the integrity of its education.

Professor Dawson said that unless students are supervised during an assessment, markers should assume they can turn to AI to complete it for them.

Both he and Mr Thorley agreed that universities must now navigate the tricky question of how much students can use AI before it becomes misconduct.

A University of Sydney spokeswoman said markers read assignments with greater attention to detail to weed out cheaters

Mr Thorley said the universities his company had consulted with were encouraging the use of generative AI ‘in the right framework and guidelines’.

A lecturer in English at the University of Sydney, Associate Professor Huw Griffiths, told the publication he had already integrated ChatGPT into his coursework.

Mr Griffiths said the use of AI allowed students to ‘seize their own agency’ by discovering its limitations compared with traditional research sources.

University of Technology Sydney (UTS) is also taking a similar approach by encouraging staff to discuss AI tools with students.

UTS’s idea behind embracing tools like ChatGPT is that staff ‘invite students to actively participate… and think critically about how they can be used’.
