Education

Chat not cheat

Technology experts and academics say they are learning to work with artificial intelligence to better prepare students for their industries.

The emergence of AI technology such as ChatGPT prompted the introduction of regulations and strategies to ensure a consistent standard across institutions.

Curtin journalism lecturer Dr Kayt Davies says universities are becoming more open to artificial intelligence in order to prepare students for the use of AI tools in the workforce.

Students are using ChatGPT to assist them with their assignments. Photo: Sarmara D’Monte.

As a journalism unit coordinator, Dr Davies allows some use of AI technology like ChatGPT in assignment briefs, as long as students explain how and why they used it.

“Industry is talking about it, so we need to talk about it in classrooms, we need to prepare students to join those conversations about AI,” she says.

“ChatGPT is good at taking a structure and putting information into that structure, but it became apparent that it has a cadence about the way it writes that is a bit generic.

“If you tell people that they’re not allowed to use it, then they are just going to be sneaky about using it and it’s going to be almost impossible to tell whether they have or not.”

Hear more from Kayt Davies.

University of Western Australia School of Psychological Science associate professor Guy Curtis researches academic integrity and academic misconduct. He says the uptake of generative AI has been very swift, with students making evaluative judgments about how, and in what circumstances, to use it.

“Generative AI is very useful as a tool for doing things like coding and generating text for written tasks, which can include things like assignments, but will also include things like emails, and many other things where the writing needs to be done but may have taken longer without the help of generative AI,” he says.

Associate Professor Curtis says research has shown the amount of AI cheating in assignments has increased.

“Right from the outset AI cheating has been a difficult problem to detect, so I think educators should have as their basic assumption that any assessment that is not supervised could be completed wholly or partly by generative AI, and probably will be by some students,” he says.

“I think one of the things that generative AI is actually helping with is getting academics to put more thought into their assessments.

“Because of the possibility of assessments being completed by generative AI, academics are thinking more in the last year about what they assess, what they really want to know, and how they go about assessing it.”

Curtin students on how they use ChatGPT for their assignments. Video: Sarmara D’Monte.

Curtin law lecturer Tomas Fitzgerald says AI is a tool pushing academics to design more effective assessments.

“I think there’s a really exciting opportunity for AI to cause us to think really hard about how we do assessments in higher education. So what I’ve started to see is serious talk about redesigning assessments to make them ‘AI proof’, which would also make them contract cheating proof,” he says.

“It’s a really good opportunity to have assessments that are much more realistic, that are much more useful to students and test more advanced skills, rather than simply summarising and regurgitating a bunch of information you have retained in this course.”

Hear more from Tomas Fitzgerald on the emergence of AI in higher education.