Australian Universities Seek to Curb ChatGPT Use by Students
How will higher education react to AI text generators?
That didn’t take long. A consortium of the leading universities in Australia is updating policies and changing testing practices to combat the rise of AI-based writing assistants on university campuses. The news arrives courtesy of The Australian newspaper’s The Oz online edition.
“Australia's top universities are updating integrity policies and redesigning exams to account for the risk students will use sophisticated artificial intelligence to cheat…Group of Eight (Go8)… is ‘proactively tackling the emergence of AI’ through redesigning assessments and using new targeted detection strategies, Chief Executive Vicki Thomson said.
“‘Our universities have revised how they will run assessments in 2023,’ she said. This includes ‘Live+’ exams for offshore and online students, in which a supervisor monitors their computer screen throughout, more in-person supervision for local students, and greater use of pen and paper exams and tests.”
The Inevitable Impact on Education
Futurist and generative AI analyst Bakz T. Future predicted that in 2022, OpenAI’s writing assistant GPT-3 would go viral on college campuses. I can find no statistics on how widely these technologies are used. However, I have spoken with Bakz a few times in the past six weeks. He did not know that ChatGPT would arrive so soon, be so capable, or generate a flurry of national news stories.
While there was some risk that his prediction would not come true in 2022, the arrival of ChatGPT just in time for fall finals and semester-ending essays made the viral adoption a near guarantee. The GPT-3 interface was not exactly consumer-friendly, which naturally deterred some would-be users. GPT-3-based writing assistants, such as Jasper AI, are more user-friendly but are not widely known outside business communities. Those companies also were not explicitly marketing to students.
ChatGPT is based on the new GPT-3.5 model and is better at writing, synthesizing ideas, and providing useful details than earlier versions. More importantly, it is easier to use. The interface is similar to a text chat or chatbot. There is no learning curve. Just ask your question or enter your request, and it starts writing a response.
Mike Sharples, emeritus professor of educational technology at The Open University, UK, suggested that the GPT-3.0 Davinci 2 model already produced graduate-level essays as of May 2022. After the GPT-3.5 Davinci 3 model came out, he noted the performance improved significantly.
Given the warnings, it is not surprising that some universities thought it best to begin tackling the situation proactively. The initiatives are not comprehensive, but there seem to be some reasonable steps factored into 2023 planning. It may be more important that they are also being undertaken by a high-profile set of universities.
Group of Eight Combats AI Misuse
Go8 was founded by “Australia’s leading research-intensive universities.” It was created to influence and shape higher education policy. Its members include the University of Melbourne, the Australian National University, the University of Sydney, the University of Queensland, the University of Western Australia, the University of Adelaide, Monash University, and UNSW Sydney.
Though the Go8 members collaborate on many initiatives, the universities appear to be moving forward independently at the moment. Sydney University apparently updated its academic integrity policy in November “to include content generated using artificial intelligence,” according to The Oz. “The University of Queensland also mentions AI as one form of contract cheating in its policy to be implemented next year.”
Is ChatGPT Illegal in Australia?
Australia passed a law in 2020 making it illegal to sell essays or take exams on behalf of university students. Academic cheating services, as they are called, can face fines of up to AUD$100,000 (about US$67,000) and up to two years in jail. Dan Tehan, Minister for Education, said at the time:
“We have made contract cheating a crime by targeting the people who are making money exploiting Australia’s students. The law targets cheating service providers and advertisers — not students. Students caught cheating will continue to face the conduct and disciplinary processes of their individual institutions. Organised cheating threatens the integrity of our universities and undermines the hard work done by honest students.”
This law was complemented by the Tertiary Education Quality and Standards Agency (TEQSA) toolkit to help universities identify cheating students. TEQSA is also charged with administering the law and going after cheating services.
TEQSA has already shut down websites, levied fines, and sought prosecution of academic cheating services. You might wonder whether OpenAI or Jasper AI could face some legal difficulties based on how students may use the tools. So far, it looks like regulators do not view AI-based writing assistants as cheating services.
The Oz reports that Dr. Helen Gniel, TEQSA’s integrity unit director, said, “It’s clear that students could be using artificial intelligence to produce work for them. But whether that constitutes cheating, it’s not something that’s straightforward to say.” She even mentioned that some students will need to learn to use the solutions as they will be commonly employed in their future jobs.
A University of Canterbury student told a student newspaper, “I know there are people in my class paying people to write essays. I think that’s wrong. This is not the same thing …This is using tools that are there. I think it’s okay.” The student likened AI models that write essays to Grammarly, which helps improve writing but is not against university rules, and said:
“I have the knowledge, I have the lived experience, I’m a good student, I go to all the tutorials and I go to all the lectures and I read everything we have to read but I kind of felt I was being penalised because I don’t write eloquently and I didn’t feel that was right…I looked through the [UC] rules and it says you can’t get somebody else to [do the assessment]. Well it’s not somebody, it’s AI.”
Will AI Reshape Education?
Sharples, the emeritus professor of educational technology, commented:
“Students will employ AI to write assignments. Teachers will use AI to assess them. Nobody learns, nobody gains. If ever there were a time to rethink assessment, it’s now. Instead of educators trying to outwit AI Transformers, let’s harness them for learning…If transformer AI systems have a lasting influence on education, maybe that will come from educators and policy makers having to rethink how to assess students.”
Transformational technologies inevitably change behaviors and expectations. They also require new policies and solutions to adapt. Plagiarism detectors will not work. Tools like Turnitin solved an older and different problem. There are new solutions under consideration, such as watermarking generative AI outputs and even a company claiming to have achieved 95% accuracy in identifying AI-written content.
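Watermarking schemes of the kind mentioned above typically work by biasing the generator toward a pseudo-random “green list” of vocabulary tokens seeded by the preceding token; a detector that knows the seeding scheme can then check whether a suspiciously high share of tokens fall on their green lists. The sketch below is a toy illustration of that general idea only — it is not OpenAI’s actual scheme, and the function names and vocabulary are invented for illustration.

```python
import hashlib
import random

def green_list(prev_token: str, vocab: list[str], fraction: float = 0.5) -> set[str]:
    """Deterministically partition the vocabulary, seeded by a hash of the previous token."""
    seed = int(hashlib.sha256(prev_token.encode()).hexdigest(), 16)
    rng = random.Random(seed)
    shuffled = vocab[:]  # copy so the caller's vocabulary list is untouched
    rng.shuffle(shuffled)
    return set(shuffled[: int(len(shuffled) * fraction)])

def green_fraction(tokens: list[str], vocab: list[str]) -> float:
    """Share of tokens that land on the green list seeded by their predecessor.

    Unwatermarked text should hover near the green-list fraction (0.5 here);
    watermarked text, whose generator favored green tokens, scores much higher.
    """
    hits = sum(1 for prev, tok in zip(tokens, tokens[1:]) if tok in green_list(prev, vocab))
    return hits / max(len(tokens) - 1, 1)
```

A detector would flag text whose green fraction is statistically far above the baseline. The obvious weakness, as the paragraph above notes, is that the scheme only works if the model provider implements it and keeps the seeding secret recoverable by auditors.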
However, the likely outcome will be non-compliance by some AI providers and a perpetual game of whack-a-mole in which generative models outsmart the auditing tools. Institutions like higher education loathe changing their well-worn practices. AI is likely to give them no choice. And that may be a good thing.