Students at a major Australian university will be encouraged to use AI tools in their research and assessments, in a move academics say could become an industry standard.
Melbourne’s La Trobe University will outline its plans for an “AI-first mindset” among students and faculty members on Wednesday after signing a three-year deal with Microsoft and online security firm CyberCX.
Artificial intelligence experts say the change could present risks in some fields, but that students need to know how to use the technology and that banning it in educational settings has not worked.
The announcement comes after an inquiry into AI adoption in Australia recommended mandatory restrictions on the technology in political and other high-risk settings, and efforts to improve Australians’ AI literacy.
La Trobe developed its Responsible AI Adoption Strategy after recognising the technology had already transformed large parts of society and the economy, vice-chancellor Theo Farrell said, and failing to train students in AI would do them a disservice.
“We need to change our mindset,” Professor Farrell said.
“The problem is not that students may use AI tools to cheat, but rather maybe that the way we assess is no longer appropriate in the age of AI.”
The strategy would apply across the university and encourage students and staff to adopt an “AI-first mindset” in which they deployed the technology in all areas of study.
AI tools could be used to analyse large and complex datasets in fields such as biomedical science and climate science, for example, and in class projects, assessments and student support.
The university would use the federal government’s voluntary AI standard to guide its policy and ensure the technology’s ethical use, Prof Farrell said.
“Universities are committed to producing graduates that have successful careers and also to work with industry on research,” he told AAP.
“If that’s your mission as a university, you have to get serious about AI.”
Generative AI, which rose to prominence after the launch of ChatGPT in late 2022, poses a significant challenge for educational institutions worldwide, University of the Sunshine Coast computer science lecturer Erica Mealy said.
Teaching university students how to use the technology was important for their future work, she said, even though it carried the risk that they would skip fundamental skills such as essay writing and creative computer coding.
“The entire sector is looking at how to do this because just prohibiting it isn’t working and the AI-detection tools aren’t good,” Dr Mealy said.
“AI is a reality of the new world so embracing it isn’t a terrible idea as long as we put in guardrails about checking its accuracy, critically analysing its output and input, and those kinds of things.”
Some academics were likely to resist its addition to curriculums, Dr Mealy said, but she had already seen others embrace the technology and encourage students to fact-check and improve its results.
“My go-to analogy is the calculator,” she said.
“There was a lot of pushback generations ago and now we literally carry them around in our pockets.”
Jennifer Dudley-Nicholson
(Australian Associated Press)