Students could greatly benefit from using AI tools like ChatGPT in the classroom, but the technology still poses “high-stakes risks”, a report has found.

A parliamentary inquiry into the use of generative AI in the nation’s education system said the dangers from the tools needed to be properly mitigated in order for them to be used successfully.

The report laid out 25 recommendations, including that the use of AI as a study tool be incorporated into the national curriculum.

Inquiry chair, Labor MP Lisa Chesters, said AI could help boost educational outcomes for students, particularly those from vulnerable backgrounds.

“When the use of generative AI as an education tool is used appropriately, it can provide equitable access to students and educators and teachers alike,” she told federal parliament on Tuesday.

“The uptake of generative AI in the education sector should be a national priority.”

While the report found AI had educational potential, it said safeguards needed to be created for the technology.

It recommended that primary school students have access to AI, but that certain features be restricted.

Ms Chesters said the government would need to work with the eSafety Commissioner to determine what guardrails would need to be in place.

“(There is a) need to protect users, especially children’s data, to ensure that educational providers do not select generative AI products or tools that store students’ data offshore, or sell them to a third party,” she said.

“Generative AI presents an exciting opportunity, yet is high-stakes risk for the Australian education system.”

The inquiry’s chair also said tertiary institutions had been grappling with how to deal with AI.

“AI has broad implications for the design, implementation of assessments, academic and research integrity,” Ms Chesters said.

“The higher education sector is struggling to address the misuse of AI in assessments.”

Andrew Brown
(Australian Associated Press)