Safely navigating AI adoption in schools

23 September 2025

Why we believe caution must come before acceleration.

In 2024, Australia’s education ministers endorsed a national framework for using generative AI in schools, reversing the blanket bans of the year before. The argument is simple: if students are already using AI at home (and at private schools), public schools can’t be left behind.

At Concord, we agree that ignoring AI isn’t an option. But nor is rushing it into classrooms without the right protections. In education, every new tool must be judged by both the benefits it might bring and the harm it could do.

What’s keeping teachers up at night

Conversations with our partner schools echo what research and inquiries have highlighted: teachers are deeply uneasy.

With the World Economic Forum ranking misinformation as the world’s top global risk, teachers know the classroom cannot afford to be another site where the line between truth and falsehood gets blurred.

Why guardrails must come first

The new framework is a step in the right direction, but policy statements alone won’t keep students safe. We need practical safeguards, including:

  • Implementing clear rules on when AI can and cannot be used in assessment – as well as transparency when it is used.
  • Investing in training for teachers, so they can use AI as a teaching aid and guide students on its ethical use (rather than play catch-up).
  • Allocating curriculum time for digital literacy, so students learn to cross-check and question AI outputs rather than accept them at face value.

South Australia’s decision to roll out EdChat, a ChatGPT-style tool, in all high schools from next term marks a clear milestone. After a trial with 10,000 students, the government and Microsoft claim the app includes strong privacy protections, content moderation and mental-health flags.

But even in that controlled environment, concerns surfaced: students mistaking the app for a peer, or treating it as a constant assistant rather than learning to do the work themselves.

This is exactly why Concord argues that guardrails must precede mass adoption, and why safe alternatives and human-led curation are essential.

What a safer path looks like

We believe digital content curation has a vital role to play. Platforms like LibPaths offer what AI cannot: a curated, human-led research environment.

Instead of sending students into the unfiltered internet or an unpredictable chatbot, LibPaths creates a safe and structured research environment. Teachers and librarians can curate multimedia, articles and readings into portals that model critical thinking and reliable sourcing.

It’s fast and engaging for students, while also reducing exposure to misinformation. Just as importantly, it keeps educators in the driver’s seat.

Our view on AI safety in schools

AI will have a place in schools. There is no turning back.

The question is: on whose terms? We believe schools should proceed cautiously, with regulation, professional support and a strong commitment to keeping teachers, not algorithms, at the centre of learning.

With strong guardrails and human-led alternatives in place, students should be able to confidently navigate an AI-enhanced world – not as passive consumers of algorithmic output, but as informed, critical and empowered learners.