What does it mean to make generative AI ‘safe’ for education? Education Services Australia outlines its work in this space.
This question reflects the need to balance AI's transformative potential with the complexities it creates as this emerging technology spreads through classrooms in 2024. Last year, that balance was central to discussions about AI in education, focusing on academic integrity, student privacy, equity, and access for all students.
As technology continues to advance, a surge in AI-enabled educational products is expected. Recognising this, Education Services Australia (ESA) has been proactively collaborating with key educational bodies to address the challenges of generative AI in education. The shared goal of these collaborations is to ensure safe, ethical, and practical use of AI in classrooms.
Making AI safe and fair for education
Navigating the challenges of student privacy and academic integrity in the age of AI is complex, especially given the potential for the technology to be misused. Adding further complexity, access to advanced technological tools varies across demographics, potentially widening the digital divide and counteracting the benefits of AI.
Highlighting these challenges, the ESA-commissioned report, AI in Australian Education Snapshot: Principles, Policy and Practice (August 2023), underscores the need to establish a base level of safety addressing privacy, integrity, and equity as prerequisites for the broader implementation of AI tools in classrooms.
To address this, ESA's 2023-24 National Schools Interoperability Program (NSIP) workplan introduces a new workstream. It will investigate expanding the Safer Technologies for Schools (ST4S) initiative to encompass assessments of AI technologies, building on the success of ST4S in standardising safety evaluations of digital products in Australian and New Zealand schools.
The forthcoming workstream is set to develop a Privacy and Information Security Technical Standards Framework, updating the current ST4S initiative with specific privacy and security guidelines for AI-powered educational technology. Additionally, Human Rights and Wellbeing Standards will be established to validate developer claims of explainability, non-discrimination, and contestability in AI products.
Creating policy to guide educators
In 2023, a National AI Schools Taskforce, comprising government members of key educational bodies including ESA, was formed to create best practice frameworks guiding educators on using AI in schools. To ensure a comprehensive and inclusive approach, the Taskforce engaged in extensive consultations with a range of stakeholders, including First Nations representatives.
Their efforts culminated in The Australian Framework for Generative Artificial Intelligence in Schools (November 2023), which led to the lifting of the ban on AI products in schools from Term 1, 2024.
Following this, ESA was tasked with creating ‘product expectations’ for AI technologies in education, to ensure safe, fair and effective integration into classrooms. The creation of these expectations will be managed through the aforementioned ST4S workstream, as well as a collaboration with the Australian Education Research Organisation (AERO).
Ensuring AI is practical for schools
Integrating AI into education faces practical challenges in proving its effectiveness and maintaining safety. It's important to understand AI's impact on learning outcomes, teaching methods, and educator workload. ESA's AI in Australian Education Snapshot report addresses these concerns in its medium-term actions, suggesting targeted research to help proactively future-proof the education system.
ESA will collaborate with AERO to research and ascertain the effectiveness of AI tools in enhancing the learning experience, ensuring these technologies are safe and align with educational goals. This includes safeguarding against risks to student data and privacy over time, a key concern given the rapid evolution of AI technology.
Supporting an empowered future for educators
To implement AI in education safely, it's crucial to develop dynamic, ethical policies. As highlighted in the long-term actions of ESA's AI in Australian Education Snapshot report, adaptable strategies are needed to meet the complex challenges of this rapidly evolving field. But beyond this, practical, user-friendly and quality-assured teaching tools must be created.
Education Services Australia offers such tools to help educators adapt amidst the rapid pace of change, including Scootle, a free national online library of curriculum-aligned resources. With thousands of links to online resources covering a wide array of subjects including AI, teachers can find lesson plans, create learning paths for students and curate their own personalised libraries of teaching materials.
Workforce needs will evolve in response to new technologies like AI, as highlighted in the Productivity Commission’s 2023 Interim Report. To support educators in preparing students for the transition to work in a rapidly changing environment, myfuture offers freely available career planning tools and webinars covering industry profiles and emerging jobs.
ESA is committed to working with key educational bodies to provide support to schools that helps ensure the safe, ethical and effective use of AI in classrooms. To keep up with developments, educators can sign up for a monthly education newsletter or check the news and articles section of the ESA website.