South Australia has been on the front foot in adopting AI-driven technology in the classroom since the rise of ChatGPT. Two leading academics at the University of Adelaide discuss the next growth phase of AI literacy and its role in STEM education.
In July last year, students at several secondary government schools in South Australia began trialling an artificial intelligence (AI) app – the first of its kind in the nation – with the safety of students a key focus.
The custom ChatGPT-style tool – called EdChat – was introduced by South Australia’s Department for Education for an eight-week trial. At the time, South Australia was the only jurisdiction in the country not to have banned AI in schools.
The app, designed in partnership with Microsoft, showed students how to use AI to support their studies, while safeguards were in place to protect them from inappropriate content.
In addition to trialling the app, schools and parents received guidance around the use of AI in education more broadly. This included outlining ways AI can provide educational opportunities through chatbots, image and video generation and intelligent tutoring systems, as well as guidance about managing risks.
In August, in a submission to the Parliament of South Australia’s Select Committee on Artificial Intelligence (AI), the SA Department for Education explained how the trial evolved.
“Since the release of ChatGPT in November 2022, the department has been reviewing ways to support the responsible and effective use of ChatGPT and other related AI capable technologies. It recognises that in order to realise some of the benefits, the access and use of generative AI needs to be made more appropriate for classrooms and students,” it wrote.
“Early in 2023, the department commenced a proof of concept with Microsoft to integrate the ‘Open AI’ platform (the platform currently hosting ChatGPT) into the department’s Microsoft Azure Tenancy (private cloud). This enabled the department to produce its own version of a generative AI chatbot, like ChatGPT.”
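At a high level, the pattern the department describes (a ChatGPT-style chatbot hosted inside an organisation’s own Azure tenancy, with guardrails between the model and students) matches a standard Azure OpenAI deployment. The Python sketch below illustrates that general pattern only; it is not EdChat’s implementation, and the endpoint, deployment name and guardrail prompt are illustrative assumptions.

```python
# Hypothetical sketch of a school-hosted, ChatGPT-style chatbot built on an
# Azure OpenAI deployment, loosely mirroring the architecture the department
# describes: the model runs inside the organisation's own Azure tenancy and a
# guardrail prompt sits between the model and students. The endpoint,
# deployment name and guardrail wording are assumptions, not EdChat details.
import os

from openai import AzureOpenAI  # pip install openai

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # the organisation's private tenancy
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

# System prompt standing in for the classroom guardrails described in the article.
GUARDRAIL_PROMPT = (
    "You are a study assistant for secondary school students. "
    "Help with schoolwork only, explain ideas in age-appropriate language, "
    "and refuse requests for inappropriate or unsafe content."
)


def ask_classroom_bot(question: str) -> str:
    """Send a student's question to the hosted model and return the reply."""
    response = client.chat.completions.create(
        model="gpt-35-turbo-classroom",  # assumed Azure deployment name
        messages=[
            {"role": "system", "content": GUARDRAIL_PROMPT},
            {"role": "user", "content": question},
        ],
        temperature=0.3,
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    print(ask_classroom_bot("Can you explain photosynthesis for a Year 8 science class?"))
```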
In its submission, the department said it recognises that many schools are seeking guidance on the opportunities and challenges associated with generative AI, as well as on practical classroom uses.
“The department will continue to monitor the evolving technology landscape and provide advice and guidance as appropriate, particularly in relation to pedagogy, curriculum and assessment implications, which all jurisdictions and education systems around the world will need to grapple with over time.”
Professor Katrina Falkner and Dr Rebecca Vivian, leading academics from the University of Adelaide, agree South Australia has embraced curiosity and innovation with generative AI.
Falkner, executive dean of the Faculty of Sciences, Engineering and Technology, and Vivian, senior research fellow and project lead in the Computer Science Education Research (CSER) Group, say they have recently seen the broader Australian education sector make moves to proactively consider the safe and responsible adoption of generative AI in schools.
They are referring to the release in December 2023 of the nation’s first Australian Framework for Generative Artificial Intelligence in Schools, implemented in Term 1 2024.
“Representatives from all states and territories have had input into the development of the framework to guide school communities. It provides a framework for guiding not only considerations for teaching and learning but the safe, fair, and ethical use of generative AI,” they say.
A separate government report, Assessment reform for the age of artificial intelligence, published by the Tertiary Education Quality and Standards Agency (TEQSA) in November 2023, addresses how the emergence of generative AI, while creating new possibilities for learning and teaching, has exacerbated existing assessment challenges within higher education.
“TEQSA says there is little value in ignoring AI or implementing a blanket ban, but acknowledges the complexities that generative AI introduces, particularly around assessment and the need for building teacher capability,” Falkner and Vivian say.
“Being aware of AI, knowing where it can help, but also recognising where it may pose problems for students, is an advantage and it is exciting to see Australia embracing how we might collectively approach this as schools go back in 2024.”
Another tool in the toolbox
Falkner and Vivian note that AI is already embedded in many of the technologies educators and their students use today.
“Think about generative AI as being another tool in a teaching and learning toolbox. Educators can take small steps to try AI as a tool to support teaching and ways to introduce it in the classroom,” they say.
“Generative AI has potential to provide efficiencies in learning and teaching, particularly to educators who are often time-poor and working with large classes of diverse student learners.”
AI can help educators with their workload: a range of generative AI tools is available, such as lesson planners, slideshow creators, and chatbots like ChatGPT or Bing Chat, to support lesson planning, assessment and differentiated instruction.
“These tools won’t replace teachers, and they still require educators’ expertise in crafting and using them effectively and in contextualising their use, but they can save teachers valuable time and operate like an assistive tool,” Falkner and Vivian suggest.
“Likewise, leadership and school administration can leverage tools to reduce administration burdens.”
Assessment with generative AI is a challenge, and regardless of whether bans are in place, students will find a workaround if they want to, the academics warn.
“This may be a time to review and consider assessment in your classroom and to have transparent discussions with students about expectations and use of generative AI for learning and assessment,” Falkner and Vivian say.
“Always check in with leadership or jurisdictions about what policies are in place and take care to consider privacy and data policies for the tools you are considering – avoid tools that will collect student and school data. There are a number of resources available to help teachers teach AI in classrooms, like the Digital Technologies Hub and CSER STEM Professional Learning programs.”
Improving AI literacy
Speaking from their combined experience running a national Digital Technologies education program at the University of Adelaide since 2014, Falkner and Vivian say investing in upskilling educators can have the biggest returns.
“They [educators] go on to impact students year after year, and teachers are one of the key influencers of students’ study and career choices,” they say.
“Building our educator workforce’s awareness and capacity beyond the early adopters, and including robust support for remote and disadvantaged communities, is essential to ensure generative AI use is a level playing field for all Australians and students.”
But it goes further than that.
“In addition to using the tools, educators and students need to understand, to some extent, how the underpinning AI technology works, how it is developed and the limitations of the technology. For example, AI algorithms and datasets can introduce biases, and the information generated is not always accurate; both require critical analysis of outputs as a key step in using AI tools,” the academics caution.
“By understanding the data and privacy implications of generative AI tools, educators can also make informed decisions that protect their own privacy and that of their students.”
Several programs across Australia provide resources, support and links between schools and industry, including Early Learning STEM Australia, Education Services Australia’s Digital Technologies Hub, the University of Adelaide’s CSER STEM Professional Learning Program, and CSIRO’s STEM Professionals in Schools program.
Falkner and Vivian suggest further work will be needed to support teachers and effective student learning through evidence-based use of generative AI, including refining contextualised learning and teaching prompts to maximise useful output, and developing evidence-based learning and teaching strategies.
“We’ve seen challenges and complexity around the use of generative AI by students in assessment, and teachers will require training in how to rethink assessment with generative AI.”
Future demand for AI skills
AI has already fundamentally changed our workforce, and the skills the future workforce will need will increasingly rely on AI literacy, according to Falkner and Vivian.
They believe the demand for people with AI skills will only continue to grow.
“STEM skills are critical as building blocks for AI literacy, and a strong STEM workforce enables Australia to be a world leader in developing AI technology-enhanced innovation,” they say.
“Investment in STEM is key to ensuring STEM subjects are a priority in schools and that students, particularly underrepresented students, are choosing them. Research shows that students need to be exposed to STEM early in schooling, as they start to form opinions about STEM subjects in the early primary years. We need to upskill and support our educators to engage students with STEM and AI early in school.”
Students might be dabbling in using generative AI for learning, but this does not mean they know how to use it most effectively.
“Being able to help students make the most of any digital tool is a part of learning. It’s also more than just learning about how to use AI tools. An important aspect is developing students’ critical thinking and analytical skills that can help them to question, challenge and improve the reliability and bias of information presented by generative AI,” Falkner and Vivian point out.
“As TEQSA suggests [in Assessment reform for the age of artificial intelligence], if critical, ethical, and productive engagement with AI is taught and integrated into assessment tasks in meaningful ways then students will regard it as an essential part of their learning, rather than a supplementary component.”
Falkner and Vivian have seen cases at university and in schools where educators allow the use of generative AI as part of the learning process and are transparent about its use in assessment tasks.
“It all comes down to trust and transparency. By promoting core digital literacy and AI awareness from an early age, including an understanding of basic AI concepts, tool usage, and an awareness of ethical considerations, the Australian education sector can help to shape the development and deployment of AI in a responsible manner across society,” they say.
It’s here to stay
“Generative AI is here, and we all need to learn to live and work with AI,” Falkner and Vivian say, but it won’t – can’t – replace the teacher in the classroom.
In their view, teachers are experts in teaching and subject matter and still play an irreplaceable role in students’ education.
“The narrative around AI for education needs to be one of empowerment for teachers, while acknowledging the valuable role they play in education outcomes and the wellbeing of our students,” they say.
“TEQSA’s advice is to incorporate these new technologies in a thoughtful and evidence-informed manner. The Australian education sector needs to ensure generative AI technology and training is accessible to all Australian educators and their students, so they are building their capability in these critical skills, and the ability to critically analyse outputs.”
Falkner and Vivian caution schools against rushing to purchase generative AI software, as tools need to be selected carefully.
“We will see more and more generative AI being built into existing tools schools already use. Generative AI tools need to be appropriate and sensitive to our diverse Australian populations, including the cultural rights of First Nations Australians, so it is important that work undertaken to determine adoption in schools includes diverse voices and perspectives.”
Plagiarism and cheating also remain major concerns.
“Currently there is no reliable way to detect cheating with AI, and iterations of generative AI will only get better. AI presents an opportunity to rethink how we approach assessment in schools, and to consider how assessment can also support students to learn and work with AI in their future careers.
“Generative AI education needs to focus not only on the ability to use AI tools but also more broadly the ethics, limitations, biases, and implications of AI – skills that transcend the version of generative AI being used,” Falkner and Vivian say.
In their opinion, the Australian Framework for Generative Artificial Intelligence in Schools will help guide all school communities in harnessing the potential benefits of generative AI for teaching and learning, while navigating the risks.
“As generative AI evolves, education sectors will need to continue to revise, adapt and grow with the technology to ensure policies and practice keep pace with technological advancements and research evidence,” they say.
Updated courses in teaching AI
Addressing inequities in education and access to technology underpins Professor Falkner’s work. She leads the development of the Computer Science Education Research (CSER) Group’s open online courses, which help prepare Australian teachers for the Digital Technologies Curriculum, and heads the national CSER Digital Technologies Education Program, designed to support Australian teachers in teaching Computer Science; to date it has supported more than 45,000 teachers across Australia.
Working alongside her in the CSER Group, Dr Vivian is interested in understanding how people learn – their cognitive processes, interactions, self-efficacy and self-regulated learning skills – as well as investigating how to harness technology to enhance and innovate learning experiences.
“We have been tracking educators’ use of generative AI in schools within education communities, listening and learning about how they are using the technology and their needs, and working with government to support AI in school initiatives,” Falkner and Vivian explain.
“Our CSER STEM Professional Learning program has been supporting educators to upskill in emerging technologies and providing schools access to free digital technologies kits, including AI, for a number of years now with funding from partners like Google and the Australian Government Department of Education.”
In early 2024, Falkner and Vivian will release an updated version of CSER’s courses in Teaching AI in Primary and Secondary classrooms, which previously ran for a few years pre-ChatGPT.
“This update will provide a foundation for educators in understanding how AI works and how they can teach students about AI. Our evaluation of the previous course found that by completing the course, educators broadened and deepened their understanding of AI and it gave them the confidence to teach AI in classrooms using age-appropriate activities,” they say.
“Our team are always exploring evidence-based ways of teaching and learning with technologies, and this is only the start of understanding how we can maximise generative AI in schools. Currently there is little research on the benefits of students and teachers using generative AI and we are open to partnering with industry and school sectors to explore technology-enhanced learning.”