Generative artificial intelligence (GAI) was a simmering trend in small technology communities before bursting onto the scene earlier this year, when ChatGPT, the latest GAI tool, made headlines for its sophisticated, human-like text outputs, complete with advanced essay-writing and coding capabilities. In this article, Professor George Siemens, of the University of South Australia, discusses the implications of GAI for the future of education.
Humans are a tool-using species. For most of our history we have used tools to overcome our physical limitations. These range from mundane tools, such as a rock used to crack open a coconut, to more complex ones, such as tractors and airplanes.
As a species, we have also created tools that make us more cognitively capable. When the quantity of information increased, humanity created classification schemes, such as those of the Library of Alexandria, and categorisation models to describe nature, such as the Linnaean taxonomy. Storing information has similarly advanced from cave drawings to tablets, scrolls, and now the internet. Throughout this advancement, the focus has always been on humans using tools.
Now, with the current generation of Artificial Intelligence (AI), this is starting to change – tools are starting to use people. Our digital interactions are captured, coded, and used to train and refine increasingly sophisticated AI systems. We are in an almost co-equal relationship with AI, one in which our actions are shaped by it. The implications of this are enormous for our schools and universities.
For most of 2022, a simmering trend grew in small technology communities. Now classified as Generative AI (GAI), these tools can take a limited information input and produce a reasonably sophisticated output. This ranges from generating images from simple prompts to creating essays from an input of only a few sentences.
While these advancements weren’t prominent in the media at the time, as their outputs began to win state-level art competitions they raised concerns about how AI could affect education, particularly the prospect of student essays being generated by AI. By late 2022 the conversation about GAI burst into the public sphere in the form of a new tool called ChatGPT, a fine-tuned model based on GPT-3 – a large language model trained and developed by OpenAI, with extensive funding and support from Microsoft.
The first month of 2023 has seen a stunning sequence of developments. Schools and universities are now confronting the potential of GAI to be used for cheating on essays. Coding that used to take hours can now be done by ChatGPT in seconds. Similarly, a functional essay draft can be created almost immediately. For teachers, ChatGPT could create a lesson plan, a rubric, learning activities, and answer sheets.
The first media wave focussed on defining ChatGPT, evaluating its utility in teaching and learning, and determining its impact on cheating: all with a sprinkling of existential angst. And perhaps rightly so – if AI can write creatively, produce impressive images, write music, pass law exams, complete an MBA, and generate video, then what is left for humans? As a result, schools, universities, and education ministries have responded in different ways, with some opting to ban the tools outright and others actively including them in the curriculum.
RESPONDING TO GAI WITH INTENT AND PURPOSE
We are now entering a second wave in which the response is less reactionary and more thoughtful and intentional. While teachers and academics globally are trying to make sense of GAI, they’re also considering more practical ways to incorporate AI into teaching practices and the content of what is taught. Much like past concerns that the calculator would threaten the quality of mathematics, similar worries now surround ChatGPT. But as we know, the calculator didn’t destroy maths, and it’s useful to keep this in mind when considering how ChatGPT might shape future teaching practices.
For school systems, an emerging challenge is to develop AI literacies.
- What is AI? How do we use it in knowledge practices?
- How do we support its effective use?
- What do we do when AI completes learning tasks better than humans?
Defining and developing these literacies across all of society is the critical first task.
For the first time in history, humanity has a tool that can think with us. GAI uses and shapes us in ways that cause us to question which domains of learning and knowledge remain unique to us, and which have been ceded to a system that can produce and perform at levels that far exceed ours – if not in the content of its output, then certainly in the speed of its production. The implications are still emerging, but it is becoming clear that they will be dramatic and transformative.
ABOUT THE AUTHOR
Professor George Siemens is well known for his research on networks, analytics, and human and artificial cognition in education. He has delivered keynote addresses in more than 35 countries on the influence of technology and media
on education, organisations, and society. His work has been profiled in provincial, national, and international newspapers. He has received numerous awards, including honorary doctorates from Universidad de San Martín de Porres and the University of the Fraser Valley for his pioneering work in learning, technology, and networks. Professor Siemens is a Founding President of the Society for Learning Analytics Research. In 2008, he pioneered massive open online courses (MOOCs).