ChatGPT: friend or foe? - Education Matters Magazine

When used in the right way, ChatGPT can be an effective learning tool for teachers and students, rather than something to be feared, according to Andrew Smith, CEO of Education Services Australia.

Having spent the summer reading about the latest challenge to creative human intellect, ChatGPT, I decided to experiment with this new wonder tool to see what I could learn about its likely impact on education.

My primary conclusion is that teachers have more reason than ever to be valued for their expertise; after all it is their humanity that sets them apart from the bots being heralded by many as the end of teaching as we know it.

Being from Melbourne, I started my experiment with the most obvious question: “Where can I get the best coffee in Melbourne?”

ChatGPT’s response began with a bland statement that Melbourne is “renowned for its coffee culture” and “home to some of the best coffee shops in the world.”

After further prompting, the AI declared that Seattle, Rome and Copenhagen are likewise “renowned for their coffee culture” and “home to some of the best coffee shops in the world.”

Whilst the coffee aficionado in me might disagree with those statements, clearly ChatGPT is not the most creative intelligence.

This formulaic style is characteristic of ChatGPT, which starts with a bland introduction and concludes with an equally banal summary. In between the standard opening and closing, the response listed the same well-known, highly googled coffee staples that any Melburnian would offer a visitor to our city.

It’s fair to say that nobody is going to discover a new coffee shop or find the best locally roasted beans in Melbourne using ChatGPT.

It is exactly this lack of creativity that means ChatGPT can be a useful tool for educators but will never replace the humanity and creativity of our teaching workforce.

The concept behind ChatGPT is not a new one: it is a type of generative AI, a class of models able to generate content such as text, images or audio.

Generative AI models are trained on large datasets of existing content created by others. ChatGPT is fine-tuned to generate human-like text in natural language: it can write in a conversational style, answer questions, produce summaries and respond to any given prompt, among other natural language processing tasks. It can be used in various applications such as virtual assistants and language translation.

In education, generative AI can be used to create personalised learning materials and interactive content, generate feedback, and better support assessment.

It is exciting to see how quickly teachers are responding to this new tool and integrating it into their classroom practice.

Examples of how teachers are working with generative AI include:

  • Asking students to evaluate ChatGPT’s response to a prompt and then using their knowledge to improve the initial response
  • Using the tool to generate a first draft response and applying their creative thinking skills to refine and improve the draft
  • Developing more sophisticated approaches to framing questions and forming hypotheses that elicit stronger answers from the AI

These and other approaches help promote deeper, more engaged learning while exposing students to the limitations of such tools, a valuable skill in this age of misinformation and personalised (or manipulated) experiences.

In realising these benefits, it is important to fully consider the risks of using generative AI in the classroom and to take steps to mitigate them.

We must remember that AI models can perpetuate biases in the data used to train them, leading to discriminatory outcomes, and that generated content can be low-quality or even harmful if the training data is of poor quality.

The use of AI systems in the classroom also raises important questions about the collection and use of student data, and the security and privacy of that data.

As Leslie Loble pointed out in her recently released research report Shaping AI and Edtech to Tackle Australia’s Learning Divide, there is a substantial risk that the learning divide will widen further as tools using all forms of AI become more widely used and access skews toward educationally advantaged students.

The tremendous promise of AI requires good design, effective use and sound governance to ensure that the wide range of enrichment possibilities on offer are harnessed, especially those that will help best meet the needs of educationally disadvantaged students.

AI will become increasingly pervasive in our work and everyday lives. Our students must be prepared to take full advantage of the possibilities offered by a world profoundly shaped by AI.
