ChatGPT3 grew up last month. For starters, it began to make sense. Full sentences. Almost error-free. Just good enough to pass muster in the most forgiving of settings, such as education. Its output was certainly as good as an average student summary that could be turned in as an assignment.
It will be used, you know. To churn out assignments. To find meaningful summaries. To answer the questions that teachers set and students are too bored to work through themselves. This is research too, is it not? The student is perfectly positioned to ask the internet a question and use the results to submit an answer. This is what we have been doing for years. Now the internet just does it better. Or at least well enough for the job at hand. Good enough to answer the questions that schools and systems normally ask.
Do we let this play itself out? The teachers set questions. Students do their ‘research’ in seconds and turn in their ChatGPT3 (and 4 and 5 and so on) based assignments. Of course, assessment systems are not devoid of AI either. We have had AI-based assessments for years. Still a bit expensive for widespread use, but there are enough reports saying that we can barely tell the difference between a human assessor and an AI-driven bot. So teachers deploy that in our soon-could-be-very-real scenario. Sounds like a fun game, right? Churn out a syllabus and an assignment schedule. Students use AI to submit assignments; educators use AI to mark them. AI wars, for the marks. Credentialing done, with both students and teachers saved from the drudgery of the actual process of learning.
Credentials are not competence, of course, and this path is inexorably going to go belly up. The lack of competence will show up within a few years. Credentials are already under threat, and this will hollow them out further, rendering them quite useless. Play out the scenario and it becomes clear that AI-driven education must avoid this path. (There are other AI applications that are not being discussed here.) AI has its uses, but if we do not design the pathways towards progress, we will end up in such cul-de-sacs. Our task as educators is now to find ways to constructively build on the gains of technology.
Our first step clearly has to be to recognise the gains of AI-based tech such as ChatGPT3, specifically this version and its likely improvements over the next 2-3 years. If ChatGPT3 has reduced information gathering by students to a single step, then we need to recognise three things. One, student engagement requires a quest: a small struggle and a series of small victories. This has now been denied to them. Two, information itself has been reduced to the foundational level of that quest. And three, human and educator intelligence is still a bit ahead of the machines. The race between smart AI and smart humans has been anticipated for years; it has only just begun.
What does it mean for teachers today? Design smarter questions. This is the simple first step.
Assessments need to change to go beyond basic information gathering, analysis and inference. Indeed, so does teaching. (And formative assessment.) This changes teaching and assessment in ways that educators have always considered essential but have not always found the room to engage with. The constraints of resources, the pressure of performance and the competition funnel have held us back in ever-repeating loops of mediocre learning. The goal of that mediocre learning was to churn out material that the machine can now produce on its own. We need to step up the game.
The game now goes to the heart of how humans and machines will deal with each other. It means building human capacities, competencies and capabilities that go beyond machines. What does human understanding access that machines capable of churning out decent poetry and art cannot? Feelings? Emotions? Values? Do we deploy them for good? At one level these are questions for philosophers; at another, they are questions of survival. And the frontline of survival is the classroom.
For today, and the next year or so, the answer for teachers lies in three things: Emotions, Empathy and Ethics. Three zones where the machine cannot be deployed. Yet. This is where we can build deeper understanding of problems and solutions. Empathy and care are human traits, often noticed only in their absence, as we imitate machines in the quest for power, safety and survival. We do not need to become machines any more; we can partner with them, letting them do the mechanical bits while we find our humanity again. We will have to relearn what it means to bring empathy to our decisions. Decisions as basic as: how do we learn to work in student teams, how do we give peer and student feedback, how do we build products beyond their technical parts, and so on. Ethical choices are essentially choices for human good: do we prioritise humans and humanity?
Immediacy is served by the three Es above: Emotions, Empathy and Ethics. But this is not enough for everyone. How is this practical, they ask? Practically, what changes? Should we now focus only on critical thinking? Sure, and remember, the machine can do it better than you, so keep up! Keep your students sharp. But this alone will not be enough. Creativity, they ask? Yes, of course, but creativity is not dreaming and doodling. Creativity must have both purpose and process, and must lead to an outcome. This is the real challenge of teaching now: to unleash the mind for imagination and creativity (for all!), and to lead it to outcomes that make human lives better. This starts with smarter questions, tasks and assignments.
Task them with improvement, challenge them on their understanding, and support them with multiple methodologies. There are many, many ways to do this. We’ve got this. Just ask.
This, educators, is the growing up of education. Now.