Provocation Five
Most things a human teacher can do, AI can mimic.
If we cannot identify and prioritize what human teachers bring to the classroom, we risk replacement. We must research the impact of human ingenuity, creativity, and relationships.
Can AI tools really replace human teachers? I don’t think so, but the point of this provocation is to generate conversation around this very sensitive topic. Emerging for-profit programs like the Alpha School model propose replacing teachers with AI tools, and the White House recently held an event suggesting that an AI-powered robot would soon replace human teachers. In response, we are going to poke the proverbial bear, explore some underlying issues, and try to identify the value proposition of human teachers.
When the question of AI replacing human teachers comes up, one of the first responses tends to be that AI can’t tie a kindergartner’s shoes. Well, no. At least not yet. But I would also be highly reluctant to base my argument for highly qualified human teachers on a task that certainly doesn’t require a master’s degree. This example establishes the problem educators face when confronted with the proposition of an AI takeover of the teacher role. It isn’t hyperbolic to state that many things teachers do, AI can mimic. So our task is to identify and celebrate the uniquely human interactions that AI cannot mimic. But tying shoes, like most other physical tasks currently beyond the scope of robotics, is simply not a viable answer, because it does nothing to highlight the training and qualifications of expert teachers.
With physical tasks off the table, most conversations around this topic head toward emotional interactions. AI can’t tell when a student is having a bad day, teachers often propose. Only this isn’t true. AI tools can analyze photos, videos, voice, or even textual interactions to determine the emotional state of the human involved with a very high level of accuracy; in many cases, trained systems reach around 95% accuracy in emotion detection. This paper even posits that the detected emotional state of the student could then be used to modify instructional approaches to adjust for the student’s mood.
So what really matters? What are the uniquely human qualities that teachers can bring to the classroom? One possible answer is the concept of sincerity, or authenticity. As the provocation states, there are many things AI can mimic, but mimicry falls apart when the curtain is pulled back.
Generative AI excels at mimicry. The basis of the technology is using probabilistic calculations to predict the most plausibly human continuation of a stream of text. The massive training sets behind the math ensure that LLMs have almost endless example material to draw from when mimicking nearly any emotion in a response. AI isn’t just good at this; it is super-human. Studies have consistently found that “AI-generated responses were rated as more effective than those produced by trained empathic professionals” and that “AI consistently provides empathic support without showing a decline in empathy quality.” In other words, AI-mimicked responses are perceived as better than those from trained humans, and the AI can keep producing them without getting tired or burning out from emotional overload. Sounds perfect, right?
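For readers curious about the mechanics, the core prediction idea can be illustrated with a toy sketch. This is not how any production LLM works; it is a minimal bigram model over an invented handful of sentences, showing only the basic move of choosing the statistically most likely next word.

```python
from collections import Counter, defaultdict

# Toy illustration of next-token prediction: count which word follows
# each word in a tiny corpus, then continue a prompt by repeatedly
# emitting the most probable successor. Real LLMs do the same kind of
# probabilistic continuation at vastly larger scale, with neural
# networks rather than lookup tables.
corpus = (
    "i am sorry you are having a hard day . "
    "i am here for you . "
    "you are not alone . "
    "i am proud of you ."
).split()

# Tally successors for each token.
successors = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    successors[prev][nxt] += 1

def continue_text(prompt_word, length=5):
    """Greedily extend a one-word prompt with the most probable next words."""
    out = [prompt_word]
    for _ in range(length):
        counts = successors.get(out[-1])
        if not counts:
            break  # no observed continuation for this word
        out.append(counts.most_common(1)[0][0])
    return " ".join(out)

print(continue_text("i"))
```

Even this trivial model produces fluent-sounding sympathy by regurgitating its training data, which is precisely the point: fluency and statistical likelihood, not felt emotion, drive the output.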
The problem is the fleeting nature of insincere, mimicked emotion. The illusion only holds up for so long: as one study found, once users “discover that a seemingly heartfelt message was generated by an AI, they often retrospectively reappraise it as insincere, manipulative, or emotionally vacuous.” A pair of studies reported in this paper reveals a few potentially critical ideas for us to explore in education.
First, messages that came from close sources like friends were more impactful than those from strangers or AI. Second, there is an “authenticity valley,” analogous to the uncanny valley, that appears when AI messages start to sound too real: participants in the studies experienced dissonance when AI mimicked emotions but then failed to deliver on deeper expectations of human connection. Finally, the studies suggest that the medium of communication matters as well; handwritten notes were more impactful than typed messages. All of these findings point to the same answer about the uniquely human value of teachers: they are human.
Another set of studies supports the importance of human teachers simply being human. “Despite using the same stimuli for both groups, participants told they were hearing a human speaker reported markedly higher trust, empathy, and willingness to comply.” These studies also observed the “authenticity valley” seen in the earlier paper, noting that “minor—but perceptible—discrepancies from human speech can evoke discomfort or mistrust.” Again, the knowledge, or even the perception, that a human is behind the speech makes that speech more credible and more important.
A recent paper by Thomas Corbin encapsulates this within the concept of recognition:
“In everyday terms, we could say that recognition means taking another person seriously within a particular context. If you recognise someone as a participant in a seminar you lead, then you hear their question as something that deserves an answer, not as irrelevant background noise. If you recognise someone as your lecturer, then you take their comments as guidance you are bound to consider, even if you later reject them. Without recognition, the same words may still be spoken or written, but they no longer carry the force of reasons for your behaviour – they remain at the level of mere sounds or scribbles.”
Embedded within the idea of recognition are the very human aspects of respect, vulnerability, and care. While these can certainly be mimicked in AI output, they are not authentically present. In an earlier paper, where these ideas are first established, Corbin lays out how this relates to AI. Because of the limitations of authentic recognition, “GenAI feedback cannot fully replicate the pedagogical efficacy of human-provided feedback, particularly in fostering students’ self-esteem and sense of scholarly identity.” Yet there is still value in a human teacher leveraging AI tools to supplement, but not supplant, their feedback efforts. “By offloading certain extra-recognitive tasks to AI systems, educators can invest more fully in building the recognitive relationships that support deep learning.”
By intentionally and cautiously embracing AI tools, human teachers can become more effective and productive. Offloading certain low-risk, low-emotion tasks to AI provides more time for the human teacher to focus on the essential humanity they bring to the classroom. While AI tools are capable of mimicking many of the emotional and relational aspects of teaching, the authenticity valley means that AI will fall short of real impact on students. Mimicry is simply not enough. The real value of a human teacher is their inherent humanity.