Discussion about this post

Stephen Fitzpatrick

Great post, Ben. I would add two observations and then conclude with a few questions.

First, with the ability of chatbots to speak and the rise of video avatars, it will not be long before people are predominantly interacting with something that feels more like a flesh-and-blood person. In fact, I'm quite sure that is already possible on platforms like Replika, so the point about chatbots being text-only is already dated. Additionally, one appeal of chatbots, I suspect, is their complete devotion and attention to the user. The recent Economist cover story highlighted this as a huge issue going forward, as future generations' expectations of "relationships" become far more one-sided. This only underscores the need for human connection.

So I'll end with two questions. First, if users experience "empathy" from a chatbot, regardless of whether or not the AI is conscious (something I do not believe, though others are convinced of it), does that count? In other words, does the empathy attach to the person who expresses it or the person who experiences it? If you receive a beautiful e-mail expressing "empathy" for a recent loss and later find out it was simply spam from a chatbot, we recoil - but in the moment you read it, does that count as experiencing "empathy"? The study you refer to suggests this is changing (because many people prefer the AI note, at least initially), and chatbots are capable of remarkably convincing mimicry of human empathy. It's a troubling trend to say the least, especially for teens, but the AI companies seem to be leaning into certain aspects of it. I shudder to think of the stories that will emerge when OpenAI rolls out its "adult" version of ChatGPT next spring.

Anyway, I enjoy your work and maybe we can reconnect after the New Year. I'm still sifting through so much even as the tech continues to rapidly improve.

Alex Bull

I really appreciated this — especially the thick/thin empathy distinction and the clarity around why human contact still matters.

One question it raised for me, coming from nursing and education, is whether AI empathy is less about “real vs simulated” and more about what it can do, and in what contexts. Human nervous systems are often regulated by tone, pacing, mirroring and feeling heard — frequently before we consciously register who or what we’re interacting with — so simulated empathy can still have real effects.

In practice, though, what we work on with students isn’t generating empathy so much as helping them hold it under pressure. That depends heavily on rapport, presence, and subtle, intuitive cues that arise when two people are attending to one another. Current text-based AI seems to fall short here, not just because it lacks lived experience, but because empathy at this level depends on both participants being able to attune and be affected.

I can see AI having value as a scaffold — modelling empathic language, slowing interactions down, helping people practise difficult conversations. But empathy that actually changes us seems to require mutual presence, vulnerability, and the possibility of being altered by the encounter.

Which leaves me with a broader question: even if simulated empathy “works” locally, does it help cultivate a more empathic society, or does it risk displacing the human contact and community that thick empathy seems to depend on? I agree with your conclusion that sustained human contact remains central.

