The AI Empathy Gap

Priya Chakraborty · April 3, 2025

My daughter came home crying last week because her best friend said something careless at lunch. It wasn't mean, exactly. It was just thoughtless. The kind of thing a ten-year-old says without realizing it would land like a brick on someone else's feelings.

I sat on the edge of her bed and listened. I didn't offer solutions. I didn't say "she probably didn't mean it." I just listened, because sometimes the most empathetic thing you can do is shut up and let someone feel what they're feeling.

I thought about that moment later, when I was reviewing a batch of AI-generated customer emails for a client. They were technically empathetic. They said things like "We understand your frustration" and "We appreciate your patience." They had all the right words. But reading them felt like being hugged by someone wearing oven mitts. The gesture was there. The warmth wasn't.

That's the AI empathy gap. And it's the thing I worry about most as AI takes on more and more of our communication.

Empathy isn't a word. It's a capacity.

When we talk about empathy in content, we usually mean "use empathetic language." Say "we understand." Acknowledge pain points. Show that you care. And AI can do all of that. It can produce text that contains empathetic words arranged in empathetic patterns.

But empathy isn't a pattern. It's a capacity. It's the ability to actually feel what another person is feeling, to let their experience change something inside you, even temporarily. When I sat on my daughter's bed and listened, I wasn't performing a listening pattern. I was remembering every time a friend said something careless to me. I was feeling her hurt because I'd felt similar hurts. I was responding from a place of genuine shared experience.

AI doesn't have shared experience. It has training data. And training data, no matter how vast, is not the same as having been hurt, or lost, or embarrassed, or afraid.

Empathetic language without empathetic capacity is just politeness. And politeness, while nice, is not the same as connection.

Where this matters most

For some content, the empathy gap doesn't matter much. Product descriptions. Feature announcements. Release notes. These don't need deep human empathy. They need clarity and accuracy, and AI is fine at those.

But for the content that sits at the intersection of brand and human emotion, the content about problems, fears, transitions, and hard decisions, the empathy gap is everything.

Customer support emails. Crisis communications. Content about sensitive topics. Content aimed at people who are struggling. Content that says "we see you and we understand" and needs to actually mean it.

I've seen companies roll out AI-generated support responses that technically say all the right things but leave customers feeling worse than before they reached out. Because the customer can tell. They can always tell. When you're upset and someone responds with perfectly structured empathy that has no feeling behind it, it doesn't comfort you. It makes you feel like you're talking to a wall that's been painted to look like a person.

The Salesforce lesson

I work at Salesforce. We think a lot about how AI interacts with customers. And one of the things I've observed is that the most successful AI implementations are the ones that know their limits. They handle the transactional stuff, the data retrieval, the routing, the simple answers, and they escalate to humans for everything that requires emotional intelligence.

The failures happen when companies try to use AI for the whole conversation, including the parts that need a human heart behind them. The technology is not the problem. The overreach is the problem.

It's like using a rice cooker. A rice cooker makes great rice. But if you try to make biryani in a rice cooker, you're going to have a bad time. Not because the rice cooker is bad. Because you're asking it to do something it wasn't designed for. (My mother-in-law would disown me for even suggesting this, and she'd be right.)
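The "know your limits" pattern described above can be sketched in code. This is a hypothetical illustration, not any real product's routing logic: the intent lists and trigger words are invented for the example, and a production system would use a trained classifier rather than keyword matching. The shape of the decision is what matters: AI for transactional requests, human for anything emotionally loaded, and human by default when unsure.

```python
# Hypothetical sketch of an escalation router. The intent and signal
# lists below are illustrative placeholders, not from any real system.

TRANSACTIONAL_INTENTS = {"order status", "password reset", "invoice copy", "store hours"}
DISTRESS_SIGNALS = {"frustrated", "upset", "angry", "disappointed", "grieving", "scared"}

def route(message: str) -> str:
    """Return 'ai' for simple transactional requests, 'human' otherwise."""
    text = message.lower()
    # Escalate immediately if the customer signals distress.
    if any(word in text for word in DISTRESS_SIGNALS):
        return "human"
    # Let AI handle recognized transactional intents.
    if any(intent in text for intent in TRANSACTIONAL_INTENTS):
        return "ai"
    # When in doubt, err on the side of human.
    return "human"
```

Note that the fallback branch encodes the essay's closing advice: ambiguity routes to a person, because the cost of fake empathy is higher than the cost of a slower answer.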

What real empathetic content requires

I've been writing content that requires empathy for most of my career. Health content. Financial stress content. Career transition content. Content for people going through hard things. Here's what I've learned about what it takes:

You have to have been through something. Not the same thing. You don't need to have cancer to write empathetically about cancer. But you need to have suffered. You need to have experienced the universal human things (loss, fear, uncertainty, helplessness) that connect you to someone else's specific experience.

You have to be willing to be uncomfortable. Empathetic writing requires sitting in discomfort. Not rushing to solutions. Not smoothing over the hard parts. Letting the difficulty breathe on the page. AI, by design, smooths things over. It resolves tension. It moves toward positive sentiment. Real empathy sometimes means letting the tension stay.

You have to actually care. This sounds obvious, and it is, but it's worth saying: you can't fake caring. You can fake empathetic language. You can't fake the intention behind it. And readers, especially readers who are in pain, can tell the difference with heartbreaking accuracy.

The most empathetic thing you can do in content is let the reader know they're not alone. AI can say those words. It takes a human to mean them.

The path forward

I don't think we should stop using AI for content. I think we should get much, much clearer about where the empathy line is. On one side of that line: content that needs to be accurate, clear, and helpful. AI is great here. On the other side: content that needs to be felt. Humans are irreplaceable here.

The mistake is treating all content as the same. It's not. Some content is information. Some content is connection. Know which one you're making, and choose your tools accordingly.

And when in doubt, err on the side of human. Because the cost of mediocre information is low. The cost of fake empathy is trust. And trust, once lost, doesn't come back just because your next email says "we understand your frustration" with better sentence structure.