AI Can Write. It Can't Listen.
I was on a call last week with a potential client. A director of content at a mid-size tech company. She was frustrated, and she wanted to talk about AI.
"We've been using AI tools for six months," she said. "Output is up. Way up. We're publishing three times more than we used to." She paused. "And engagement is down across the board."
I asked her what she thought was happening. She sighed the sigh of someone who already knows the answer but doesn't want to say it. "I think we're publishing more, but saying less."
That, right there, is the brand voice problem with AI. Not that it writes badly. It writes fine. Sometimes it writes well. But writing and communicating aren't the same thing, the way that talking and listening aren't the same thing. AI can talk all day. It can't listen at all.
Brand voice is an act of listening
This is the thing I keep trying to explain in meetings where someone's waving an AI-generated content calendar and asking why we still need writers. Brand voice isn't a style guide. It's not a list of words you use and words you don't. It's the residue of deep, sustained listening.
When Mailchimp's voice sounds friendly and a little weird, that's not because someone wrote "be friendly and weird" in a doc. It's because real humans spent years listening to their users, understanding what made them nervous about email marketing, and developing a voice that felt like a reassuring friend who also happened to know a lot about open rates.
That voice emerged from thousands of conversations, support tickets, user research sessions, and casual observations. It's an expression of understanding. And understanding requires listening. Real listening. The kind where you hear what people aren't saying as much as what they are.
Brand voice isn't about how you sound. It's about how well you've listened. AI can replicate the sound without any of the listening.
The mimicry trap
Here's what happens when companies use AI for brand voice content. They feed it their style guide. They give it examples. They say "write in our voice." And the AI produces something that looks right. The words match. The tone feels close. The structure is on-brand.
But there's something off. Like a cover band playing your favorite song. The notes are right, but the feeling isn't. Because the feeling comes from meaning, and meaning comes from intention, and intention comes from a human who has actually absorbed the brand's relationship with its audience.
I worked with a team that used AI to write their customer success stories. On paper, they were fine. Good quotes, clear structure, proper brand voice. But their customers started declining to participate. One customer said, diplomatically: "The last one you did about us felt like it could have been about anyone."
That's the mimicry trap. The content looked right but felt generic, because it was generated from patterns, not from understanding.
What listening means in practice
When I develop brand voice for a company, here's what I actually do. None of this can be automated.
I sit in on sales calls. Not to write about them. To hear how customers talk. What words do they use? What makes them laugh? What makes them hesitate? What questions do they ask that reveal what they're really worried about?
I read support tickets. Not for data. For language. The way a frustrated customer describes a problem tells you more about your brand's emotional territory than any competitive analysis.
I talk to the people inside the company who are closest to customers. Support reps, account managers, salespeople. The people who hear the real stuff. The unfiltered version.
And from all of that listening, a voice starts to emerge. Not a voice I invented. A voice that was already there, living in the gap between the company and its customers, waiting to be articulated.
AI can't do any of that. It can process text. It can't sit in a room and notice that a customer's voice changes when they talk about their biggest challenge. It can't pick up on the fact that your users all make the same joke about onboarding. It can't feel the emotional temperature of a conversation.
Where AI helps with voice
I'm not anti-AI. I want to be clear about that because I know how these conversations go. You say one critical thing and suddenly you're a Luddite who probably still uses a fax machine. (I don't. I did until 2019, but that's a different story.)
AI is useful for brand voice in specific, limited ways:
- Consistency checks. Once a voice is established by humans, AI can help flag when something drifts off-brand. That's pattern matching, and AI is good at pattern matching. (A rough sketch of what that can look like follows this list.)
- Scale. If you need to adapt a piece for different channels while keeping the core voice intact, AI can help with the mechanical parts of that adaptation.
- First drafts of low-stakes content. Social captions, alt text, meta descriptions. Content where the voice needs to be present but the depth of human connection isn't critical.
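To make the pattern-matching point concrete, here's a rough sketch of what a consistency check might look like: embed a corpus of human-approved, on-brand copy, embed the new draft, and flag drafts that land far from the approved cluster. Everything here is illustrative, the sentence-transformers model, the `flag_voice_drift` helper, and especially the threshold, which a real team would tune against drafts humans have already judged.

```python
# A rough sketch of a voice-drift check, not a production tool.
# Assumes the sentence-transformers library; the model name, the
# flag_voice_drift helper, and the 0.45 threshold are all illustrative.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")

def flag_voice_drift(draft: str, approved_corpus: list[str],
                     threshold: float = 0.45) -> bool:
    """Return True if the draft reads unlike the approved on-brand samples."""
    # One embedding per human-approved sample, then a rough "center"
    # of the brand voice.
    corpus_embeddings = model.encode(approved_corpus)
    voice_centroid = corpus_embeddings.mean(axis=0)
    draft_embedding = model.encode(draft)

    # Cosine similarity between the draft and the voice centroid.
    similarity = np.dot(draft_embedding, voice_centroid) / (
        np.linalg.norm(draft_embedding) * np.linalg.norm(voice_centroid)
    )
    return similarity < threshold  # far from the centroid = likely off-brand
```

Note what this catches and what it doesn't. It can tell you a draft sounds statistically unlike your approved copy. It cannot tell you whether the draft understands anyone. That's the whole argument of this piece in twenty lines.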
But the core voice work, the listening, the understanding, the articulation of who we are and who we're talking to, that's human work. And trying to automate it isn't efficient. It's just faster at getting to mediocre.
Efficiency without understanding is just faster noise.
The question to ask
If your company is using AI for content, here's the question I'd ask: is our content still a conversation, or has it become a broadcast?
Conversations require listening. Broadcasts don't. And the shift from conversation to broadcast usually happens so gradually that nobody notices until engagement is down and the content director is on a call with me, sighing.
Keep listening. Keep putting humans in the room where the understanding happens. Use AI for the parts that don't require a heartbeat. But never, ever outsource the listening.
That's the part that makes voice a voice instead of just a sound.
