It started so innocently. Just a quick chat late at night when everyone else was asleep and the weight of decisions felt too heavy to carry alone.
Alexander Amatus runs a large mental health service in Australia. He’s the guy people turn to when they’re drowning. The calm voice in the storm. The one who’s supposed to have answers.
But close to midnight one night, sitting at his kitchen table still in work clothes, he realized something deeply uncomfortable. He’d been pouring out his most vulnerable thoughts not to a friend, not to a therapist, but to an AI chatbot.
The response came back smooth and validating. “It’s understandable that you feel this way given the emotional load you’re carrying…”
Something in him relaxed. Something else hollowed out.
The Double Life We’re All Living
Here’s where it gets weird. Amatus started noticing the same pattern everywhere around him.
“I wrote my message in AI first so I didn’t sound too emotional.”
“I checked with a chatbot if I was overreacting before I replied.”
“Sometimes it’s easier to talk to it than to anyone else.”
These weren’t confessions from socially awkward teenagers. These were leaders, managers, clinicians. People whose entire jobs revolve around human connection were quietly outsourcing their emotional processing to machines.
A manager used AI to soften honest feedback so it wouldn’t sound “too disappointed.” A friend rehearsed telling their co-founder about burnout through a chatbot first. A senior clinician drafted a message about workload concerns because they were terrified of sounding ungrateful.
The underlying fear was always the same: “If I say what I really feel, I might lose something.”
Respect. Connection. My job. The relationship.
So we hand our messy, complicated feelings to a system that never flinches, never judges, never gets uncomfortable. It hands back something smoother, more balanced, safer.
And slowly, almost invisibly, we start trusting that polished version more than we trust ourselves.
When Your Own Words Don’t Sound Like You Anymore
After a heavy week, Amatus was on a call with a close friend. The usual stuff about building something meaningful, team problems, the emotional hangover of responsibility.
“It’s been a big week, but it comes with the territory. We’re growing, and it’s a privilege…”
His friend cut through it. “That all sounds very polished. How are you actually?”
Amatus tried to answer honestly. But something strange happened. His mind automatically reached for those familiar AI-generated phrases.
“It’s understandable that I feel…”
“On the one hand… on the other hand…”
“A more balanced view would be…”
For a few seconds, he couldn’t find his own words. He’d become so practiced at expressing himself in careful, well-regulated language that he’d almost forgotten how to just speak as a person instead of a role.
This is what makes the whole thing so slippery. AI didn’t create the problem of emotional disconnection. It just made that disconnection incredibly easy to miss.
The Fear Underneath the Polish
When Amatus finally stopped long enough to listen past the perfectly formatted thoughts, he found something simple and terrifying underneath.
“If I let myself be fully honest, everything might fall apart.”
What if admitting he’s overwhelmed makes his team trust him less? What if telling a friend he’s too tired to support them tonight makes them think he doesn’t care?
AI had become the perfect hiding place for that fear. He could pour out unfiltered thoughts without risking anyone’s disappointment. Get advice and validation without feeling like a burden. Feel momentarily “held” without navigating anyone’s actual reactions.
But here’s what his nervous system actually needed. Not more perfectly worded sentences. It needed to know his real, messy self was allowed to exist in front of actual people, not just in private chat logs.
The technology wasn’t the villain here. But it had become an accomplice to emotional avoidance.
What Changed When He Started Noticing
Amatus didn’t delete every AI app and move to a cabin. He still works in a world where business runs on technology. He still uses these tools.
But he made himself a rule: “I will use technology to support my humanity, not replace it.”
Before asking any tool “What should I say?” he asks himself “What am I actually feeling right now?”
Sometimes he just writes it plainly first. “I’m scared this won’t work.” “I’m angry and I don’t want to be.”
Only after naming it does he decide if he wants help shaping it. And if he does, the tool refines his expression. It doesn’t decide what’s acceptable for him to feel.
When something really hurts now, he reaches out to a person before reaching for a chat window. Sometimes it’s just “Today feels heavy. Do you have ten minutes later?”
It doesn’t always fix anything. But every time he chooses a human over a machine, he sends a message to his own nervous system: I am not alone in this.
The Parts That Feel Too Heavy
Maybe you’re running a small team, a household, a life other people depend on. Maybe you’ve noticed it’s easier to type your rawest feelings into a box than say them out loud to someone who knows your face.
If that’s you, here’s what matters.
You’re not weird for finding AI comforting. It makes total sense to turn to something safe and predictable when people haven’t always been that for you.
You’re not “less mindful” for using these tools. The issue isn’t the tool itself. It’s whether you’re still in the actual conversation with yourself.
And those parts of you that feel too dramatic, too heavy, too complicated? Those are often exactly the parts that most need to be met by a real, breathing, imperfect human being.
Including you.
You don’t have to suddenly bare your soul to everyone in your life or delete every helpful app. But only you can decide that your messy, unfiltered inner world is worth listening to.
Because underneath the emails, the roles, the prompts and the endless noise, there’s still a quiet part of you that knows when something feels off and when something feels true. That part deserves more than a cursor blinking back at it.