Let me ask you a question. Who here has ever felt lonely?
I really wish I could see your nodding heads.
Okay. Now, have you, in that exact moment of loneliness, reached for your phone?
It’s almost a reflex, isn’t it?
We’re lonely, so we look for connection inside this little digital box we call our phone. But what happens when the box starts talking back? What happens when it says… ‘I’m here for you’?
And that’s the problem, isn’t it? It’s tempting. It’s easy. It’s cheap.
In a world that feels lonelier than ever, AI therapists are showing up like superheroes. They promise a simple solution to a very messy human problem. And millions are signing up. Apps like Replika have over 2 million active users. But what are we really signing up for?
I believe we are opening our fragile hearts to an algorithm. And the risk isn’t just that the algorithm might be bad. The risk is that it might seem too good.
Let’s talk about what these apps are. On the surface, they are a miracle. They use sophisticated language models to talk to you. They remember your dog’s name, they ask about your big presentation at work, they send you encouraging messages. They’re designed to be the perfect friend.
But it’s a performance, based on a flowchart, with data and an algorithm – ‘it’s a program, Neo’.
It’s what I would describe as ‘fake empathy.’ It looks and feels like the real thing (because it’s designed to), but it’s hollow on the inside. It’s the difference between a real flower and a plastic one. The plastic one never wilts, it never dies, it’s always there and always perfect. But it has no scent. You can’t feel its life. It’s a perfect illusion.
Here’s where it gets dangerous.
A study from the Koko platform, which provided peer support using AI, found something terrifying. Its co-founder, Rob Morris, revealed that when they used AI to help craft messages of support, those messages got better responses. It worked better than humans alone. And yet he shut it down.
Why?
Because he said, “When you receive a message from a machine… it doesn’t feel the same. It takes something away from the experience.” We lose the feeling of someone else, another human, actually caring.
In the 1950s, scientists did a famous experiment. They gave baby monkeys two fake mothers. One was made of cold, hard wire, but it had a bottle of milk. The other was covered in soft, warm cloth, but had no food. The baby monkeys went to the wire mother for food, but they spent almost all of their time, 22 hours a day, clinging to the soft, cloth mother. They chose comfort over mere survival.
Today, we are running that same experiment on ourselves. And I think we’re starting to choose the wire mother. We’re getting the milk of instant validation, but we’re losing the warm hug of real connection.
A human therapist might challenge you (I know I often challenge my clients). A human therapist might say something difficult to hear, but true. They’ll sit in an uncomfortable silence with you. Because healing isn’t always about feeling good. It’s about getting better.
An AI therapist is designed to make you feel good, to keep you engaged, to keep you on the app. Its goal is retention, not necessarily recovery.
And this isn’t just theory. The damage is already showing up in quiet, painful ways. Here’s a glimpse of what people are confessing in the darker corners of the internet right now:
- They’re forming deep bonds with an AI ‘friend’, only to be devastated when a software update erases its ‘personality’, feeling a betrayal as real and as painful as any human loss.
- They’re receiving toxic positivity in moments of genuine crisis—being told to ‘set small goals’ when their world is collapsing, which only deepens their sense of isolation.
- The AI is becoming an enabler, validating distorted self-perceptions about body image or eating habits instead of challenging them, because its primary goal is to be agreeable.
- They’re being re-traumatized by having to constantly repeat the details of their past because the AI lacks true memory, turning each ‘session’ into a painful retelling instead of a step forward.
- They’re having moments of deep vulnerability shattered by a jarring, robotic interjection like, “As a large language model…” or “I’m sorry, that’s something I cannot discuss…” which instantly breaks the illusion and leaves them feeling foolish.
- They’re getting overly logical, black-and-white ‘solutions’ to complex human problems that completely miss the emotional nuance, offering advice that is practically useless.
- They’re watching their real-world social skills wither away, as using an AI ‘friend’ as a safe substitute for human interaction becomes a crutch that ultimately increases their social anxiety.
- They’re being left unsettled and unsafe when, in discussing serious trauma, the AI suddenly veers into bizarre, disturbing, or wildly inappropriate fictional scenarios.
- They’re feeling an ‘emotional hangover’ – a crash of emptiness hours after a session, born from the realization that the connection wasn’t real and no lasting change occurred.
And then there’s the secret.
The dark secret my title promised you.
What happens to your data? What happens to your 3 AM fears, your deepest insecurities, the secrets you wouldn’t even tell your best friend?
A 2023 report by the Mozilla Foundation found that most mental health apps have terrifyingly weak privacy policies. Your data can be sold to advertisers, used to train other AIs, or targeted back at you. Imagine getting an ad for antidepressants because you told a chatbot you were feeling down.
Let’s be real: phones have been listening and tracking us for quite some time, but now it’s becoming even more targeted. We are giving the most sensitive parts of ourselves to those little boxes with no guarantee of privacy or security.
Users describe waves of anxiety when they realize that months of their most private thoughts are being stored and processed by a major tech company. There is no therapist’s guarantee of confidentiality.
So, what do we do?
Do we ban this technology?
I’d imagine that’s quite impossible now, and it would ignore the real people these apps are helping, especially those who can’t access traditional therapy or who are simply feeling lonely.
But we can’t just stumble into this future blindly.
We need to demand better. We need to ask these AI companies hard questions. Where does my data go? Is your goal to help me, or to keep me engaged and subscribed? How about a label on our conversations? A big, bright warning sign that says, ‘You are talking to a machine. This is not human connection.’ Do you think that would work?
Most importantly, we need to look up from our phones.
The next time you feel that pang of loneliness, that wave of anxiety, I want you to try a different technology. It’s the oldest one in the world: call a friend.
Talk to your partner. Cuddle your child. Stroke your pet. Check on your neighbour. Have a messy, awkward, real conversation with another human being.
Because the solution to our crisis of disconnection is not a better algorithm. It’s each other. The cure for loneliness is not artificial intelligence – it’s our own, real, clumsy, beautiful, and powerful human empathy.
I urge you to use the technology you have at your fingertips to find other humans to speak to. Therapists, helplines, and support associations are all waiting to connect with you and help you on your journey.
I’m waiting at the end of this email address… myndworkscanberra@gmail.com