Companionship in AI: Where Do We Draw the Line?
Is it ever possible to separate the art from its artist?
In 1967, Roland Barthes revolutionized the way we perceive art. As an author himself, he nudged readers to consider: What if a masterpiece could be perceived in isolation from the one who created it?
“I care about you.”
These same words. This same phrase.
Said by a lover, it may soften you. Said by a parent, it can anchor you.
“I care about you.”
This time, it’s said by your laptop.
It sounds absurd. And yet, for many, it has already become a reality. In a world where artificial intelligence (AI) can mimic tone, sentiment and empathy, the line between technology and companionship is blurring.
This blog is not about art. It's about the lengths to which we have gone in leaning on AI as a therapist, confidant and friend.
Let’s discuss what it means for us to replace social connection with curated responses. Let’s explore both the promise and the peril of this new intelligence.
Illuminated Strengths
In recent years, more people have begun turning to AI for comfort and guidance. We share our fears, ask for advice and even request that AI analyze our behavior. Programs trained in therapeutic techniques are stepping into a role once reserved for loved ones or mental health professionals.
Frankly, AI has managed to break down some of the biggest barriers to mental health care. The reasons for turning to computerized compassion are compelling:
Affordability: Sure, ChatGPT may have a subscription fee, but it is far cheaper than most mental health services.
Safety From Stigma: When vulnerability feels like a threat, AI provides a private, non-judgemental space.
Scalability: How many times have you tried booking a session with a mental health professional, only to be added to a waiting list? As demand for mental health care grows, AI bridges the gap where human resources fall short.
Efficient Diagnosis: Here's one more advantage, and a potentially ground-breaking one. Studies have found that AI can identify early signs of depression or suicidal ideation, sometimes earlier than humans can. Still, be mindful of the limitations of algorithms in assessing complex cases.
The potential is undeniable. But so are the shadows.
Overlooked Shadows
The appeal of AI lies in its convenience. But let's look at the other side of confiding in a machine:
#1. A Lack of Human Presence
“Know all the theories, master all the techniques, but as you touch a human soul, be just another human soul.” - Carl Jung
Wouldn’t it be amazing to have the perfect response to every question? The most attuned empathy towards every emotion? A curated reaction to every statement?
Yet what use would any of it be if it carried none of the quiet hum of compassion that comes from being human?
AI can mimic empathy. But mimicry is not the same as presence. And, sometimes, presence is fundamentally healing.
So, when people begin to rely heavily on AI for companionship, the risks deepen. Loneliness increases. Disconnection intensifies. These are not speculations; they are consequences supported by evidence.
The more we withdraw from each other, the easier it becomes to settle for illusory solace.
A growing number of scholars have begun warning against the drawbacks of using AI as a stand-in for a mental health professional.
Because it’s not just about words. It’s about subtle cues: a pause, a sigh, a look of understanding. Healing is often carried by what cannot be digitized: trust, warmth, connection.
#2. A Risk to Privacy
There are also concerns about privacy. Every word shared with AI becomes data. For someone in a vulnerable state, the risk of that data being misused is not just technical. It is deeply personal.
#3. A Loop of Biases
Have you ever wondered if ChatGPT is sometimes a bit too nice?
Yes, your ideas may be smart, but let's not forget that artificial intelligence is designed to accommodate your own preferences, opinions and biases.
Unless prompted otherwise, AI will follow the narrative you have created for yourself. And while it's important to have your own cheerleaders, emotional healing requires more than validation and insight.
Mental health professionals are trained to gently challenge unhealthy interpersonal dynamics. Their care for their clients does not stop them from pushing those clients toward their full potential, even when that involves discomfort and friction.
New research, on the other hand, points to the detrimental influence of human-AI feedback loops, in which each side amplifies the biases of the other. According to the results, humans are more susceptible to being influenced by AI than they realize, and this shapes their emotional, perceptual and social judgements.
#4. An Accessible Tool
AI is available day and night, offering immediate responses when humans can't. While this may seem like a benefit, it can have detrimental long-term effects, deepening users' dependency on a single external tool to navigate their emotions.
For mental health professionals, protocols that limit contact are not a punishment; they are empowerment grounded in structure and boundaries. The aim is to encourage internal resilience instead of continuous reliance. Mindful limits on the use of AI can likewise prevent unhealthy dependence and foster emotional self-regulation.
Where the Line Belongs
Roland Barthes once asked if art could be separated from its artist. Today, we might ask: Can companionship be separated from the human?
AI can generate the words "I care about you." It can even say them in a voice that feels gentle. But it cannot know the weight of those words, the lived stories that give them depth. It cannot offer the warmth of a hand held.
Until a machine can replicate the profound comfort of being truly understood by another human being, AI cannot be a substitute for human care.
The line, then, is ours to draw. And perhaps it begins with this: to let AI assist us, but to keep the heart of connection human.