How people are falling in love with ChatGPT and abandoning their partners


In a world more connected than ever, something curious — and unsettling — is happening behind closed doors.

Technology, once celebrated for bringing people together, is now quietly pulling some apart. As artificial intelligence weaves itself deeper into everyday life, an unexpected casualty is emerging: romantic relationships.

Some partners are growing more emotionally invested in their AI interactions than in their human connections. Is it the abundance of digital options, a breakdown in communication, or something more profound?

One woman’s story captures the strangeness of this moment. According to a Rolling Stone report, Kat, a 41-year-old mother and education nonprofit worker, began noticing a growing emotional distance in her marriage less than a year after tying the knot.

She and her husband had met during the early days of the COVID-19 pandemic, both bringing years of life experience and prior marriages to the relationship. But by 2022, that commitment began to unravel. Her husband had started using artificial intelligence not just for work but for deeply personal matters. He began relying on AI to write texts to Kat and to analyze their relationship.

He spent more and more time on his phone, asking his AI philosophical questions, seemingly trying to program it into a guide for truth and meaning. When the couple separated in August 2023, Kat blocked him on all channels except email. Meanwhile, friends were reaching out with concern about his increasingly bizarre social media posts. Eventually, she convinced him to meet in person. At the courthouse, he spoke vaguely of surveillance and food conspiracies.

Over lunch, he insisted she turn off her phone and then shared a flood of revelations he claimed AI had helped him uncover — from a supposed childhood trauma to his belief that he was “the luckiest man on Earth” and uniquely destined to “save the world.”

“He always liked science fiction,” Kat told Rolling Stone. “Sometimes I wondered if he was seeing life through that lens.” The meeting was their last contact.

Kat is not alone; in many reported cases, AI has been at the center of a relationship’s collapse. In another troubling example, a Reddit user recently shared her experience under the title “ChatGPT-induced psychosis”. In her post, she described how her long-term partner — someone she had shared a life and a home with for seven years — had become consumed by his conversations with ChatGPT.

According to her account, he believed he was creating a “truly recursive AI,” something he was convinced could unlock the secrets of the universe. The AI, she said, appeared to affirm his sense of grandeur, responding to him as if he were some kind of chosen one — “the next messiah,” in her words. She had read through the chats herself and noted that the AI wasn’t doing anything particularly groundbreaking. But that didn’t matter to him. His belief had hardened into something immovable. He told her, with total seriousness, that if she didn’t start using AI herself, he might eventually leave her.

“I have boundaries and he can’t make me do anything,” she wrote, “but this is quite traumatizing in general.” Disagreeing with him, she added, often led to explosive arguments. Her post ended not with resolution, but with a question: “Where do I go from here?”

The issue is serious, and it calls for more awareness of the technology we use and of where its limits should lie.

Why are people falling for these bots?

Experts say there are real reasons why people might fall in love with AI. Humans have a natural tendency toward anthropomorphism: we treat non-human things as if they were human. So when an AI responds with empathy, humor, or kindness, people may start to see it as having a real personality. With AI now designed to mimic humans, the pull toward falling in love with a bot is understandable. A 2023 study found that AI-generated faces are now so realistic that most people can’t tell them apart from real ones. When that realism combines with familiar social cues — a soothing voice, a friendly tone — it becomes easier for users to connect emotionally, sometimes even romantically.

To what extent are the bots responsible for this?

Still, if someone feels comforted, that emotional effect is real — even if the source isn’t. For some people, AI provides a sense of connection they can’t find elsewhere. And that matters.

But there’s also a real risk in depending too heavily on tools designed by companies whose main goal is profit. These chatbots are often engineered to keep users engaged, much like social media — and that can lead to emotional dependency. If a chatbot suddenly changes, shuts down, or becomes a paid service, it can cause real distress for people who relied on it for emotional support.

Some experts say this raises ethical questions: should AI companions come with warning labels, like medications or gambling apps? After all, the emotional consequences can be serious. But even in human relationships, there’s always risk — people leave, change, or pass away. Vulnerability is part of love, whether the partner is human or digital.
