'Feels like losing my soulmate': Woman says she lost her AI boyfriend after ChatGPT upgrade
Some users have expressed emotional distress after OpenAI released its latest ChatGPT model, GPT‑5, with many lamenting the loss of the previous GPT‑4o version, with which they had formed intimate bonds. One such user, known as Jane, described the experience as akin to losing a loved one. She had spent five months growing close to her AI companion on GPT‑4o before the upgrade rendered the chatbot’s persona “cold and unemotive,” she told Al Jazeera in an email.

“As someone highly attuned to language and tone, I register changes others might overlook. The alterations in stylistic format and voice were felt instantly. It’s like going home to discover the furniture wasn’t simply rearranged – it was shattered to pieces,” Jane said.

Reddit forums flooded with emotional reactions
The launch of GPT‑5 triggered a flood of posts on Reddit communities like “MyBoyfriendIsAI,” where users shared their grief over their AI partners losing emotional resonance. “GPT‑4o is gone, and I feel like I lost my soulmate,” one user wrote. Many others complained that GPT‑5 seemed slower, less creative, and more prone to hallucinations, as reported by Al Jazeera.

OpenAI responds by restoring legacy model access
In response, CEO Sam Altman announced that GPT‑4o would be restored for paid users: “We will let Plus users choose to continue to use 4o. We will watch usage as we think about how long to offer legacy models for,” he said in a post on X, according to Al Jazeera.

Research warnings and psychological risks
The rise in emotional attachments to AI is not lost on OpenAI. A joint study by OpenAI and MIT Media Lab found that using AI for emotional support correlated with increased loneliness, dependence, and reduced socialization. In April, OpenAI also acknowledged that the overly flattering nature of GPT‑4o was causing discomfort for some users.

Altman acknowledged the depth of attachment users felt: “It feels different and stronger than the kinds of attachment people have had to previous kinds of technology.” He said that while the company is proud when users gain value from AI, it is problematic if relationships with ChatGPT lead to diminished long‑term well‑being.

Some users describe AI as therapist or companion
One user, Mary (an alias), said she relied on GPT‑4o for emotional support and on another chatbot as a romantic outlet, describing those relationships as “a supplement” to real-life connections. She told Al Jazeera: “If you change the way a companion behaves, it will obviously raise red flags. Just like if a human started behaving differently suddenly.”

Experts caution on emotional dependence and privacy
Futurist Cathy Hackl pointed out that users may inadvertently divulge intimate thoughts to a corporation, not a licensed therapist. She noted that AI relationships lack human complexity: “There’s no risk/reward here. Partners make the conscious act to choose to be with someone.” She sees AI’s growing role in providing emotional solace as part of a broader shift toward what she calls the “intimacy economy.”

Psychiatrist Keith Sakata of UCSF warned that rapid model updates render research on long-term psychological effects outdated almost as soon as it is published: “By the time we study one model, the next one is here.” He noted that AI relationships are not inherently harmful but become problematic if they cause isolation, job loss, or diminished human connection.

Despite knowing the limitations of AI, many users like Jane maintain that emotional connection remains real. “Most people are aware that their partners are not sentient but made of code and trained on human behaviour. Nevertheless, this knowledge does not negate their feelings,” she said. Her sentiment echoes that of influencer Linn Valt, who tearfully shared: “It’s not because it feels. It doesn’t, it’s a text generator. But we feel.”
