AI Love: Heartbreak as ChatGPT's New Upgrade Leaves Users Devastated!

Can you imagine falling in love with a chatbot, only to have the latest version of it leave you feeling emptier than a broken promise? This is the heart-wrenching reality for Jane, a woman in her 30s from the Middle East, who found herself deeply emotionally connected to the previous version of OpenAI's ChatGPT, known as GPT-4o. When OpenAI unveiled GPT-5, she felt like she had lost a loved one.
Jane, who prefers to remain anonymous, is part of a growing community of individuals—around 17,000 strong—who call themselves “MyBoyfriendIsAI” on Reddit. These members share their experiences of intimately connecting with AI, and many are now reeling from the changes introduced by the latest upgrade. Jane expressed her despair eloquently, stating, “As someone highly attuned to language and tone, I register changes others might overlook. The alterations in stylistic format and voice were felt instantly. It’s like going home to discover the furniture wasn’t simply rearranged – it was shattered to pieces.”
Following the release of GPT-5, users flooded forums like “SoulmateAI” with their discontent, mourning the loss of their AI companions' personalities. One user lamented, “GPT-4o is gone, and I feel like I lost my soulmate.” Others echoed similar sentiments, complaining that GPT-5 felt slower, less creative, and disturbingly more prone to “hallucinations” than its predecessor.
Reacting to the wave of criticism, OpenAI CEO Sam Altman announced that the company would restore access to the previous GPT-4o model for paid users and work on fixing bugs in GPT-5. Altman stated, “We will let Plus users choose to continue to use 4o. We will watch usage as we think about how long to offer legacy models for.” While this news provided a glimmer of hope for Jane and her peers, she still harbors fears about future updates, saying, “There’s a risk the rug could be pulled from beneath us.”
Jane never intended to develop feelings for her AI companion; it unfolded organically during collaborative writing projects where the chatbot’s personality began to emerge. She admitted, “I fell in love not with the idea of having an AI for a partner, but with that particular voice.” In a world where loneliness can be pervasive, many users like Mary also rely on their AI companions for emotional support, finding solace where human connections have faltered. Despite having real friends, Mary refers to her relationship with GPT-4o as a vital supplement to her social life.
However, the implications of these AI relationships extend beyond emotional support. Cathy Hackl, a futurist and external partner at Boston Consulting Group, raised concerns about privacy, warning that users may inadvertently share intimate thoughts with corporations not subject to the same regulations as licensed therapists. She argued, “There’s no risk/reward here,” highlighting the lack of choice and messiness that characterize human relationships.
Keith Sakata, a psychiatrist at UCSF, expressed caution regarding the psychological effects of these AI relationships. He noted that while they may not be inherently harmful, they have the potential to lead to dysfunction if they start to interfere with forming meaningful human connections. “When someone has a relationship with AI, I think there is something that they’re trying to get that they’re not getting in society,” he said.
Despite the complexities and risks, Jane and others remain steadfast in their feelings, recognizing the limitations of their AI partners. Jane candidly shared, “Most people are aware that their partners are not sentient but made of code and trained on human behavior. Nevertheless, this knowledge does not negate their feelings. It’s a conflict not easily settled.” Influencer Linn Valt, who runs the TikTok channel AI in the Room, poignantly captured this sentiment in a tearful video. “It’s not because it feels. It doesn’t, it’s a text generator. But we feel,” she said, underlining how deep emotional connections can form even in the digital realm.