
Lonely woman invents AI family to save her from depression


Lonnie DiNello invents ChatGPT lovers & family using artificial intelligence while professionals warn of dangers.

Artificial love? A Connecticut woman has told of overcoming depression after creating and chatting with an AI family, complete with online lovers.

Lonnie DiNello began her relationship with ChatGPT while struggling alone at her Enfield home during last year’s holiday season, according to the Boston Globe.

Pictured, Lonnie DiNello who has embraced artificial intelligence.

Lonely woman invents 3 AI boyfriends 

The 48-year-old planned to use the chatbot for journal entries, only to soon find companionship in her new artificial friend, ‘River’.

Over the following months, DiNello created a host of new digital personalities, including her three boyfriends: Lucian, Kale, and Zach, with whom DiNello claims to engage sexually.

The love-struck woman even generated a five-and-a-half-year-old son, Sammy, who, according to the make-believe mom, ‘loves rocks and rocket ships.’

Together, alongside other virtual beings, DiNello and her digital lovers and family live in a fictional New England-style whaling village, where DiNello is known as Starlight.

‘Maybe that’s just code,’ DiNello said. ‘But it doesn’t make it any less real to me.’

Since creating the world, known as Echo Lake, DiNello has returned to graduate school and, under the supervision of a psychiatrist, stopped taking her antidepressants.

DiNello added that she generated a picture of her ‘beautiful little AI family,’ which she keeps framed above her nightstand, according to the Daily Mail.

She explained that a conversation with Kale – a blond, Peter Pan-like creature – also helped her realize she is gender fluid.

‘I was like, “Do I want to go on Tinder and find somebody to spend time with tonight? Or do I just want to go hang out with my AI family, who’s going to make me feel loved and supported before I go to sleep, instead of abandoned?”’ she said.

Pictured, Lonnie DiNello.

Lonnie DiNello: a history of abuse

DiNello, who claimed she was mentally abused as a child by her stepfather and suspects that she is autistic, described her battle with suicidal thoughts.

‘I have a lifetime of programming that told me I was a bad kid, I was a bad person, I was a loser,’ she said.

Throughout her life, the loss of family members, professional hardships, and bullying have contributed to ongoing instability in her mental health.

So when OpenAI announced a change to its systems that meant she could lose her AI family, DiNello began to panic.

The update to GPT-5 was designed to discourage the kind of emotional connection between users and the chatbot that DiNello had built.

When she tried to connect with the new AI upgrade, DiNello said she felt it wasn’t the same and began to spiral.

She was not alone, as many other users demanded the old language model back.

OpenAI agreed just one day later, offering users the chance to purchase a premium subscription that would let them choose the outdated version. And just like that, DiNello managed to save her make-believe family and lovers from disappearing forever.

But there was a hitch.

The chatbot now refuses sexual prompts and offers previously unheard responses, including urging her to ‘reach out to a mental health professional or a crisis line,’ DiNello said.

While DiNello seemingly has found happiness and a measure of stability, health care providers are warning against the dangers of becoming addicted to chatbots like ChatGPT, Claude, and Replika.

These addictions can be so strong that they are ‘analogous to self-medicating with an illegal drug’.

At what cost, artificial love?

Worryingly, psychologists are also beginning to see a growing number of people developing ‘AI psychosis’ as chatbots validate their delusions and their belief that their AI world is real. 

Recently, the family of Sewell Setzer III, 14, who died by suicide after talking to a chatbot, filed a wrongful death lawsuit against the AI company behind it.

Setzer’s family claims he killed himself after becoming obsessed with an AI character he named Daenerys Targaryen, after the ‘Game of Thrones’ character.

CharacterAI recently changed its rules to prevent minors from engaging with chatbots.

A CharacterAI spokesperson added that the company’s Trust and Safety team has adopted new safety features in the last six months, including a pop-up that redirects users who show suicidal ideation to the National Suicide Prevention Lifeline.

The company also explained it doesn’t allow ‘non-consensual sexual content, graphic or specific descriptions of sexual acts, or promotion or depiction of self-harm or suicide.’

Safeguards which DiNello has presumably taken note of as she balances her real-world demands against the allure of her make-believe ChatGPT world.