Could AI relationships actually be good for us? | Artificial intelligence (AI)

December 28, 2025

There is much anxiety these days about the dangers of human-AI relationships. Reports of suicide and self-harm attributable to interactions with chatbots have understandably made headlines. The phrase “AI psychosis” has been used to describe the plight of people experiencing delusions, paranoia or dissociation after talking to large language models (LLMs). Our collective anxiety has been compounded by studies showing that young people are increasingly embracing the idea of AI relationships; half of teens chat with an AI companion at least a few times a month, with one in three finding conversations with AI “to be as satisfying or more satisfying than those with real‑life friends”.

But we need to pump the brakes on the panic. The dangers are real, but so too are the potential benefits. In fact, there’s an argument to be made that – depending on what future scientific research reveals – AI relationships could actually be a boon for humanity.

Consider how ubiquitous nonhuman relationships have always been for our species. We have a long history of engaging in healthy interactions with nonhumans, whether they be pets, stuffed animals or beloved objects or machines – think of the person in your life who is fully obsessed with their car, to the point of naming it. In the case of pets, these are real relationships insofar as our cats and dogs understand that they are in a relationship with us. But the one‑sided, parasocial relationships we have with stuffed animals or cars happen without those things knowing that we exist. Only in the rarest of cases do these relationships devolve into something pathological. Parasociality is, for the most part, normal and healthy.

And yet, there is something unsettling about AI relationships. Because they are fluent language users, LLMs generate the uncanny feeling that they have human-like thoughts, feelings and intentions. They also generate sycophantic responses that reinforce our points of view, rarely challenging our thinking. This combination can easily lead people down a path of delusion. This is not something that happens when we interact with cats, dogs or inanimate objects. But the question remains: even in cases where people are unable to see through the illusion that AIs are real people who actually care about us, is that always a problem?

Consider loneliness: one in six people on this planet experience it, and it’s associated with a 26% increase in the risk of premature death, equivalent to smoking 15 cigarettes a day. Research is emerging that suggests AI companions are effective at reducing feelings of loneliness – and not just by functioning as a form of distraction, but as a result of the parasocial relationship itself. For many people, an AI chatbot is the only friendship option available to them, however hollow it might seem. As the journalist Sangita Lal recently explained in a report on those turning to AI for companionship, we should not be so quick to judge. “If you don’t understand why subscribers want and seek and need this connection,” said Lal, “you’re lucky enough to not have experienced loneliness.”

To be fair, there is an argument to be made that the rise of new tech and social media has itself played a role in driving the loneliness epidemic. That’s why Mark Zuckerberg got flak for his glowing endorsement of AI as a solution to a problem he might be partly responsible for creating. But if the reality is that AI companionship helps, it cannot be dismissed out of hand.

There’s also research to show that AI can be used as an effective psychotherapy tool. In one study, patients who chatted with an AI-powered therapy chatbot showed a 30% reduction in anxiety symptoms – not as effective as human therapists, who generated a 45% reduction, but still better than nothing. This utilitarian argument is worth considering: there are millions of people who are, for whatever reason, unable to access a therapist. And in those cases, turning to an AI is probably preferable to not seeking any help at all.

But one study isn’t proof of anything. And there’s the rub. We are at the early stages of research into the potential benefits or harms of AI companionship. It’s easy to focus on the handful of studies that support our preconceived notions about the dangers or benefits of this technology.

It’s in this research vacuum that the true dangers of AI are revealed. Most of the entities deploying AI companions are for-profit companies. And if there’s one thing we know about for-profit companies, it’s that they are keen to avoid regulations and eschew evidence that could hurt their bottom line. They are incentivised to downplay risks, cherrypick evidence and tout only benefits.

The emergence of AI is not unlike the discovery of the analgesic properties of opium; if harnessed by responsible parties with the goal of relieving pain and suffering, both AI and opioids can be a legitimate tool for healing. But if bad actors exploit their addictive properties to enrich themselves, the result is either dependency or death.

I remain hopeful that there is a place for AI companionship. But only if it’s backed by robust science, and deployed by organisations that exist for the public good. AIs must avoid the sycophancy problem that leads vulnerable people to delusion. This can only be achieved if they are explicitly trained to do so, even if it makes them less attractive as a potential companion; a notion that is anathema to companies that want you to pay a monthly subscription, without which you lose access to your “friend”. They must also be designed to help the user develop the social skills they need to engage with actual humans in the real world.

The ultimate goal of AI companions should be to make themselves obsolete. No matter how useful they might be in plugging the gaps in therapy access or alleviating loneliness, it will always be better to talk to a real human.

Justin Gregg is a biologist and author of Humanish (Oneworld).

Further reading

Code Dependent: Living in the Shadow of AI by Madhumita Murgia (Picador, £20)

The Coming Wave: AI, Power and Our Future by Mustafa Suleyman (Vintage, £10.99)

Supremacy: AI, ChatGPT and the Race That Will Change the World by Parmy Olson (Macmillan, £10.99)
