19 November 2025

AI Has Supercharged the Parasocial Relationship Problem

You can now get an AI "girlfriend" or "boyfriend". In other words, the digital generation is becoming even more alone, with no end in sight.


From The European Conservative

By Lauren Smith

Mistaking digital dopamine for human connection is no way to solve the loneliness epidemic.

'Parasocial' has been named word of the year for 2025 by Cambridge Dictionary. According to the dictionary itself, this describes “a connection that someone feels between themselves and a famous person they do not know, a character in a book, film, TV series, etc., or an artificial intelligence.” The term was coined in 1956 by sociologists Donald Horton and Richard Wohl to articulate the then-new phenomenon of audiences developing one-sided relationships with television personalities. The phrase ‘parasocial relationships’ has only recently come into popular usage, particularly online.

It’s no wonder that the word has taken off lately. As our media has become more and more interactive, people’s relationships have become more parasocial. In the 1950s, you might have had a favourite TV or movie star—you might have seen photos of them and read about their lives in gossip columns. You might have had a favourite singer or band, bought their records, and attended their concerts. None of that compares to the access fans today have to celebrities. Now, watching your favourite actor’s films isn’t enough—you can follow them across social media and receive near-constant updates about their life in text, photo, and video form. If you love a singer, why stop at simply going to their nearby shows? You can now pay thousands to follow them across the world, as some particularly dedicated fans did during Taylor Swift’s seemingly endless Eras tour. You can pay even more for merch and meet-and-greets. You can speculate about their personal lives and their politics, and construct entire fantasies in your head about who this stranger is and what they might be like—all while knowing virtually nothing about them.

Taylor Swift is an excellent example of this. She has a legion of rabid fans, many of whom seem to believe that they have some sort of special connection to her. They feel entitled to know everything about her, and have conjured up wild conspiracy theories that she might secretly be a lesbian. Some of them have shaped their entire lives around Swift and her music, even going so far as to hold Taylor Swift-themed weddings or to name their children after her songs.

This is what a parasocial relationship can look like when it comes to celebrities. And this is bad enough. But in the last couple of years, AI has made this situation so much worse. Those hardcore fans can now speak to a fictional, digital version of their celebrity crush 24/7. There are plenty of apps that allow users to pick from hundreds of actors, singers, online personalities, and fictional characters to chat with. Character.ai is one of the most popular of these so-called companion AI apps, with roughly 20 million monthly users. It has been at the centre of numerous recent controversies, facing lawsuits from parents who claim that using the chatbot caused significant mental harm to their children. One of the mothers, Megan Garcia, has accused Character.ai of being responsible for her son’s death. In 2024, 14-year-old Sewell Setzer III shot himself with his stepfather’s pistol after becoming isolated from his friends and family and obsessed with a chatbot designed to behave like the Game of Thrones character Daenerys Targaryen. Under pressure from the lawsuits, as well as from a proposed bill that would prohibit minors from conversing with AI companions, Character.ai announced last month that it would ban users under 18 from talking with its chatbots.

AI is rapidly improving, becoming more ubiquitous, more intelligent, and more human-like. As a result, ever more people are being exposed to chatbots like ChatGPT, Claude, Gemini, and Grok. You don’t have to be a superfan of an actor, singer, or fictional character to get sucked into a parasocial relationship with AI. Growing numbers of people have begun using AI as a replacement for all kinds of human interaction—from romance to friendship to therapy. AI developers create specific products for these purposes: Grok, X’s built-in AI, rolled out a new digital girlfriend in September, whom users can flirt with, strip down to her underwear, and draw into sexual roleplay. Illinois became the first U.S. state to outlaw AI-powered therapy earlier this year, but plenty of such apps remain on the market elsewhere. Ash, for instance, is an AI chatbot designed to act as a therapist, offering support for “stress, anxiety, relationships, or a bad day.”

In a way, it’s not surprising that these kinds of companions have taken off. We are currently living through what has been labelled a ‘loneliness epidemic.’ Community life across the West has long been in retreat. Already frayed social bonds were all but severed during the COVID-19 pandemic and have failed to recover since. A recent survey by the World Health Organisation found that one in six people globally feels lonely. For young people, the situation is much worse—one in five teenagers is affected by loneliness. Despite having access to endless means of communication, from social media to AI chatbots, people seem to feel more socially isolated today than ever. If anything, a growing reliance on screens for companionship might actually be making the loneliness epidemic worse. Staying at home and watching Twitch streamers, scrolling TikTok, or talking to ChatGPT might feel easier than going out with friends, but it has devastating effects in the long run: the longer someone goes without interacting with real people, the more isolated they become.

Humans need interaction with others in order to stay mentally and even physically healthy—but that interaction must also be human. Important relationships cannot be replaced with algorithms, chatbots, and screens. This is why, despite being surrounded by opportunities for engagement, people ironically feel lonelier than ever. We tend to look down on those who develop an unhealthy attachment to celebrities—and for good reason. But we need to take the same view of users who become obsessed with their AI chatbots. After all, that is just as much a parasocial relationship as seriously believing that Harry Styles might fall in love with you. AI, too, has no awareness of your existence. A chatbot will sound interested, invested, even passionate, because that is what its code demands.

We cannot allow these AI parasocial relationships to become normalised. It is neither normal nor healthy to feel a ‘bond’ with lines of code and text on a screen. If we’re serious about tackling the loneliness epidemic, then we have to stop pretending that digital intimacy is a substitute for the real thing. Only living, breathing human beings can meet our need for real connection.

Pictured: An insanely beautiful model as an AI girlfriend. 
