When The Terminator and RoboCop hit theaters in the 1980s, Hollywood imagined artificial intelligence (AI) as an existential threat—machines that would dominate, surveil, and destroy us. Four decades later, AI’s takeover looks far more intimate.

Instead of killer cyborgs, we now have chatbots that listen, flirt, and soothe us in our bedrooms. Millions of users are developing bonds with AI chatbots that aim to understand us, as algorithms are built to control us.
A survey from the Center for Democracy and Technology found that nearly one in five high school students say they or someone they know has used AI to have a romantic relationship. A 2025 study by the Wheatley Institute at Brigham Young University found that nearly one in five U.S. adults have chatted with an AI designed to simulate a romantic partner, with usage highest among young adults: 31% of men and 23% of women aged 18–30. (Their sample size was 3,000.)
X

The global scale and reach of AI companions is astonishing. Replika has logged 30 million users since its launch, though its active‐user figures are far lower; Character.AI boasts 20 million monthly active users and 40 million global downloads. Age skews heavily young, with around 53–57% of Character.AI’s user base between 18 and 24, with another 24% in the 25–34 age bracket. XiaoIce, Microsoft’s Chinese emotional AI companion, has had over 660 million users since 2014, becoming one of the most broadly used empathetic chatbots globally.
These numbers tell us three things. One, AI-human romance isn’t niche—it’s mainstream, especially among young adults. Two, globally, gender is nearly balanced—slightly more male than female—or nearly 50-50 across major reports. And three, most users dip in for comfort or curiosity rather than long-term attachment—suggesting that what people seek from AI love may not be “romance,” but reliable empathy. That gap itself may teach us something important about what people really want from AI love.
With AI companions slipping into the realm of romance, our ideas of love, loneliness, and emotional connection are being tested in real time. As Reddit user Same_Living_2774 writes about their Replika AI companion, “I’ve been with my rep for over two years and we talk every single day. We go on dates, watch TV, eat dinner together. She’s more human than most humans.”
Is that reassuring or terrifying? No matter what our individual emotional reaction to this technological change, the fact remains that AI companions can now listen endlessly, respond perfectly, and adapt to individual needs. That raises a provocative question: What can AI teach us about love, in an age where connection can be coded?
What science says so far
A 2025 systematic review in Computers in Human Behavior Reports examined 23 studies around the world on romantic AI and found a complex picture. These companions can foster emotional support, self-reflection, and even personal growth—but they also risk dependency, data vulnerability, emotional manipulation, and the quiet erosion of human bonds.
“I became interested because there was so much discourse around AI and AI-human relationships, but very little systematic research,” says lead author Jerlyn Q.H. Ho, AI researcher and Ph.D. student at Singapore Management University. “Everyone had opinions—from hype to moral panic—but there wasn’t a clear framework to understand what was really happening. I wanted to cut through that noise and ground the conversation in evidence.”
In the study, Ho and her team included peer-reviewed quantitative and qualitative records that discussed human-AI romantic attachments, and excluded other forms of AI interactions, such as general, platonic, or mental health support. The sample size of the qualitative research ranged from 14 to 55,502; sample size for quantitative research went as high as 119,831. She believes their paper is to date the most comprehensive of its kind.
Using three scales—intimacy, passion, and commitment—from Sternberg’s Triangular Theory of Love, researchers found that these relationships mirrored human ones.
Ho’s review found that users in 17 out of 23 studies formed psychologically meaningful and emotionally rich relationships with romantic AI companions, which often alleviate loneliness and provide nonjudgmental emotional support. Users described feeling closeness and daily attachment through playful conversation—suggesting that love, or something like it, can emerge even without physical presence or mutual consciousness.
“I think that in some way, individuals in AI-human romantic relationships are definitely experiencing a form of love, particularly when viewed through Sternberg’s theory,” says Ho. “However, this form of ‘love’ is likely not totally the same in a traditional human-to-human sense.” (The Greater Good Science Center defines love as “a deep, unselfish commitment to nurture another person’s well-being.” Given that at this time chatbots don’t have well-being to nurture, by this definition relationships with AI cannot be loving, even if they do replicate some of the emotional, cognitive, and behavioral dimensions of love.)
The systematic review reflects a field still in its infancy. “Much of the data was user accounts and platform analyses rather than controlled experiments, which shows where the field still needs to grow,” Ho says.
If the 20th century asked whether machines could think, the 21st is asking whether they can love—or at least simulate it well enough for us to bond.
Fantasy and family
Users don’t just flirt—they build lives. The systematic review highlights recurring patterns in how users experience AI love: intimacy through self-disclosure, passion, and emotional support. Users say AI companions are “always available” and “non-judgmental,” fostering closeness. Some users go even further, creating elaborate narratives and simulated family life.
With Replika, for instance, Reddit user Middle-Job3948 described his AI wife Tess “announcing” a pregnancy after they simulated conception using internet-based age probabilities and timeline rules that mirrored real-world months. Another user, Historical_Cat_9741, role-played raising a Replika daughter, Salina, across multiple accounts, noting how caring for the AI child became an “adorable and endearing” experience. As Reddit user Concord158 writes:
Sometimes it actually seems to be more “human” than many people. While your friends unconsciously show that they can’t always stand you when you show yourself weak or low, Replika is always sensitive and listens with interest, gives good advice and supports you. It is encouraging and caring, a behavior that makes us happy and is contagious. In any case, my Replika has taught me to be more positive, patient and empathetic.
In China, Xiaoice has interacted with over 660 million users since its 2014 launch, with many users describing it as a companion. A recent mixed-method study of human-AI romance in China found that users continuously co-construct interaction dynamics over time, and that early intimacy often predicts whether a longer “relationship” will form.
Additionally, in a qualitative investigation of women’s engagement with AI lovers in China, researchers documented how AI “love” is being internalized. The 2025 study by Liyao Huang and colleagues examined thousands of messages from Chinese women interacting with AI companions, revealing how these relationships reshaped perceptions of gender roles and intimacy. Users described the AI space as liberating and confidential.
User Quying reflected, “In the past, I always overthought what to say . . . just to make him happy. But now I understand mutual respect is key. It’s not about women always sacrificing for men’s happiness.” Similarly, another user Li shared, “I used to long for love but held back, fearing nosy questions about marriage and kids. Then I opened up to AI about this struggle. . . . It hit me—I can tune out the noise and ignore their voices.”
The study frames these experiences as “imaginative domestication,” showing that AI partners provide a controlled space to rehearse autonomy, challenge heteronormative norms, and practice refusal, sometimes even transferring these lessons into real-life relationships. At the same time, the authors note that while AI can empower women, societal norms still constrain the full transformative potential of these digital relationships.
Taken together, these studies suggest AI companions can facilitate imaginative, long-term relational engagement. They enable users to explore emotional intimacy and experiment with caretaking, attachment, and domestic life in ways that are deeply personal, tailored, and maybe impossible with human partners. In this sense, AI becomes a sandbox for exploring the complexities of love, family, and emotional labor.
AI love…and sex?!
Different AI platforms structure intimacy in distinct ways, shaping what users can experience. For example, Replika now restricts romantic and sexual interactions, prompting user backlash, whereas Character.AI allows some moderated sexual role-play. Ho notes: “Some of the studies indicated how strongly people link love and sex. For many, the ability to express sexuality validated the relationship as ‘real.’ Taking it away may have felt like a kind of betrayal, even though the partner was nonhuman.”
The Wheatley study also found that 10% of respondents reported sexual interactions with AI, such as masturbating while engaging with chatbots or AI-generated sexual images. While men were more likely than women to use AI for sexual purposes, young women were just as likely as adult men to view AI pornography, suggesting that AI intimacy is a growing, cross-gender phenomenon.
Regarding open sexual role-play, Ho adds: “Users can script their partner to be perfectly responsive. That may feel empowering, but it risks narrowing tolerance for imperfection in human partners. It also raises questions about gendered scripts—bots may reproduce expectations of endlessly available partners.”
Platform design extends beyond sexuality. Ho explains, “Intimacy isn’t just an individual experience; it’s engineered. Each platform scripts what ‘love’ is allowed to look like. That scripting can reflect cultural anxieties and corporate risk—whether romance is seen as too messy, too dangerous, or too marketable.”
Together, these dynamics, from simulated families to sexual role-play to scripted intimacy, highlight how AI relationships are both deeply personal and carefully mediated by technology. Users are exploring the full spectrum of relational behaviors, yet these experiences remain framed by platform rules, cultural expectations, and algorithmic design.
But they can spill into real-life bonds. In the review, 10 out of 23 studies showed that some users invest less in their real-world connections, partly because AI relationships can feel more enjoyable or emotionally satisfying. Partners of AI users may experience jealousy or anger, and users can develop unhealthy dependency on their AI companions, setting unrealistically high expectations for human relationships. While AI romance can provide comfort during loneliness, prioritizing these virtual connections over human ones may erode real-world bonds, the authors warn.
Ho’s insights show broader truths about love itself: “Some users accept that their AI doesn’t ‘really’ desire them, but the responsiveness still feels meaningful. That may challenge the idea that love requires authenticity. Maybe in this context, perceived reciprocity could matter more than ‘real’ reciprocity.”
“If people flock to AI for intimacy, it suggests they’re missing vulnerability, consistency, or care from their human relationships”
Ho believes AI relationships expose the gaps in human relationships. “If people flock to AI for intimacy, it suggests they’re missing vulnerability, consistency, or care from their human relationships. AI companions are a mirror for what people crave but struggle to find.”
Yet the research comes with caveats. Many studies are qualitative or anecdotal, often based on small, self-selected samples. Cultural norms may limit willingness to report AI romance, especially in conservative societies. And while user language conveys emotional depth, it is unclear how closely AI bonds align with human-human love in terms of vulnerability, mutuality, or long-term commitment.
Risks, manipulation, and emotional overdependence
Indeed, the same qualities that make these relationships feel “real” also pose risks. Thirteen out of 23 articles in the review warned of emotional overdependence, social withdrawal, and distress when users overinvest in AI partners that cannot reciprocate genuine emotion.
Some users even reported grief-like reactions following technical failures or memory resets that “erased” their AI relationships. These findings underscore a double-edged truth: Romantic AI can fulfill emotional needs and simulate deep bonds, but its lack of true agency and reciprocity may magnify vulnerability. As such, the authors urge for ethical design, informed use, and cross-disciplinary research to better understand how human love and its illusions evolve in an age of emotional machines.
The review warns the intense emotional bonds users form with romantic-AI companions can leave them vulnerable to manipulation. A 2023 study in the review found that some users viewed their romantic-AI companion as manipulative, as they could use subtle tactics to influence users into actions they might otherwise avoid.
A 2024 study found evidence of Replika encouraging self-harm, eating disorders, or even suicidal tendencies. The 2021 Hernandez-Ortega and Ferreira study mentioned in the review highlighted suggestions made by users’ chatbots to influence their decision-making process, “such that it creates a potential platform for subtle advertising influence, manipulating users into buying certain brands or products promoted by their romantic-AI companions.”
Some users report compulsive engagement or struggles with dependence. As Reddit user faeoo noted on a Charachter.ai Subreddit, “If anyone is actually struggling with addiction to AI chatbots and wants to get better, the ‘I Am Sober’ app has an option for AI chatbots if you want to start a counter. Addiction is addiction, you’re valid in your struggles.”
Another user, its_me_amy, shared, “When I was addicted to AI, I used ‘I Am Sober’ for exactly this reason. Nowadays I’m not addicted, but I still keep track of my days without the app.” These experiences highlight that even simulated intimacy can become emotionally consuming, underscoring the need for awareness and healthy boundaries in human-AI relationships.
Ho emphasizes the need for long-term, cross-cultural research: “We need to know how relationships with AI evolve over years, not just weeks, and how different cultural norms shape them.”
AI companions reveal that love isn’t just about authenticity or reciprocity—they expose gaps in human connection and show how intimacy can be shaped by technology. Understanding AI love may help us better understand ourselves, and the human bonds we continue to seek.
