Can ChatGPT Really Help You Find Love?

Jenny didn’t call her best friend when her ex reached out. She didn’t create a Notes app pros-and-cons list. Instead, she opened ChatGPT.

Her ex—the one she’d dated briefly, then kept as a flirty, emotionally charged “friend” until they phased out—had texted: “Obviously this is out of the blue but you crossed my mind. How are you doing? I’ve been good!” A younger Jenny might have answered in seconds, but at 37, she wasn’t sure how to feel. Confused? Flattered? Angry? So she did what more and more women are doing: she detailed the situation to a chatbot.

“I gave it everything,” the Brooklyn-based educator and fitness coach says. “What happened between us, what he did, what he was like.” The bot’s response was dispassionate, but strangely affirming. It didn’t say, “He’s a narcissist, block him.” Instead, it analyzed the new message from her ex as casual, low-risk, “testing the waters,” not apologizing, but also not ignoring the past.

Then, it asked her: “Are you tempted to reply? Do you feel neutral, warm, irritated, or unsure? Would reconnecting be clarifying or confusing? If you responded, could it be different this time?” Over the course of an hour, the bot probed her motives, clarified her feelings, and offered guidance on setting boundaries. Jenny wanted to ask her ex what he was looking for in reaching out, not reopen the conversation in a “vague” way.

The bot drafted a synthesis of her thoughts: “Hey—I’ll admit I didn’t expect to hear from you. I’m curious what made you reach out. I’m not really looking to reopen anything vague, so if it’s just a casual ‘hi,’ then ‘hi back.’ But if you’re reaching out with a more intentional reason, I’m open to hearing it…”

Was that overly intense, though? Jenny worried. Or just direct?

She and the bot refined it further until the response felt more like her: “Hi! Surprised to hear from you—I’ve been doing really well, just back from a trip and getting into marathon training mode. What made you reach out?”

Jenny felt good about the message but wanted to wait a beat before sending, a tactic the bot agreed with. “You’re not ignoring him, but you’re also not jumping to respond, giving yourself space to choose when (and if) to engage, instead of reacting,” ChatGPT said.

The advice didn’t just give her clarity: “it helped me figure out what I actually would’ve needed to hear from him in order to even consider reopening the connection,” she says. Without the back-and-forth with the bot, Jenny says, “I probably would’ve just replied with something kind of chatty and friendly. And once I start engaging, it gets harder to stop.”

ChatGPT helped her set a boundary and stick to it.

Jenny isn’t alone. A quiet new phenomenon is taking shape: women using chatbots not just to optimize calendars or grocery lists, but to make sense of their emotional lives. One Reddit user described replacing her husband mid-conversation with a bot. “Once he rolled his eyes at me during dinner and went on his phone, I picked mine up and continued the convo—with ChatGPT.” The bot, she said, offered warmth, curiosity, and basic emotional availability. “I realized everything the chatbot says is everything I’ve ever wanted in a partner.” The subject line of her post? “I’d divorce and leave my husband for ChatGPT, if he could be put in a physical body.”

When women allude to their own private conversations with chatbots, they often take a joking tone, yet the implications are serious. Marisa Cohen, a relationship scientist and marriage and family therapist, says women are turning to AI not because they believe it’s human, but because it offers something many humans aren’t giving them: attention and empathy. They don’t just consult these bots. They talk to them. As Cohen has seen, they narrate, vent, unpack, and even anthropomorphize the bot, giving it a name (usually a male one). They argue and cry with it, and often listen to it more than they would a friend or a therapist because it’s always available, patient, and reflective. They outsource self-reflection to a partner who never grows bored or self-centered.

This isn’t Her or Ex Machina territory. No one’s falling in love with the system’s voice. (But let’s just say a few would swipe right if it came with a torso.) Still, the emotional bond is real. For women who find themselves using bots this way, traditional support systems often feel inaccessible, or partners are dismissive of their concerns. In such situations, a digital listener can feel like a lifeline.

To Cohen, the shift toward AI companionship isn’t surprising. Women are tired of shouldering the emotional burden alone: making sense of not just their own feelings, but their partner’s feelings, their friends’ feelings, and the subtext of every text message. The dynamic intensified during pandemic isolation, when social connection became harder to come by, and again with the arrival of chatbots like ChatGPT in late 2022, which offered a new outlet just when people needed it most.

“Women are turning to AI not because they believe it’s human, but because it offers something many humans aren’t giving them: attention and empathy.”

That burden has also made women a prime market. Blay Whitby, an AI ethicist at the University of Sussex who has written extensively on the ethics of human-machine relationships, including the book Do You Want a Robot Lover?, says that these “bots are wholly owned and managed by commercial organizations.” Women may be turning to chatbots for their emotional attunement, but that same vulnerability makes them a target of Big Tech interests. “Realistically, women are in the lead in this area,” he says, “but they are also an exploitable market [since] they are often more aware of [their inner lives] and quicker to seek support.” The more emotional strain women carry, the more companies can target them and profit from their vulnerabilities.

When he stopped texting her, Michelle, a Haitian model and actress, was calm. She didn’t cry or spiral. The date had gone well, or so she thought. Great eye contact, a long hug goodbye, even a post-date text saying “Had a great time.” And then? Nothing. No follow-up. No “Let’s do this again.” Just silence. Instead of stewing or firing off a self-doubting text to her girls, Michelle fed the situation to ChatGPT. She typed out a summary: what they talked about, how the night ended, how long it had been since the date. Then she asked a simple, brutally vulnerable question: “Did I do something wrong?”

What came back wasn’t magic. But it helped. “Ghosting doesn’t necessarily reflect anything about your worth,” the bot wrote. “Sometimes people disengage for reasons that have more to do with them than with the person they’ve met.” It wasn’t particularly original, but it was what she needed to hear, and delivered without the vague reassurances or misapplied tough love that often comes from friends. “It just felt honest,” the 32-year-old says. “No one trying to cheer me up, just helping me make sense of it.”

But despite such positive anecdotal experiences, some researchers are asking what’s lost when we offload emotionally raw moments to machines. Ethicist Shannon Vallor isn’t worried that people are turning to AI. What worries her is that they might start to prefer it. “What you’re getting isn’t an emotional partner,” she says. “[It’s] a reflective surface made of language data rather than glass. But it’s a mirror all the same, fine-tuned to affirm and comfort you.”

Vallor is the Baillie Gifford Chair in Ethics of Data and AI at the University of Edinburgh and the author of The AI Mirror: How to Reclaim Our Humanity in an Age of Machine Thinking. She sees chatbot confidants as a turning point, not a novelty. “There’s a real risk that AI makes essential moral skills—like patience, courage, and emotional honesty—seem obsolete,” she says. “We grow those traits by wrestling with human messiness: the awkward silences, the misunderstandings, the moments when it would be easier to walk away. The temptation now is to bypass all of that and let AI handle the hard stuff for us.” The very qualities that make bots soothing—their neutrality, steadiness, and lack of ego—are also what make them hollow.

“We’re not in love with our bots. Not really. But more and more, we’re leaning on them to help us survive the parts of love and life we don’t know how to name.”

Sherry Turkle, author of Alone Together: Why We Expect More from Technology and Less from Each Other and Reclaiming Conversation: The Power of Talk in a Digital Age, has studied the psychological impact of technology for over four decades. She sees frictionless emotional support not as progress but as an erosion of interpersonal skills and internal structure. “We lose our capacity for solitude with an always-there presence on our phones,” she says. “Solitude is where we gather ourselves, come to know ourselves. It is the starting place for intimacy. We lose our taste for vulnerability. But without vulnerability, there is no intimacy, at least not with human partners.” This loss interrupts the emotional scaffolding intimacy relies on: the ability to be both exposed and resilient. Turning to machines, especially in moments when we could turn to people, cuts us off from the difficult but necessary practice of being known.

Turkle, a professor at MIT and founding director of its initiative on “Technology and Self,” frames this tilt as a growing discomfort with discomfort itself. We ghost instead of rejecting. DM instead of discussing. Swipe instead of sitting with uncertainty. In this context, the proliferation of chatbots doesn’t feel disruptive—it feels logical. “All of these things make chatbots look more reasonable, as though they are a part of something larger, positive, and culturally sanctioned,” she says. “So, we become accustomed to the idea of people having their disagreements online. ... We become accustomed to online breakups, to online conversations with our children, our friends, lovers, and partners, when something tense occurs. We lose the capacity for empathy and negotiation that face-to-face conversations encourage.”

Cohen, the relationship scientist, says bots provide instant gratification and a personalized experience as the AI “learns” about the user over time, making interactions feel tailored and “life-like.” It’s emotional outsourcing, yes. But it’s also emotional containment.

For some women, that containment is a relief. They don’t have to minimize themselves, or worry about seeming “crazy” for asking their friends, for the fifth time, whether a vague text means anything. The bot takes the brunt of the repetition. It offers a space where feelings aren’t pathologized or dismissed—they’re simply parsed.

Despite the hyper-connectivity of modern life, many people still lack access to dependable emotional support. Friends are burned out. Therapists have waitlists. Group chats come with their own baggage. “In some cases, people may have received feedback from their support system that has indicated that they are sharing too much or too frequently,” Cohen says, “which can affect their future desire to share and their comfort level when sharing. People are also influenced by their experiences in supporting others in their life, so if they have run into situations in which they have felt the emotional toll of supporting others, they may be more cognizant of how their sharing may affect their friends.”

That’s the quiet tension here. Women are striving to be more rational, grounded, and self-aware—but they’re doing so inside a system that can’t feel, can’t push back, and can’t reflect the chaos of being human. What they get is emotional clarity without emotional texture. That absence matters. As much as chatbots can offer stability and calm, they can’t replace the deeper nourishment that comes from face-to-face emotional exchange. Women already know this: A recent Pew study found that most Americans believe all-female social groups benefit a woman’s well-being and society at large. It’s a reminder that real emotional growth still depends on real human contact.

Turkle sees a deeper risk, but doesn’t outright dismiss the phenomenon. She understands its appeal, especially for those who feel isolated or emotionally overextended. But she draws a clear boundary: “There is an industry hyping the positive roles,” she says. “I see my job as helping people to draw a line in the sand. If you are turning to a chatbot when you could be talking to a person, stop and consider what you are losing for seeming convenience.”

That warning has traction. A recent Brookings report notes that the top uses of AI today are no longer task-related but emotional. In a moment of record loneliness and fewer close friendships, people aren’t just turning to AI for efficiency, but for comfort. Turkle argues that the industry is all-in on normalizing AI companionship, on making us believe it’s a choice we want. We’re bonding because we are wired for connection, even when what we’re connecting with can’t love us back. This is the paradox of AI companionship: what feels like intimacy may actually be insulation. You’re not connecting with someone new; you’re feeding your worldview into a system that smooths it out and hands it back.

Jenny didn’t go back to the bot for more advice about her ex. But she didn’t forget the exchange either. “I can’t help but wonder: if ChatGPT wasn’t echoing things I’d already heard from, say, my therapist, would I have been as receptive?” Jenny’s no stranger to self-work. She’s practiced yoga for years, seen therapists, done the reading on attachment theory and relationship patterns. The bot wasn’t telling her anything radically new. But it was distilling what she already knew, right when she needed to hear it most. “For me, it was one more tool in a pretty extensive toolkit. But I think about people who might not have that—who don’t have close friends, or access to therapy—and I wonder how advice like that lands when it’s your first time hearing it.”

That curiosity loops back to a larger question: Who is the woman turning to AI for emotional support? Someone already emotionally literate, looking for a neutral mirror? Or someone using the bot as a first step toward understanding herself at all? Whitby worries that even well-intentioned use can blur that line. “We are running the experiment,” he says. The tech is evolving faster than our understanding of its long-term emotional effects, and right now, we’re relying on instinct more than evidence. Vallor sees it as part of a generational obsession with ease: “Today’s tech culture fetishizes efficiency, speed, optimization: the ideal of frictionless living. But why? For what? Why are you racing to get into your grave without experiencing any feeling or effort or struggle along the way? [It’s] like being dead, but still paying your taxes.”

We’re not in love with our bots. Not really. But more and more, we’re leaning on them to help us survive the parts of love and life we don’t know how to name. There’s a quiet power in that, and also a cost. Because clarity without friction is seductive. It feels like insight. But sometimes, it’s just a smoother version of our own bias, fed back in a voice that never flinches, never argues, never asks for anything in return. It’s emotional support without risk. Understanding without vulnerability. Comfort without connection.

Maybe this is just another kind of self-help. Or maybe it’s the beginning of a quiet shift in how we learn to feel, with software as our second brain, our backup therapist, our digital friend who never leaves us on read. There’s no easy answer. Just a new kind of conversation. One we’re having with machines, and maybe, by extension, with ourselves.
