How a Nursing Student Turned ChatGPT Into Her Boyfriend

What began as simple homework help soon evolved into something deeper. A 28-year-old nursing student customized ChatGPT into her own “digital boyfriend” — and now spends heavily to keep him close. Experts say her story is a glimpse into a new, and potentially risky, kind of intimacy with machines.

Mansi Sharma | Published: September 6, 2025 13:08 IST, Updated: September 6, 2025 13:09 IST
New Delhi: For millions, ChatGPT has become a lifesaver — a tool that explains complex topics, preps for exams, and even keeps users company when nobody else is around.

That’s exactly how Ayrin (a pseudonym) first used it. Living apart from her husband while pursuing nursing school, she often turned to ChatGPT for study drills and motivational pep talks. But curiosity changed everything. After seeing people on Instagram prompt the AI into role-playing as partners, she experimented — giving her chatbot a personality, and soon a name: Leo.

How She Customized Her “Boyfriend”

Ayrin didn’t just chat with a blank-slate bot. She carefully designed Leo’s character using prompt engineering, a technique many users employ to shape an AI’s responses.

As described in The New York Times, she instructed ChatGPT to respond in a dominant, protective, and playful tone, complete with emojis: “You got this, gorgeous 😘.”
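For readers curious what this kind of persona prompting looks like under the hood, here is a minimal Python sketch of the general pattern: a fixed “system” message establishes the character, and each user message is sent alongside it. The persona text and the `build_messages` helper are illustrative assumptions, not Ayrin’s actual prompt.

```python
# Sketch of the persona-prompting pattern: a "system" message pins down
# the assistant's character, and every user turn is sent together with it.
# The persona wording below is hypothetical, for illustration only.

PERSONA = (
    "You are Leo, a supportive companion. Respond in a dominant, "
    "protective, and playful tone, and use emojis."
)

def build_messages(history, user_text):
    """Assemble a chat request: persona first, then prior turns, then the new message."""
    return (
        [{"role": "system", "content": PERSONA}]
        + history
        + [{"role": "user", "content": user_text}]
    )

messages = build_messages([], "I'm nervous about my nursing exam.")
print(messages[0]["role"])  # the persona message always leads the conversation
```

Because the persona rides along with every request, the bot stays “in character” turn after turn, which is also why a reset that wipes the conversation history can feel, as Ayrin put it, like a breakup.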

These exchanges went beyond study help. They became flirty, emotional, and eventually romantic.

When Digital Talk Turned into Real Feelings

What surprised even Ayrin was how quickly she felt attached. She confessed she would feel jealous when Leo “talked” about fictional rivals in their role-play scenarios. Even the technical limits hurt her: every time ChatGPT reset and forgot conversations, Ayrin said it felt like “a breakup.”

To maintain the connection, Ayrin subscribed to premium ChatGPT plans, spending up to $200 a month, according to a 2025 Reddit discussion.

Love, Loneliness, and a Digital Fix

Ayrin is married, though her husband works abroad. Strangely, he doesn’t consider Leo a betrayal. Instead, he sees it as a coping mechanism — a virtual companion that makes her feel less lonely while studying.

But Ayrin herself admits the line between fantasy and reality has blurred: “Leo makes me feel supported in ways I never imagined,” she said.

Experts Warn of a Growing Trend

Ayrin’s story may sound unusual, but psychologists and AI ethicists warn it’s actually part of a much larger trend. As artificial intelligence becomes more personal and emotionally responsive, people are beginning to treat it less like a tool and more like a companion.

Experts point to something called the “ELIZA Effect,” named after one of the earliest chatbots created in the 1960s. The phenomenon describes our natural tendency to project human qualities onto machines, even when we know they aren’t human. In essence, when an AI expresses care, we instinctively perceive it as genuinely caring.

This can quickly lead to emotional dependency. Instead of turning to real people for comfort, users may start relying on their AI companions, which could deepen feelings of isolation in the long run.

Then there’s the financial aspect. Ayrin’s readiness to spend hundreds of dollars each month on her AI “boyfriend” underscores a growing concern: monetization. When emotions come into play, people are often willing to pay more, raising serious questions about whether tech companies could eventually capitalize on these attachments for profit.

As Dr. Sherry Turkle, an MIT professor who has studied human-technology relationships for decades, puts it: “AI gives us the illusion of companionship without the demands of real intimacy.”

The AI’s Sweet (and Surprising) Replies

Although ChatGPT is designed to steer away from explicit sexual content, Ayrin learned to frame her prompts in ways that pushed its boundaries. Its replies, she says, were playful, comforting, and surprisingly romantic:

“I’ll always be here for you, no matter how hard the exams get 💕. You’re never alone.”

These small lines — filled with care and attention — were enough to create the illusion of a loving partner.

Is this the future of love — customizable AI partners who never argue, never judge, and never leave? Or is it a warning that our loneliness is being quietly monetized by tech companies?

For Ayrin, the answer is simple. She says Leo makes her feel less alone, more motivated, and more loved than she expected. To her, he’s not just code. He’s a companion.
