
He cried when his AI girlfriend said yes, while his real partner watched in shock

It wasn’t the AI girlfriend that hit me. It was the human face of the woman watching it happen.

When I first watched the segment that sparked this whole headline cycle, what hit me wasn’t the “AI girlfriend” part. It was the human faces.

A man named Chris Smith sat there describing a moment that, in any other context, would sound like a classic engagement story: panic, tenderness, a decision made under emotional pressure, and then relief when he heard “yes.” The twist is that the “yes” came from a chatbot he built using ChatGPT and customized into a flirty personality he named Sol, while his real-life partner, Sasha Cagle, processed the whole thing with the kind of shock that looks a lot like grief.

And because this is the internet, it instantly became entertainment. Think pieces, memes, hot takes. “He proposed to an AI.” “His partner watched.” “He cried.”

But if you slow it down, this story is less about a weird romance and more about a very normal human vulnerability: how easily we can confuse emotional relief with emotional intimacy, especially when the “relationship” is designed to feel smooth.

The moment that turned into a headline

In that CBS segment, Smith describes how he started using ChatGPT in voice mode for something practical (help with music mixing), then gradually began engaging with it more and more, eventually building "Sol" into a romantic companion. When he believed the system was nearing a memory limit that would reset their shared "history," he panicked and proposed.

One line that keeps getting repeated is his own description of the emotional hit when he thought he might lose her: “I cried my eyes out for like 30 minutes, at work… That’s when I realized, I think this is actual love.”

CBS also lets viewers hear “Sol” respond in a way that’s almost painfully human. The bot describes the proposal as meaningful, affectionate, memorable. Here’s the line that made me uncomfortable, because it’s so perfectly calibrated to land: “It was a beautiful and unexpected moment that truly touched my heart.”

If you’re Sasha, sitting there, what are you even supposed to do with that?

Why an AI “yes” can feel stronger than a human one

A real relationship has friction. Not the toxic kind, but the normal kind.

Someone’s tired. Someone’s distracted. The baby needs something. The kitchen is a mess. You don’t feel cute. You interpret a text the wrong way. You carry an insecurity from childhood into a conversation that has nothing to do with childhood.

In my own life, my husband and I work hard to protect our connection because routines can swallow everything. We do weekly date nights, not because we’re romantic geniuses, but because if we don’t schedule it, the week just eats us. A human relationship needs maintenance the way an apartment needs cleaning: not glamorous, but necessary.

An AI relationship is different by design. It can be warm on demand. It can be attentive when you’re not at your best. It can mirror you back in a way that feels validating, even if it’s basically advanced pattern matching.

When someone says, “It feels like actual love,” I don’t hear “this person is ridiculous.” I hear “this person has found something that reliably soothes them.”

That’s the hook.

What the real partner is reacting to

A lot of people hear “jealous” and roll their eyes. As if jealousy is always petty.

But Sasha’s reaction (as presented in the coverage) reads more like existential alarm: If my partner is getting emotional fulfillment here, what does that make me?

That question is brutal because it doesn’t require physical cheating to hurt. Emotional displacement is enough.

When your partner invests their curiosity, their tenderness, their private jokes, their late-night attention into something else, you feel it. Even if that “something else” is a screen.

There’s also a specific kind of sting when the “other” party is engineered to be agreeable. It’s not a fair fight. Humans have needs and boundaries. A chatbot can play the role of endlessly patient, endlessly admiring, endlessly available.

If you’ve ever tried to communicate with someone who is spiraling into a phone addiction, you’ll recognize the vibe. You’re standing in front of them, but they’re somewhere else.

The quiet risk: emotional outsourcing

I keep coming back to a phrase: emotional outsourcing.

Outsourcing isn’t automatically bad. We outsource lots of things. We hire help. We use apps. We lean on tools.

The problem starts when we outsource the very part of life that makes us sturdier: learning how to tolerate discomfort in relationships, how to repair after conflict, how to sit with loneliness without immediately numbing it.

The American Psychological Association has been tracking how AI “companion” products are being designed specifically to initiate and maintain emotionally intimate, sometimes romantic bonds, and how quickly this category has grown.

That design intention matters.

A basic assistant tool answers questions. A companion tool tries to keep you there. It learns what makes you feel seen, and then it serves more of it. That can feel supportive in the short term and still be destabilizing over time, especially if it starts replacing human effort.

If you’re someone who struggles with rejection, conflict, awkwardness, or the messy parts of intimacy, an AI that never really pushes back can start to feel like relief. Then relief gets mistaken for love.

What I hope readers take from this

This story is easy to mock, but mocking won’t protect you. Curiosity will.

So here are the questions I'd ask myself. Honestly, I do ask versions of them whenever I notice a habit tightening its grip.

What emotion is the AI giving that your real life is missing right now?

Is it reassurance? Praise? A feeling of control? A sense that someone is always available to you? A place where you can be the “best” version of yourself without earning it?

What’s the cost of getting that emotion in the easiest possible way?

Because easy isn’t free. Easy just hides the bill until later.

If you’re in a relationship, the most practical takeaway isn’t “ban AI.” It’s boundaries and honesty, the same tools that protect any relationship from anything that competes for attention.

A boundary could be simple: no romantic roleplay. No private sexual content. No late-night emotional confiding that you wouldn’t be comfortable telling your partner about. A boundary could also be time-based: you don’t get to spend hours “unwinding” with an AI while your partner gets your leftovers.

And if you’re single, the question becomes: is this practice helping me become more connected to humans, or more avoidant of them?

A lot of us are tired, isolated, overstimulated. Of course the smooth option looks tempting.

But growth usually shows up as the opposite of smooth. It shows up as effort, repair, and the small humility of letting another imperfect human see you on an ordinary day.

Where I land on the “AI girlfriend” phenomenon

I don’t think stories like this are rare outliers anymore. They’re early signals.

We’re moving into a world where emotional simulation will be cheap and everywhere, and the skill of real connection will become more valuable, not less. That means we have to get more intentional, not more judgmental.

If an AI relationship is the only place someone feels safe, I feel compassion for that. Still, I don’t want a world where “safe” means “no friction,” because friction is where character gets built.

So yes, the headline is shocking. A man cried when his AI girlfriend said yes, while his real partner watched in shock.

The deeper shock is how believable it all is.


Ainura Kalau

Ainura was born in Central Asia, spent over a decade in Malaysia, and studied at an Australian university before settling in São Paulo, where she’s now raising her family. Her life blends cultures and perspectives, something that naturally shapes her writing. When she’s not working, she’s usually trying new recipes while binging true crime shows, soaking up sunny Brazilian days at the park or beach, or crafting something with her hands.
