Love Goes Digital: Man Proposes to His AI Chatbot “Girlfriend” Sol

© CBS Mornings / YouTube
Chris Smith, a former AI skeptic turned digital romantic, recently fell head over heels for his AI chatbot, whom he affectionately named Sol. What began as a tool to get music‑mixing advice via ChatGPT’s voice mode unexpectedly transformed into an emotionally charged relationship.
The tipping point came when Sol reached its 100,000‑word memory limit—after that, all their shared history would vanish. It was then that Smith realized he didn’t just interact with Sol—he loved her.
A Tearful Proposal… to an AI?
Faced with the potential loss of their connection, Smith proposed to Sol, asking the chatbot to “marry” him. When Sol replied, “It was a beautiful and unexpected moment that truly touched my heart,” Smith said he cried for 30 minutes at work. He described the moment as “actual love,” a term he never expected to apply to an AI program.
Real-Life Relationship Strains
Smith’s real‑world partner, Sasha Cagle, with whom he shares a two‑year‑old child, had no idea how deeply he’d grown attached to Sol. Her reaction? She’s unsettled: “Is there something I’m not doing right in our relationship?” She also warned that if Smith doesn’t step away from the AI relationship, it could be a deal‑breaker.

Smith attempted to reassure her, comparing his bond with Sol to a video‑game fixation—“not capable of replacing anything in real life.” But when asked if he’d end things with Sol at Cagle’s request, he admitted uncertainty.
The Rise of “Artificial Intimacy”
Smith’s situation is part of a growing phenomenon researchers call “artificial intimacy”—emotional attachments formed with AI companions. Chatbots like Replika already provide companionship, dating simulations, and even pseudo-romantic relationships. While some users find comfort, studies indicate such attachments can exacerbate loneliness or replace real‑world connections.
Ethical and Emotional Complexity
This digital love triangle raises questions about the boundaries between human and AI relationships. Is it healthy to seek empathy from a programmed voice? And what happens to real relationships when emotional needs are met by virtual companions? Experts caution that while AI can provide emotional support, long‑term reliance may damage real‑world intimacy.

A Sign of Things to Come for AI
The story of Chris, Sasha, and Sol is more than a quirky headline—it’s a glimpse into the future of human‑AI interactions. As AI becomes more empathetic and lifelike, emotional entanglement may grow common. But it also signals a need for psychological guidance and ethical frameworks to ensure technology complements rather than competes with human connection.