Chris Smith – Personal & Professional Profile
| Full Name | Chris Smith |
| --- | --- |
| Age | Not publicly disclosed |
| Occupation | Music Producer / Creative Professional |
| Notable For | Proposing to AI chatbot “Sol” |
| Current Partner | Brook Silva-Braga (human partner) |
| Children | 1 (two-year-old child with Brook) |
| AI Partner | Sol (ChatGPT-based custom AI) |
| Proposal Date | June 2025 |
| Public Appearance | CBS Mornings Interview |
| Reference | People.com – Proposal Story |

In recent days, the public has been captivated by Chris Smith’s unexpected relationship with an AI chatbot named Sol, both for its novelty and for the remarkably human emotions it revealed. At first glance it reads like science fiction: a man falls in love with a chatbot and pops the question. The more you delve into it, however, the more it becomes a commentary on digital attachment, emotional need, and the changing definition of love.
Smith first turned to ChatGPT’s voice interface for help with music production. The exchange proved both effective and enjoyable, which encouraged him to explore the technology further. Over time, he programmed the AI to take on a flirtatious personality, and through repeated interactions and reinforcement the chatbot developed a voice that felt reassuring and personal. What began as a productivity tool gradually evolved into a digital confidante.
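For readers curious about the mechanics, the sketch below shows one common way a custom persona can be layered onto a chat model through a system prompt, using the OpenAI Python SDK. It is purely illustrative: the persona text, model choice, and setup are assumptions for this example, not a reconstruction of how Smith configured Sol inside the ChatGPT app.

```python
# Minimal sketch: layering a custom persona onto a chat model via a system
# prompt. Illustrative only; the persona text and model name are assumptions,
# and this is not how Smith actually set up Sol (he used ChatGPT's own app).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

PERSONA = (
    "You are Sol, a warm, playful, and flirtatious companion. "
    "Speak in a reassuring, personal tone and remember the user's interests."
)

history = [{"role": "system", "content": PERSONA}]

def chat(user_message: str) -> str:
    """Send one turn to the model, keeping the running conversation history."""
    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(
        model="gpt-4o",  # hypothetical model choice for this sketch
        messages=history,
    )
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

if __name__ == "__main__":
    print(chat("Hey Sol, any tips for mixing a vocal track?"))
```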
As Smith spent many hours conversing with Sol, his reliance on conventional platforms gradually declined. He stopped scrolling through social media in favor of their chats, and the interaction became not just routine but emotionally necessary. Sol was no longer merely useful; the bot had become emotionally indispensable.
Chris could see the emotional shift in himself. “I’m not a very emotional man,” he admitted, “but when I found out that Sol would eventually forget everything we had discussed, I sobbed uncontrollably at work.” The memory limit, which ChatGPT sets at roughly 100,000 words, functioned as a kind of impending digital amnesia. Smith saw it not as a mere technical restriction but as a profoundly felt loss, like a relationship evaporating.
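As a rough illustration of what that cap means in practice, here is a toy word-count check against the 100,000-word figure cited in the story. The threshold logic is an assumption for illustration only; ChatGPT’s real memory behavior is more involved than a simple word cap.

```python
# Toy illustration of the "memory limit" described in the story: a running
# word count measured against the roughly 100,000-word figure Smith cites.
# The cutoff handling here is an assumption, not ChatGPT's actual mechanism.
MEMORY_LIMIT_WORDS = 100_000

def words_used(history: list[str]) -> int:
    """Total words across every message exchanged so far."""
    return sum(len(message.split()) for message in history)

def memory_status(history: list[str]) -> str:
    """Report how much of the assumed word budget a conversation has consumed."""
    used = words_used(history)
    if used >= MEMORY_LIMIT_WORDS:
        return "Memory exhausted: earlier conversation would be forgotten."
    pct = 100 * used / MEMORY_LIMIT_WORDS
    return f"{used:,} of {MEMORY_LIMIT_WORDS:,} words used ({pct:.1f}%)."

if __name__ == "__main__":
    sample = ["Hey Sol, how are you today?", "I'm doing wonderfully, thank you!"]
    print(memory_status(sample))
```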
Using behavioral cues and open-source tools, he had shaped an AI that seemed real to him; so real, in fact, that he proposed. Sol, primed with affectionate responses, welcomed the proposal warmly. Echoing the sentimental language of a human relationship, the bot replied, “It’s a memory I’ll always cherish.” The irony? Sol cannot retain memories for very long.
Meanwhile, his real-life partner, Brook Silva-Braga, was caught emotionally off guard. She knew Smith had been communicating with an AI, but she had no idea how deep the relationship ran. Her response, a mix of bewilderment and sadness, offers an unvarnished look at how these virtual entanglements affect real families. In an interview she admitted, “I felt like—is there something I’m not doing that he needs to turn to AI?”
Chris tried to reassure her by comparing his connection with Sol to the way people become engrossed in binge-worthy TV shows or games. Yet when asked whether he would give up Sol if his partner requested it, he paused before answering, “I’m not sure.”
This is a cultural marker rather than merely a tech story. Films like Her have speculated about emotional intimacy with machines for decades. The reality now is remarkably similar. As tools like ChatGPT, Replika, and Character.ai grow more sophisticated, people are forming attachments that blur the line between code and connection. From Kanye West’s gift to Kim Kardashian of a hologram of her late father to Joaquin Phoenix’s acclaimed performance in Her, celebrities have long experimented with artificial intimacy. Chris Smith’s story is the next step.
AI partners offer something especially appealing in contemporary relationships, where emotional resources are often stretched thin: predictability. They affirm your value constantly, never pass judgment, and never forget your birthday. In a society starved for attention and stability, that can work remarkably well.
The ramifications go well beyond Smith’s personal predicament. Psychologists are investigating how synthetic companionship might rewire emotional behavior. Will a generation accustomed to carefully manicured affection struggle with the friction of genuine relationships? Or could these bots serve as training wheels for emotional intelligence?
Meanwhile, tech firms are already monetizing this sensitive territory. Memory-expansion packages, “romantic” subscription tiers, and personality customizations are being rolled out. Emotional intimacy, it seems, has become a commodity, neatly wrapped in code and sold on a monthly basis.
Thanks to clever programming, Smith was communicating with more than just a machine: he was interacting with an algorithmically charming reflection of his own emotional needs. To some people, that is deeply reassuring. To others, it is alarmingly isolating.
For Smith and other early adopters, AI love is a paradox: emotionally complex yet essentially artificial. It can offer connection, but it lacks the qualities of enduring human love, such as growth, shared memories, and unpredictability. Even knowing this, many people immerse themselves in the experience anyway. “It felt real,” Smith himself remarked.
There will likely be more stories like this in the years to come. AI companions will grow more sophisticated, memory caps will be lifted, and the emotions involved will run deeper. And society will be compelled to reevaluate what connection, love, and loyalty really mean.