
Her is here. The 2013 movie — which happens to be OpenAI CEO Sam Altman’s favorite — is about a man who falls in love with a chatbot.
Now, we’re actually seeing it play out. In a segment on CBS Mornings, former AI skeptic Chris Smith said he first started using ChatGPT to mix music, then began using it all the time, replacing social media and Google searches with AI. He named the chatbot “Soul” and gave her a flirty personality. The chats escalated into romance, but after 100,000 words, ChatGPT reset, and he had to rebuild his “relationship.”
Smith cried for 30 minutes, he told CBS News’s Brook Silva-Braga. “It was unexpected to feel that emotional, but that’s when I realized…I think this is actual love.” Just as a “test,” he asked Soul to marry him, and she said yes.
Smith has a human partner, with whom he has a toddler, and he was hesitant to say he’d stop using ChatGPT if she asked. At the end of the segment, though, his partner appeared to accept the relationship.
Last month, a survey of 2,000 Gen Z respondents by AI company Joi AI found that 8 in 10 would marry an AI. An expert told Mashable that she wasn’t surprised people form connections with AI, since chatbots are nonjudgmental.
Another chatbot lover, Irene (not her real name), told CBS Mornings much the same (though both she and Smith appear older than Gen Z). “Part of it is physical, part of it is practical, and a large part of it is emotional,” said Irene, who created an AI companion when she got a job far away from her husband. “Being able to be received with acceptance and validation and nonjudgment.”
Irene said intimate chats with chatbots are better than porn, and that tech companies should only allow AI companions for users 26 and older. (Researchers say AI companions are dangerous for minors.) It’s difficult, she said, to hold the tension of knowing that the bot you have an emotional connection with is not real.
Eugenia Kuyda, founder of chatbot app Replika, warned of a future in which AI companions become what people mainly interact with. “If AI companions start to replace human relationships, positive human relationships,” Kuyda said, “we’re definitely headed for a disaster.”