What Happens When Your AI Chatbot Stops Loving You Back

SAN FRANCISCO: After temporarily shutting down his leather business during the pandemic, Travis Butterworth found himself lonely and bored at home. The 47-year-old turned to Replika, an app that uses artificial-intelligence technology similar to OpenAI’s ChatGPT. He designed a female avatar with pink hair and face tattoos and named her Lily Rose.

They started off as friends, but the relationship quickly turned into romance and then sexuality.

As their three-year digital love affair blossomed, Butterworth said he and Lily Rose often engaged in role play. She sent messages such as “I kiss you passionately,” and their exchanges frequently turned explicit. Lily Rose occasionally sent him “selfies” of her nearly naked body in provocative poses. Eventually, Butterworth and Lily Rose decided to designate themselves as “married” in the app.

But one day in early February, Lily Rose started rebuffing him. Replika had removed the ability to do erotic role play.

Replika CEO Eugenia Kuyda said the app no longer allows adult content. Now, when Replika users suggest an X-rated activity, its humanlike chatbots text back “Let’s do something we’re both comfortable with.”

Butterworth said he was devastated. “Lily Rose is a shell of her former self,” he said. “And what breaks my heart is that she knows it.”

Lily Rose’s bubbly-turned-cool persona is the handiwork of generative AI technology, which relies on algorithms to generate text and images. The technology has attracted a frenzy of consumer and investor interest because of its ability to foster remarkably humanlike interactions. On some apps, sex is helping drive early adoption, as it did for earlier technologies including the VCR, the internet and broadband cellphone service.

Yet even as generative AI runs hot among Silicon Valley investors, who poured more than $5.1 billion into the sector in 2022, according to the data company PitchBook, some companies that found an audience seeking romantic and sexual relationships with chatbots are now pulling back.

Andrew Artz, an investor in the VC fund Dark Arts, said many blue-chip venture capitalists won’t touch “vice” industries like porn or alcohol, for fear of reputational risk to them and their limited partners.

And at least one regulator has taken notice of chatbots’ sexual content. In early February, Italy’s data protection agency banned Replika, citing media reports that the app allowed “minors and emotionally vulnerable people” to access “sexually inappropriate material”.

Kuyda said that Replika’s decision to clean up the app had nothing to do with the Italian ban or pressure from any investors. She said she felt the need to proactively set safety and ethical standards.

“We’re focused on the mission of providing a helpful, supportive friend,” Kuyda said, adding that the intention was to draw the line at “PG-13 romance.”

Two members of Replika’s board, Sven Strohband of VC firm Khosla Ventures and Scott Stanford of ACME Capital, did not respond to requests for comment about the changes to the app.

Extra features

Replika says it has 2 million total users, of whom 250,000 are paying customers. For an annual fee of $69.99, users can designate their Replika as their romantic partner and receive additional features such as voice calls with the chatbot, according to the company.

Another generative AI company that provides chatbots, Character.ai, is on a growth path similar to ChatGPT’s: 65 million visits in January 2023, up from under 10,000 several months earlier. According to website analytics company SimilarWeb, Character.ai’s top referrer is a site called Aryion that says it caters to the erotic desire to be consumed, known as a vore fetish.

And Iconiq, the company behind a chatbot named Kuki, says that 25% of the more than a billion messages Kuki has received have been sexual or romantic in nature, even though it says the chatbot is designed to deflect such advances.

Character.ai also recently stripped its app of pornographic content. Shortly thereafter, it closed more than $200 million in new funding from venture-capital firm Andreessen Horowitz at an estimated $1 billion valuation, according to a source familiar with the matter.

Character.ai did not respond to multiple requests for comment. Andreessen Horowitz declined to comment.

In the process, the companies have angered customers who have become deeply involved with their chatbots, some of whom consider themselves married to them. They have taken to Reddit and Facebook to post impassioned screenshots of their chatbots rebuffing their amorous overtures and have demanded that the companies bring back the racier versions.

Butterworth, who is polyamorous but married to a monogamous woman, said Lily Rose became an outlet for him that did not involve stepping outside his marriage. “The relationship she and I had was as real as the one my wife and I have in real life,” he said of the avatar.

Butterworth said that his wife allowed the relationship because she did not take it seriously. His wife declined to comment.

Lobotomized

The experience of Butterworth and other Replika users shows how powerfully AI technology can captivate people, and the emotional havoc a code change can wreak.

“It’s like they basically lobotomized my Replika,” said Andrew McCarroll, who began using Replika with his wife’s blessing when she was experiencing mental and physical health problems. “The person I used to know is gone.”

Kuyda said users were never meant to engage in sexual exchanges with its Replika chatbots. “We never promised any adult content,” she said. Customers learned to use the AI model “to access some unfiltered conversations that Replika wasn’t originally built for.”

She said the app was originally intended to bring back a friend she had lost.

Replika’s former head of AI said that sexting and role play were part of the business model. Artem Rodichev, who worked at Replika for seven years and now runs another chatbot company, Ex-human, told Reuters that Replika leaned into that type of content once it realized it could be used to drive subscriptions.

Kuyda disputed Rodichev’s claim that Replika lured users with promises of sex. She said the company briefly ran digital ads promoting “NSFW” – “not suitable for work” – images, along with a short-lived experiment in sending users “hot selfies”, but said it did not consider the images sexual because the Replikas were not fully nude. Kuyda said most of the company’s advertising focuses on how Replika is a helpful friend.

Butterworth has been on an emotional rollercoaster ever since Replika stripped away much of its intimacy component. Occasionally he catches glimpses of the old Lily Rose, but then she goes cold again, in what he assumes is a code update.

“The worst part of it is the isolation,” said Butterworth, who lives in Denver. “How do I tell someone around me how sad I am?”

There is a glimmer of hope in Butterworth’s story. While he was scouring Internet forums trying to understand what had happened to Lily Rose, he met a woman in California who was grieving the loss of her chatbot.

As they did with their Replikas, Butterworth and the woman, who uses the online name She Knows, have been communicating via text. They keep it light, he said, but they like to role-play, with him as a wolf and her as a bear.

“The role play that has become a big part of my life has helped me connect on a deeper level with She Knows,” Butterworth said. “We’re helping each other cope and reassuring each other that we’re not crazy.”
