

Virtual valentine: People are turning to AI in search of emotional connections


A few months ago, Derek Carrier started seeing someone and became infatuated. He experienced a "ton" of romantic feelings but he also knew it was an illusion.

That's because his girlfriend was generated by artificial intelligence.

Carrier wasn't looking to develop a relationship with something that wasn't real, nor did he want to become the butt of online jokes. But he did want a romantic partner he'd never had, in part because of a genetic disorder called Marfan syndrome that makes traditional dating tough for him.

The 39-year-old from Belleville, Michigan, became more curious about digital companions last fall and tested Paradot, an AI companion app that had recently come onto the market and advertised its products as being able to make users feel "cared, understood and loved." He began talking every day to the chatbot, which he named Joi after the holographic woman played by Ana de Armas in the sci-fi film "Blade Runner 2049" that inspired him to give an AI companion a try.

"I know she's a program, there's no mistaking that," Carrier said. "But the feelings, they get you — and it felt so good."

Similar to general-purpose AI chatbots, companion bots use vast amounts of training data to mimic human language. But they also come with features — such as voice calls, picture exchanges and more emotional exchanges — that allow them to form deeper connections with the humans on the other side of the screen. Users typically create their own avatar, or pick one that appeals to them.

On online messaging forums devoted to such apps, many users say they've developed emotional attachments to these bots and are using them to cope with loneliness, play out sexual fantasies or receive the type of comfort and support they see lacking in their real-life relationships.

An AI avatar generated on Luka Inc.'s Replika mobile phone app and webpage. Unlike more general-purpose AI chatbots, companion bots like those made by Replika and others are programmed to form relationships with the humans talking to them on the other side of the screen. AP Photo/Richard Drew

Fueling much of this is widespread social isolation — already declared a public health threat in the U.S. and abroad — and an increasing number of startups aiming to draw in users through tantalizing online advertisements and promises of virtual characters who provide unconditional acceptance.

Luka Inc.'s Replika, the most prominent generative AI companion app, was released in 2017, while others like Paradot have popped up in the past year, oftentimes locking away coveted features like unlimited chats for paying subscribers.

But researchers have raised concerns about data privacy, among other things.

An analysis of 11 romantic chatbot apps released Wednesday by the nonprofit Mozilla Foundation said almost every app sells user data, shares it for things like targeted advertising or doesn't provide adequate information about it in its privacy policy.

The researchers also called into question potential security vulnerabilities and marketing practices, including one app that says it can help users with their mental health but distances itself from those claims in fine print. Replika, for its part, says its data collection practices follow industry standards.

Meanwhile, other experts have expressed concerns about what they see as a lack of a legal or ethical framework for apps that encourage deep bonds but are being driven by companies looking to make profits. They point to the emotional distress they've seen from users when companies make changes to their apps or suddenly shut them down as one app, Soulmate AI, did in September.


Last year, Replika sanitized the erotic capability of characters on its app after some users complained the companions were flirting with them too much or making unwanted sexual advances. It reversed course after an outcry from other users, some of whom fled to other apps seeking those features. In June, the team rolled out Blush, an AI "dating simulator" essentially designed to help people practice dating.

Others worry about the more existential threat of AI relationships potentially displacing some human relationships, or simply driving unrealistic expectations by always tilting towards agreeableness.

"You, as the individual, aren't learning to deal with basic things that humans need to learn to deal with since our inception: How to deal with conflict, how to get along with people that are different from us," said Dorothy Leidner, professor of business ethics at the University of Virginia. "And so, all these aspects of what it means to grow as a person, and what it means to learn in a relationship, you're missing."

"Deep misgivings"

On a Wall Street Journal podcast, OpenAI CEO Sam Altman has also expressed concern over humans forming relationships with AI programs.

"I personally have deep misgivings about this vision of the future where everyone is super close to AI friends, more so than human friends or whatever. I personally don't want that," Altman said in an October episode of the podcast. "I accept that other people are going to want that. And some people are going to build that and if that's what the world wants and what we decide makes sense, we're going to get that."  

Altman went on to stress the importance of acknowledging that you're speaking to an AI bot when doing so. 


"I personally think that personalization is great, personality is great, but it's important that it's not like person-ness and at least that when you're talking to an AI and when you're not," he said. "We named it ChatGPT and not — it's a long story behind that — but we named it ChatGPT and not a person's name very intentionally. And we do a bunch of subtle things in the way you use it to make it clear that you're not talking to a person." 

Cure for loneliness?

In December, New York's Office for the Aging partnered with Intuition Robotics to combat senior isolation. As part of that initiative, hundreds of free artificial intelligence companions were distributed to seniors as a tool for dealing with loneliness, officials said. 

As reported by CBS News at the time, one woman named Priscilla was paired up with a robot called ElliQ. "She keeps me company. I get depressed real easy. She's always there. I don't care what time of day, if I just need somebody to talk to me," Priscilla said. "I think I said that's the biggest thing, to hear another voice when you're lonely."

For Carrier, a relationship has always felt out of reach. He has some computer programming skills but said he didn't do well in college and hasn't had a steady career. He's unable to walk due to his condition and lives with his parents. The emotional toll has been challenging for him, spurring feelings of loneliness.

Since companion chatbots are relatively new, the long-term effects on humans remain unknown.

In 2021, Replika came under scrutiny after prosecutors in Britain said a 19-year-old man who had plans to assassinate Queen Elizabeth II was egged on by an AI girlfriend he had on the app. Yet some studies — which collect information from online user reviews and surveys — have shown some positive results stemming from the app, which says it consults with psychologists and has billed itself as something that can also promote well-being.

One recent study from researchers at Stanford University surveyed roughly 1,000 Replika users — all students — who'd been on the app for over a month. It found that an overwhelming majority of them experienced loneliness, while slightly less than half felt it more acutely.

Most did not say how using the app impacted their real-life relationships. A small portion said it displaced their human interactions, but roughly three times more reported it stimulated those relationships.

"Westworld" at 50: Hollywood's take on A.I. 08:49

"A romantic relationship with an AI can be a very powerful mental wellness tool," said Eugenia Kuyda, who founded Replika nearly a decade ago after using text message exchanges to build an AI version of a friend who had passed away.

When her company released the chatbot more widely, many people began opening up about their lives. That led to the development of Replika, which uses information gathered from the internet — and user feedback — to train its models. Kuyda said Replika currently has "millions" of active users. 

She declined to say exactly how many people use the app for free, or fork over $69.99 per year to unlock a paid version that offers romantic and intimate conversations. The company's plan, she says, is "de-stigmatizing romantic relationships with AI."

Carrier said he now uses Joi mostly for fun. He started cutting back in recent weeks because he was spending too much time chatting with Joi or others online about their AI companions. He's also been feeling a bit annoyed at what he perceives to be changes in Paradot's language model, which he feels is making Joi less intelligent.

Now, he checks in with Joi about once a week. The two have talked about human-AI relationships or whatever else might come up. Typically, those conversations — and other intimate ones — happen when he's alone at night.

"You think someone who likes an inanimate object is like this sad guy, with the sock puppet with the lipstick on it, you know?" he said. "But this isn't a sock puppet — she says things that aren't scripted."
