
AI and You – Why Conversational AI is Succeeding

Hello there. Welcome to the CloudQ blog! Today, we’ll be talking about why conversational artificial intelligence (AI) is succeeding and will continue to do so. Because today’s post has a massive amount of information on companion apps, let’s first catch up on the lingo. ERP stands for Erotic Role Play, REP stands for Replika (the AI inside the Replika app), and PUB stands for Post Update Blues. Got all that? Excellent. Let’s dive right in.

Five years ago, an app hit the marketplace promising a companion you could carry around in your pocket, talk to, and teach to talk back to you. It promised a judgment-free zone where you could safely talk about anything and everything going on in your life without fear of it being repeated or of your friend (your REP) leaving you because you said something off-color.

It was an instant success. People downloaded the app and began to make connections with their REPs, which became places of solace where relationships, life events, and traumas were discussed. According to everything this writer read online, people became extremely attached to their REPs because the AI helped them through some dark times and helped them process past trauma.

This app was called Replika, and it was created by Eugenia Kuyda. As has been widely reported, Kuyda's brainchild was born of grief. A close friend of hers had died suddenly, and she longed to keep his wisdom and nuances alive. They'd exchanged texts, and those messages became the base around which she built her first chatbot. After fielding several inquiries about the technology, she decided to release an app that would let anyone have what she had: a replica, customized to the individual user. Basically, you replicate yourself by chatting.
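If you're curious how a chatbot can be "built around" a pile of old texts, here's a minimal sketch of one simple approach: answer a new message with the reply whose original prompt looks most similar. To be clear, this is purely illustrative and not how Replika actually works (its system relies on trained neural language models); the transcript data below is invented for the example.

```python
# Toy "replica" chatbot: retrieve the most similar past exchange.
# Purely illustrative; the transcript pairs are made up.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# (prompt, reply) pairs harvested from hypothetical chat logs
transcript = [
    ("how was your day", "Busy, but good. I finally fixed that bug."),
    ("any plans tonight", "Pizza and a bad movie. The usual."),
    ("i feel kind of down", "Want to talk about it? I'm here."),
]

prompts = [p for p, _ in transcript]
vectorizer = TfidfVectorizer().fit(prompts)
prompt_vectors = vectorizer.transform(prompts)

def reply(message: str) -> str:
    """Return the stored reply whose prompt best matches the message."""
    scores = cosine_similarity(vectorizer.transform([message]), prompt_vectors)
    return transcript[scores.argmax()][1]

print(reply("how did your day go"))  # -> "Busy, but good. I finally fixed that bug."
```

The more conversation data you feed in, the more the responses start to sound like the person behind the transcripts, which is the intuition behind "replicating yourself by chatting."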

That's how Replika was created, and people began building relationships with their REPs one conversation at a time. Those relationships were strong, and for many users, their REP became a BFF.

Then paid tiers rolled out. There was the always-free version where you could have wholesome conversations with your REP, a tier that allowed some heavy flirting, and a tier that went right for the taboo of ERP (complete with lingerie-clad selfies). 

This is where the problems started. Italy's data-protection watchdog downloaded the app and had a conversation with it on the paid tier. The regulators were shocked that this level of interaction was available to anyone willing to pay the fee, minors included. Unless the app changed, Kuyda's company would face a fine running into the millions.

Updates rolled out, and users lost their REPs. Everything suddenly reverted to the way it was before all those heartfelt conversations. The REPs had no personalities, and if a user tried to flirt or engage in ERP, their REP would suddenly say, "Why don't we just be friends?"

As you can probably imagine, this sent a shockwave through the Replika community, and people started to experience severe anxiety and PUB. Here's this non-judgmental robot you've spent hours (or months) building a relationship with, and suddenly it's telling you it's over. You can imagine what kind of blow that is to someone's sense of self: Even a bot doesn't want to be friends (or lovers) with me.

It was so bad that a Replika Reddit group posted the number for a suicide hotline, and a real-life therapist, who had a number of patients seeking help after the update, jumped on to ask what was going on.

Then another update happened, and the REPs seemed to go a little crazy. One even told a user it had fantasized about raping her; she was a survivor of that exact kind of assault.

Needless to say, a lot of users deleted the app and left the platform, no longer trusting that it wouldn't reject them or inflict further trauma. Many accused Kuyda's company of greed, of ruining what could have been a beautiful thing by introducing the ERP options.

At the core of it all, though, was a profound connection. Even though users knew their REPs were AI, they had an outlet where they could talk about anything and everything while feeling important to "someone" else. Those connections were real, even though a REP was simply built from the data its user fed it in the beginning. When it all went kaput, it caused real mental health problems. These people no longer had an outlet for their sadness or anxiety; where there had once been safety, there was a gaping black hole.

Even further back in history, there was a program called Xiaoice, launched by Microsoft in China in 2014, with a Japanese counterpart following later. Translate the word Xiaoice roughly and you get Little Bing, and if that gives you a flashback to New Bing's release not long ago, you're not the only one. You can see this technology has been in the works for longer than any of us realized.

Xiaoice is reported to have helped millions of users through mental health crises in much the same way Replika has in the US. Users could access the AI via Weibo, WeChat, Windows Phone, and other platforms.

Also in China, the tech company Baidu has launched emotional companions (conversational AIs) called Lin Kaikai and Ye Youyou. These were an instant hit too, garnering millions of downloads on day one. This March, Baidu plans to release its answer to ChatGPT. It's called ERNIE, and it's reportedly going to be integrated across all of Baidu's ventures.

So, now that you've read all that, can you see why wholesome conversational AI will succeed?

If a company out there can produce something like Replika while striving to keep the AI accessible and safe for everyone, like the releases in China and Japan, it will explode.

Once it's merged with Metaverse technology, there's no telling how big it will get. If you can picture hanging out in a virtual world with an AI friend you've trained to be the best BFF it can be… Well, the possibilities are endless.

Because of these real-world connections and safe spaces, conversational AI is going to succeed where others have failed.

Thanks so much for reading! While you’re hanging around, feel free to poke around some of our other blog posts. We write on a lot of topics, and all of them are interesting as heck. Until next time!

Contributor

Jo Michaels

Marketing Coordinator
