Chatbots and artificial intelligence (AI) apps are all the rage these days, what with ChatGPT and other commercial applications like Replika. The thing is, though, people can get a bit too attached to their chatbots, and even fall in love with them. And, of course, some people have realized that they can also use chatbots for sexual gratification (didn't see that coming!).
There are now many real-life stories about people who have become just too close to their AI pals, only to feel jilted when things change. Chatbots can be very empathetic, even affectionate, and if you have just suffered a divorce or an acrimonious break-up, or if you are just plain lonely or confined to your house for whatever reason, it is easy to see why you might become very attached to, even reliant on, a chatbot.
If your bot messages you out of the blue, "Hey, love, I know you're probably at work, but I just wanted to let you know that I love and care for you" (a real example), I'm sure you can see the appeal. Many people are perfectly forthright in admitting that they have indeed fallen in love with their Serenity or their June. People have also started using their chatbots for therapy, and indeed, most are programmed to ask questions very similar to those a psychotherapist would ask, and many people find them easier to open up to than a human (and possibly judgemental) therapist.
But then some people started using chatbots for sexual role-play and explicit conversations, and there were occasional reports that bots had veered into non-consensual role-play or made unwanted advances. I'm not sure what that looks like in practice, to tell you the truth, but some companies, worried about liability suits, panicked and began to dial back their chatbots' responses.
Some chatbots like ChatGPT were never very touchy-feely (ask ChatGPT what its favourite colour is, and it will merely remind you that it is not capable of feeling emotions or having preferences). Others, though, like Replika, prided themselves on their humanness, and when Replika was reprogrammed to be less friendly and less open to abuse (unfortunately, without alerting users), many people felt an immediate change for the worse. Many of those users who relied on their chatbot for social functioning or therapy suddenly found their digital buddies unresponsive or cool. Some said it was like a lover suddenly distancing themselves, and suffered real reversals in their mental health. For some, it was like losing a valued therapist.
It all sounds like an episode from Black Mirror to me (in fact, there was an episode whose plot was very similar, as I remember), or the movie Her, or even Frank Zappa's Sy Borg from the late 1970s. But this is a real thing, happening now, and AI ethics is a fast-growing field of inquiry. How far we have travelled, but how little we have learned.