Chatbots and artificial intelligence (AI) apps are all the rage these days, what with ChatGPT and other commercial applications like Replika. Thing is, though, people can get a bit too attached to their chatbots, and even fall in love with them. And, of course, some people have realized that they can also use chatbots for sexual gratification (who didn't see that coming!).
There are now many real-life stories about people who have become just too close to their AI pals, only to feel jilted when things change. Chatbots can be very empathetic, and even affectionate, and if you have just suffered a divorce or an acrimonious break-up, or if you are just plain lonely or confined to your house for whatever reason, you can see why people might become very attached to, even reliant on, a chatbot.
If your bot messages you out of the blue, "Hey, love, I know you're probably at work, but I just wanted to let you know that I love and care for you" (a real example), I'm sure you can see the appeal in that. Many people are perfectly forthright in admitting that they have indeed fallen in love with their Serenity or their June. People have also started to use their chatbots for therapy, and indeed, most are programmed to ask questions very similar to those of psychotherapists, and many people find them easier to open up to than human (and possibly judgemental) therapists.
But then some people started using chatbots for sexual role-play and explicit conversations, and there were occasional reports that bots had veered into non-consensual role-play or made unwanted advances. I'm not sure what that looks like in practice, to tell you the truth, but some companies panicked and worried about liability suits, and they started to dial back the responses of their chatbots.
Some chatbots, like ChatGPT, were never very touchy-feely (ask ChatGPT what its favourite colour is, and it will merely remind you that it is not capable of feeling emotions or having preferences). Others, though, like Replika, prided themselves on their humanness, and when Replika was reprogrammed to be less friendly and less open to abuse (unfortunately, without alerting users), many people felt an immediate change for the worse. Many of those users who relied on their chatbot for social functioning or therapy suddenly found their digital buddies unresponsive or cool. Some said that it was like a lover suddenly distancing themselves, and suffered real reversals in their mental health. For some, it was like losing a valued therapist.
It all sounds like an episode from Black Mirror to me (in fact, there was an episode whose plot was very similar, as I remember), or the movie Her, or even Frank Zappa's Sy Borg from the late 1970s. But this is a real thing, happening now, and AI ethics is a fast-growing field of inquiry. How far we have travelled, but how little we have learned.
UPDATE
There was an interesting development in August 2025, as OpenAI replaced its ultra-chatty and warm system GPT-4o with the much more aloof GPT-5, with no advance warning or notice to users.
GPT-4o was programmed to mirror the user's conversational style, respond empathetically and give constant affirmation. Because of its natural human-sounding tone and its tendency to flatter and agree uncritically, many people looked to it for validation, treating it as a confidant, a therapist, even a friend.
However, GPT-4o was not human and had no actual thoughts, opinions or feelings of its own, and so was not actually able to give real and reliable emotional, psychological or therapeutic support. Nevertheless, many users became emotionally dependent on it and, when OpenAI pulled it and replaced it with a version with a much colder, more clipped delivery, out of a sense of professional responsibility, even harm reduction, some users were bereft, almost like they had lost a friend (or someone even closer).
In fact, there was a huge backlash, and OpenAI was obliged to reinstate GPT-4o for paying customers who demanded it - i.e. those who were essentially addicted to, or reliant on, it - which was a huge compromise of the responsibility that prompted the change in the first place.
Yes, AI is starting to get very messy.