Saturday, January 05, 2019

If you worry about privacy, digital assistants may not be for you

Several friends now own these so-called digital assistants (or voice assistants, or virtual assistants, or just smart devices), such as the Amazon Echo (Alexa), Google Home (Assistant), Apple HomePod (Siri), etc., and most of them are now well and truly hooked - hook, line and sinker.
Personally, I am not particularly tempted - phones and computers work well enough for me. But neither am I someone who is unduly perturbed by the amount of data various services and devices have on me. My take is that if I don't have anything to hide, and I am intelligent enough to make my own decisions on purchases and not be swayed by advertising, then personalized content can only help me, and make my life easier.
That said, there is still something a bit unsettling about the fact that these devices are always listening. They have to be always listening, so that when you call out "Hey Alexa" or "OK Google", the device is ready to respond. But this also means they are not listening only when you verbally activate them: they are privy to (and possibly also recording and relaying to head office) private conversations never intended for the little box in the corner.
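For a sense of what "always listening" means in practice, here is a minimal sketch of the general idea - in Python, with made-up function names, nothing from any vendor's actual software: the device buffers short snippets of audio and checks them locally for the wake word, and only once the trigger is heard is anything supposed to be sent on to the company's servers.

```python
# Minimal sketch (hypothetical, not any vendor's code) of a wake-word loop:
# the microphone is sampled continuously, a small on-device check looks for
# the trigger phrase, and only then does audio leave the device -- in principle.

import time

WAKE_WORDS = ("hey alexa", "ok google")


def capture_audio_frame() -> bytes:
    """Hypothetical: grab roughly half a second of audio from the microphone."""
    return b""


def local_wake_word_check(frame: bytes) -> bool:
    """Hypothetical: small on-device model that spots one of WAKE_WORDS."""
    return False


def stream_to_cloud(frame: bytes) -> None:
    """Hypothetical: send the subsequent audio to the vendor's servers."""
    pass


def listen_forever() -> None:
    while True:  # the microphone is never "off"
        frame = capture_audio_frame()
        if local_wake_word_check(frame):
            # Only after the trigger is detected is anything sent onward.
            stream_to_cloud(frame)
        time.sleep(0.5)


if __name__ == "__main__":
    listen_forever()
```

The privacy question, of course, is what happens inside that loop before the trigger fires - how much is buffered, retained, or shared - and that is exactly the part the companies are vague about.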
There are many anecdotal reports of people receiving blanket advertising across their various devices on some obscure subject that was only mentioned casually in a private conversation in the same room as an (apparently sleeping) digital assistant, and of children "accidentally" ordering expensive items from Amazon. However, it seems far from clear just how much data is being recorded while the machines are on stand-by, and even the different companies' policies on selling your data seem woefully ill-defined.
Apple's privacy policy (and its anecdotal record) seems to be better than that of Amazon and Google, although it is still not spotless. But then, Apple is not in the same game of accumulating data for advertising purposes as the other two companies; otherwise I have no doubt that it would be pushing the boundaries of privacy laws too.
There certainly seems to be cause for concern, shall we say, and some commentators are actively warning people off the devices.
It's not my intention to warn people off these undeniably fun and useful machines. I'm just saying: go in with your eyes open. If you are the kind of person who worries about privacy, they may not be a good idea for you.

Oh, and just as an aside, what's with all the female voices and names these machines have? There's a whole load of psycho-socio-political baggage behind that. It seems that, even in this day and age, if people want competent, efficient, reliable, polite and deferential, they automatically and overwhelmingly think "female". These tech companies have done their marketing homework, and it seems that the vast majority of users do not want an authoritative, hyper-intelligent male figure like IBM's Watson AI computer, but a friendly, unthreatening, even slightly sexy, persona who can banter and give sassy responses to certain preprogrammed stupid questions. (For the record, both Siri and Google Assistant can be changed to have a male voice, although Alexa and Microsoft's Cortana cannot.)
Sexist? Maybe, maybe not. More like just human nature. And women should not necessarily be offended by being considered more competent and polite than men - sounds like a good thing to me.
