TECH FIRMS BEING ASKED TO GIVE UP THEIR AWKWARD SECRET
Among the thousands upon thousands of words that make up the privacy policies of the tech giants, one you rarely find is “human”.
Zero uses of the word in Amazon’s privacy policy for its Alexa voice assistant. The same goes for Apple’s Siri. And Google’s Assistant.
Facebook’s policy uses the word just once - to inform you that it will record data about your device to make sure that you are human.
Only Microsoft states what all of these other privacy policies arguably should: "Our processing of personal data … includes both automated and manual (human) methods of processing."
I point this out because the issue of human labour is a touchy subject in Silicon Valley right now - a fresh controversy for which the companies have only themselves to blame.
First, Amazon was revealed to have been using human contractors to listen to clips picked up by Alexa. Then it was discovered that Google was doing much the same, as was Apple. All three companies said they would halt the practice in response to the negative publicity.
And then last week, Bloomberg reported that Facebook had been using humans to listen to recordings gathered via its Messenger app. The company insisted proper consent had been obtained, but subsequent reporting found that wasn’t quite accurate. Well, it wasn’t accurate at all. Facebook hadn’t said a word about it to users.
A final fallback
The stories have dropped like dominoes. Any voice-powered or “automated” system is now quite rightly being put under scrutiny. Do Microsoft contractors sometimes listen to audio captured via Skype? Yep. Do they listen to audio of gamers speaking through their Xbox? Sure do.
For reporters, the low-hanging fruit is everywhere you look. I picked an “automated” service I use often - Expensify, the expenses-logging app that can “smartscan” receipts - and looked into it. Do humans have any role in how it works? Yes, it turns out they do.
“SmartScan is a complex, multi-layered system that takes multiple approaches to extracting information from receipts,” the company told me.
“The final, fallback approach is that it is sent to one of our contractors. When we're unable to extract the required information from the receipt, the receipt is reviewed by one of our contractors to fill in the missing information.”
No mentions of humans in Expensify’s privacy policy, either.
"a comma could be added"
There is a reasonable explanation for all this - if only the egotistical, secrecy-obsessed tech giants could bring themselves to say it. Without humans, most of these products would suck.
Expensify, to its credit, admitted to me that without human review it would likely manage only about 80-85% accuracy. That would mean users having to check all of their receipts for errors, entirely defeating the point of using the software. So, on balance, if a contractor has to step in once in a while - that’s fair. Not mentioning it anywhere on its website? Considerably less so.
Amazon, Google, Apple, Facebook and Microsoft all face the same challenge. Those companies capture these recordings not to spy on you, but to figure out when and why their technologies, still early in their development, get things wrong.
But what’s most frustrating to me is that I don’t think most people have a problem with that. I know I don’t: my expectation is that, because these products aren’t perfect, firms need to work on improving them. It’s almost reassuring that humans are still so relied upon.
But my other expectation is for tech companies to be up front about it, and on that, most have failed. I interviewed Amazon’s head of Alexa earlier this year and asked him whether he felt Amazon could do more to make clear in its privacy policy that humans played even a minor role in Alexa’s systems.
“I suppose we could add a comma after that and be more specific,” he told me (spoiler: they haven’t done that).
Put it on the damn box, I say. Be honest.
The tech industry needs to understand, quickly, that the tech-buying public will forgive it for using human labour far more readily than it will forgive blatant and lazy abuses of trust.
But then, doing so would mean admitting an imperfection, weakness or inefficiency. And that’s just not the done thing around here.