LinkNYC has been replacing all of New York's public payphones with advertising-emblazoned wifi kiosks. Residents and visitors curious about what those kiosks will do with the data their routers, cameras and Bluetooth beacons collect might look on LinkNYC's website for some kind of privacy policy. There is a policy there, but it isn't the one they need. Columbia professor Benjamin Dean got a big laugh at this weekend's HOPE XI conference in Manhattan when he pointed out that the privacy policy on LinkNYC's website applies only to the website itself, not to the actual network of kiosks.
It's not quite as bad as it sounds. In LinkNYC's defense, the page in question points out the difference between the two policies up top, but given the cursory way most people read online, it wouldn't be surprising if many users initially missed it (I did). Meanwhile, it's encouraging that Dean and his co-panelist, New York Civil Liberties Union attorney Mariko Hirose, actually did read those two privacy policies and that a room full of people showed up to hear what they found.
Tech privacy policies in general have been constructed primarily to guard companies against liability and to discourage users from reading closely. That's unfortunate; they could be standardized and easy to understand. Still, Dean and Hirose's workshop this weekend demonstrated the internet's capacity for crowdsourced self-defense.
In a way, the workshop countered a working paper by communications professors Jonathan Obar and Anne Oeldorf-Hirsch, which received attention last week for finding that, out of a group of 543 college students asked to check out a new website, only 11 noticed that signing up meant agreeing to terms that gave the site's operators rights to their firstborn children.
Coverage of the paper focused on the fact that so few users even looked at the terms, but what surprised me was that anyone looked at all. In other words, there's a Dean or a Hirose in any group.
"I don't think the answer is to find a way to have everyone read every privacy policy. This, as I note in the paper above, is an 'unattainable ideal.'" Obar wrote the Observer in an email. "So efforts to standardize language, even simplify language, even in accordance with American law aimed at making policies easier to understand, continue to fail."
To a degree, it doesn't matter if everyone checks everything. Terms of service apply to lots of people at once. If one person reads them and sees something disturbing, he or she can put out the word. If it's bad enough, the internet will respond.
That's what's powerful about this weekend's workshop: Dean and Hirose told a roomful of people what they found. Those people chattered about it on Twitter, and then a reporter at Inverse wrote a detailed story breaking down what they had to say.
LinkNYC, by the way, is a product of Intersection, which is owned by Sidewalk Labs, which is owned by Alphabet (the company formerly known as Google). Google likes to link data about your behavior to your location. Your phone's wifi antenna constantly broadcasts a map of your favorite places, as artists have dramatized with a multimedia installation. It wouldn't be surprising if LinkNYC kiosks started recording which wifi access points the phones passing them are looking for.
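(To make that concrete: the sketch below is purely illustrative, not anything LinkNYC has published or confirmed doing. It assumes the Python library scapy and a wireless card already switched into monitor mode, under the hypothetical interface name "mon0", and simply logs the network names that nearby phones probe for.)

# Illustrative sketch only: passively log the wifi networks nearby phones ask for.
# Assumes scapy is installed and "mon0" is a wireless interface in monitor mode.
from scapy.all import sniff, Dot11ProbeReq, Dot11Elt

def log_probe(pkt):
    if pkt.haslayer(Dot11ProbeReq):
        # The first information element in a probe request carries the SSID,
        # often the name of a network the phone has joined before.
        ssid = pkt[Dot11Elt].info.decode(errors="ignore")
        if ssid:
            print(pkt.addr2, "is looking for", ssid)

sniff(iface="mon0", prn=log_probe, store=False)

Run against real foot traffic, something like this would, in principle, assemble over time exactly the kind of map of past hangouts that installation made visible.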
As Hirose pointed out in the talk, just because LinkNYC kiosks don't use a sensor now doesn't mean they won't. "They essentially have a privacy policy that says, 'we can collect anything and do anything' and that sets the outer bound," she said.
The policy does forbid the company from some activities. For example, it promises not to use facial recognition, which has become dramatically more powerful; however, nothing stops the company from retracting that guarantee. In fact, Hirose said she's been told by the company that the kiosks' cameras haven't even been turned on yet, but it is also under no obligation to tell the public when the cameras go live.
And when users do notice an objectionable change in policy, what can they do? If a website really did try to reserve rights to every member's firstborn child, the outcry of the internet would likely be enough to get the policy retracted (though such a provision could also attract a niche audience, if the company promised to follow through).
The trouble is, here in America, that's really all we have to defend ourselves with: outrage. If it's deafening, it can be enough, but if it doesn't amount to more than a slow rumble, tech companies will shrug it off with a bit of spin.
For example, the NYCLU sent a letter to the city in March pushing for stronger language around data volume, retention, government use and sharing, but the city—which has secured a substantial promised payday from Sidewalk Labs—has ignored it. There's little more attorneys can do under current law. We have this outmoded idea called the "third party doctrine," that once someone uses a company's service, they give up all rights to privacy over whatever data that use generates.
For example, I don't really like Facebook, but I keep it active because I want to get the invite if someone wants me to come to a party. Yet the fact that I want to get invitations leaves everything else I might have done on the site over the last 10 years up for grabs by the government and Facebook's commercial partners. Of course, I could quit the site, but walking away from those invitations would impose a very high cost. Most people are unwilling to pay the price of giving up such networks, which allows their owners to gradually ratchet up the rights they grant themselves under terms of service that we can't negotiate unless we do it en masse.
It's not that way everywhere. Last week, French authorities called out Microsoft over the data collection in its Windows 10 operating system, which turns personal computers into the sort of tattletales that smartphones have been all along. It's one more way in which all Americans envy the French.
"Data privacy self-management is a fallacy under the current model," Obar wrote. "More needs to be done to involve users (automated systems, infomediaries…) and ensure that the mosaic of Big Data approaches and threats can be properly monitored."
Big tech companies find new ways to watch us every day. It's reassuring, at least, to know that there are a few people out there watching back.
UPDATE: A previous version of this story incorrectly identified Benjamin Dean as Benjamin Read. July 29, 2016 10:52 AM.