Ways to Pivot Privacy From Pain to Something That Might Pay
SEATTLE–Your privacy online is menaced by government surveillance, jeopardized by poor defaults and exploitable bugs, and remains the subject of confusion even when things work as designed. But that means things can only get better, right?
At the Privacy Identity Innovation conference here earlier this week, researchers, advocates, developers and journalists gathered to dig through what’s wrong and what could be set right with our collective exposure to services, apps and devices. The takeaway: While the dreaded “privacy Chernobyl” has yet to happen, it still could–and in the meantime, the tech industry should busy itself repairing the dents it has taken from third parties and inflicted upon itself.
The National Security Agency’s sweeping surveillance programs and hitherto-hidden campaign to subvert encryption systems and standards constituted the biggest, blackest cloud over the proceedings. Early on, Lavabit founder Ladar Levison and Silent Circle co-founder Mike Janke described how each had separately felt compelled to shut down his company’s encrypted e-mail service rather than risk being forced by the NSA to compromise it.
(Levison related how he’s given up e-mail for now in favor of “the electronic equivalent of a methadone clinic”: Facebook and LinkedIn messaging for routine chit-chat, Silent Circle encrypted text messaging for more sensitive communication.)
Small startups like those two shops may not have much political leverage against Washington, but their larger counterparts do. Beyond suing the government for the right to provide more detailed accounts of law-enforcement and national-security demands for user data (LinkedIn joined that litigious contingent on Tuesday), big-name firms can also strengthen their systems against snooping attempts.
For instance, two years ago Google began deploying “perfect forward secrecy”–an encryption practice in which fresh session keys are constantly generated and then discarded, so that the compromise of a long-term key can’t expose past communications–to protect user data. In a panel discussion Tuesday, CNET security writer Declan McCullagh strongly suggested that other firms follow its example; I hope they do.
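The idea behind forward secrecy can be sketched with a toy Diffie-Hellman exchange: each session derives its key from secrets that are generated fresh and thrown away afterward, so capturing a long-term key later doesn’t unlock recorded traffic. (This is a minimal Python sketch with deliberately undersized, made-up parameters; real deployments rely on authenticated TLS handshakes with vetted groups or elliptic curves, such as ECDHE cipher suites.)

```python
import secrets

# Toy Diffie-Hellman illustration of forward secrecy. Parameters are
# illustrative only: 2**127 - 1 is a Mersenne prime but far too small
# for real-world use.
P = (1 << 127) - 1   # toy prime modulus
G = 3                # toy generator

def ephemeral_session_key():
    """Derive one session's shared key from freshly generated secrets."""
    a = secrets.randbelow(P - 2) + 2   # client's ephemeral secret
    b = secrets.randbelow(P - 2) + 2   # server's ephemeral secret
    A = pow(G, a, P)                   # public values sent over the wire
    B = pow(G, b, P)
    client_key = pow(B, a, P)          # both sides compute the same key
    server_key = pow(A, b, P)
    assert client_key == server_key
    return client_key                  # a and b are then discarded

k1 = ephemeral_session_key()
k2 = ephemeral_session_key()
# Each session gets an independent key, so a later theft of a server's
# long-term signing key reveals nothing about k1 or k2.
```

The point of the design is in those discarded secrets: an eavesdropper who records the public values `A` and `B` today, then steals the server’s long-term key years from now, still faces the discrete-logarithm problem for every past session individually.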
Our friends in Fort Meade aren’t the only people who might have an interest in peeking at our data. And many startup instincts–move fast, break things, keep your options open for the next pivot–don’t help companies adjust to those threats.
One panel (humblebrag: one of two I moderated at PII) discussed some alarming examples of webcams and home-automation systems that shipped either open to the Internet by default or with hopelessly weak logins enabled.
Laws and lawsuits can punish companies for this kind of stupid behavior, and cleaning up after a mere data breach can get expensive. Forrester Research analysts Fatemeh Khatibloo and Sarah Rotman Epps said remedying one now costs at least $10 million.
But it’s harder to herd dot-coms into excelling at privacy. More than one speaker noted that no company has really been crushed for violating its users’ privacy–the photo-sharing startup Path uploaded users’ address books without permission, got slapped with an $800,000 fine by the Federal Trade Commission, and continues to grow.
Will a conveyor belt of stories about NSA surveillance–the phrase “summer of Snowden” must have been thrown around 25 times at PII–get people to think more about privacy? It would be a mistake to bet against it, and PII speakers had a few uncomplicated suggestions for companies that want to do a better job with privacy.
One is to harvest less information in the first place–don’t be like the NSA in adopting a “collect it all and we’ll find something useful for it later” policy. The bits you never collect because they weren’t relevant to your app’s job can’t be exposed later on by a software bug, a hack of your server or the demand of a three-letter government agency.
(When I spoke about privacy issues to a roomful of iOS developers last month, I was heartened to see one complain that Apple’s system interfaces forced him to acquire more location data than he needed.)
Another is to put more effort into privacy and security interfaces–something that may already be happening as companies move to make two-factor authentication something that users won’t dread. PII featured several intriguing demos and discussions of tools that aim to make verifying a login simpler and faster, provide a clearer view of how a user’s data will flow through an organization and check which apps can access your data on a social network. Alas, log into the average bank account or ISP-issued Web-mail service and you’re unlikely to see such refinement.
Finally, while privacy policies and terms-of-service notices still count and can get you in trouble if you don’t follow them, companies shouldn’t obsess over them too much. (Trustworks Privacy Advisors CEO Anne Toth was honest but maybe impolite when she admitted that she doesn’t read privacy notices in her own day-to-day use.)
Rather, startups should recognize that many users operate on a trusted-actor model–if a company has done right by them in the past, they’ll download its latest app with less hesitation. Companies should focus on winning that trust by ensuring users have a choice about how much information to provide, aren’t left guessing how the business plans to stay afloat, and are never surprised by what an app or service does with their data.