A transgender woman recently committed suicide after her updated Android phone failed to keep her personal life personal. After the upgrade to KitKat, the new Google Hangouts app used her old male name in a text to a coworker, unwittingly outing her. Panicked, she reached out to Google for help. They never responded.
Also in January, Liz Eden, a blogger and graduate funding officer at the University of Oxford, discovered that Google Calendar sent event invitations to any email addresses she listed in the description of a private reminder. For example, setting a reminder for herself to email her boss about a pay rise triggered an email to her boss, inviting him to the “event.” Not tragic, but still embarrassing. And it might be forgivable if it hadn’t already been happening for years.
Google furnishes our lives with conveniences. It provides vast amounts of free information instantaneously, keeps track of our communication, and gives us driving directions. Its participation in the Reform Government Surveillance coalition and support of reforming the Electronic Communications Privacy Act would suggest that Google values users’ privacy more than we tend to give it credit for.
Still, these products aren’t actually free, even if we don’t pay for them. Google gives us what we want in exchange for what it wants: our data. Inevitably, the human-programmed systems created to collect and store this data will fail.
Good UX design is supposed to exploit the human capacity for laziness. It should be clear and simple, laying a stone path toward an easy, frictionless experience. Google’s UI design is well known for this kind of bare-bones cognitive fluency. So when something bad happens with this thing we’ve come to appreciate for its integrity and reliability, we experience cognitive dissonance. Trust is broken, expectations go unmet. Ultimately, we feel betrayed by a friend. Phillips explained over email how this misalignment of interpretations is problematic:
“These are examples of designers and users interpreting things very differently, with horrifying results–and of a corporate culture that seems willing to let a dangerous situation remain uncorrected for years, doing real harm to real people, because, to them, these people are obviously just wrong.”
There’s a name for this kind of “evil interface”: Zuckering. The term came about in 2010, when people were pissed off enough at Facebook’s deceptive UIs to give a name to “the act of creating deliberately confusing jargon and user-interfaces which trick your users into sharing more info about themselves than they really want to.” The difference here is that GCal is not intentionally trying to trick you into sending defamatory emails, but Phillips thinks the omission is just as worrisome:
“These interfaces are ‘bad’ in a larger sense, then: although they make it easy to get work done quickly, they fail to make it obvious in all cases what actions GCal intends to take on the users’ behalf. For some kinds of software, this is OK. But when the software is in intimate contact with sensitive information, it’s not good enough.”
So, are we walking on eggshells when supplying our information for life’s niceties?
“It’s frustrating they’ve suckered me in with a great promise,” Eden said, vowing to eventually delink her life from Google, a pattern she thinks will become common in the coming years. Information leaks, from parsed NSA documents to strangers’ sudden ability to email you through Google+, are substantiating traditionally fringe phobias. Retrospectives gave 2013 ominous titles like the “year of the personal data breach,” and data hacks at LivingSocial, Adobe, Facebook (again), and the good ol’ U.S. government make the claim difficult to refute. Are reactive outcries the only way to get companies to care about privacy?
“If one of your friends has their life exposed, people might temporarily change their behavior,” says Eden. “How we get those new behaviors to stick is another matter. Companies have a responsibility to think about the most vulnerable of their customers.”