Three Laws for Web Pages

by John Wunderlich


Wunderlich's Three Laws for Web Sites

(with all due reverence to Asimov)

  1. No web site may violate a human being's right to privacy, or through inaction [1], allow a human being's privacy to be violated.
  2. A web site must respect a human being's consent with respect to her personal information, even if it conflicts with the first law.
  3. A web site must protect its own existence as long as such protection does not conflict with the first or second law.

[1] Inaction includes not taking responsibility for what ad networks are doing with your users' information.


Privacy UX

by John Wunderlich


Privacy by Design: UX & UI

With the announcement of iOS 7, elements of the blogosphere have been awash in back-and-forth commentary about the new design. Does the fact that Apple has chosen Helvetica Ultra Light as the default font have implications for privacy? Not so much. But privacy and design are connected, and all the commentary that I’m seeing about Apple’s new mobile operating system is focussed on the immediate and the transient. This makes me think about Privacy by Design (PbD).

The focus around iOS 7 is on the immediate user experience (IUX, if you will). Focussing on the IUX is, I would argue, what gets organizations in trouble, and it does not meet PbD principle #1 - Proactive & Preventative. This is because the user experience of privacy is not immediate, except in the obvious, egregious cases such as web sites demanding personal information for registration. A user’s privacy experience with an organization is cumulative and evolves transaction by transaction.

This is not to say that the IUX is not important. Of course it is, and it is the result of well-thought-out user interface choices, one of which is Privacy by Design principle #2 - Privacy as the default setting. But have designers fulfilled their PbD goals by making privacy options both available and the default? Again, not so much. On the face of it, by doing this designers will have met most of the PbD requirements:

  1. Designers have proactively included privacy interface features
  2. The system has privacy-protective default settings
  3. The system has embedded privacy-protective options
  4. Designers ensure that there is full functionality
  5. Architects ensure the site is designed with end-to-end security
  6. Privacy officers ensure that privacy is visible and transparent
  7. By focussing on UI and UX, designers assume that they are user-centric

So what’s the problem? It’s a variant of the old saw in computer programming: the programmer asks for a set of requirements and builds a prototype for their customer. When shown the prototype, the customer shakes their head and says, “You’ve given me everything I asked for, but that’s not what I wanted." Privacy, it seems to me, is the same thing. If designers focus on the immediate experience, they are likely to encounter unintended consequences down the road. Data that is accumulated over time is called longitudinal data. This is the kind of data that is used for epidemiological studies, or for studying changes in a population over time. So I propose to borrow the term and suggest that Privacy by Design requires an understanding of the Longitudinal User Experience (LUX).

Only when system designers study the long term impacts on user privacy will they be proactively addressing and preventing privacy issues. This includes checking back with users on a regular basis for privacy status checks and validation, proactively notifying users of changes impacting their privacy and not implementing changes that could reasonably be construed to be less privacy protective than existing design choices. Above all, it means recognizing that privacy is embodied in the relationship and transactions with the users, not in a series of policy statements.

Bear with me, but this reminds me of a joke about a couple. She says, “You haven’t told me you love me in a long time." He replies, “I told you once, and I’ll let you know if the situation changes". That attitude doesn’t work in relationships, and saying, “We told you that we would protect your privacy when you signed on to the service, and we will let you know if that changes" doesn’t work that well either.

Meeting the PbD Proactivity principle means regularly engaging with your users about privacy, without beating them over the head with policy statements. Their user experience, in every transaction, needs to reflect your ongoing commitment to giving them control over the information you collect about them. Sometimes that means sacrificing immediate gratification for long term satisfaction. That’s how adults behave, and that’s how you prevent the need for remedial action.

Definitions

User Experience: According to the Wikipedia entry on User Experience: ISO 9241-210[1] defines user experience as “a person’s perceptions and responses that result from the use or anticipated use of a product, system or service". According to the ISO definition, user experience includes all the users’ emotions, beliefs, preferences, perceptions, physical and psychological responses, behaviors and accomplishments that occur before, during and after use. The ISO also lists three factors that influence user experience: system, user and the context of use.

User Interface: According to the Wikipedia entry on User Interface: The user interface, in the industrial design field of human–machine interaction, is the space where interaction between humans and machines occurs. The goal of this interaction is effective operation and control of the machine on the user’s end, and feedback from the machine, which aids the operator in making operational decisions. Examples of this broad concept of user interfaces include the interactive aspects of computer operating systems, hand tools, heavy machinery operator controls, and process controls. The design considerations applicable when creating user interfaces are related to or involve such disciplines as ergonomics and psychology.


All Those Companies that Can't Afford Dedicated Security

by John Wunderlich



This is interesting:

In the security practice, we have our own version of no-man’s land, and that’s midsize companies. Wendy Nather refers to these folks as being below the "Security Poverty Line." These folks have a couple hundred to a couple thousand employees. That’s big enough to have real data interesting to attackers, but not big enough to have a dedicated security staff and the resources they need to really protect anything. These folks are caught between the baseline and the service box. They default to compliance mandates like PCI-DSS because they don’t know any better. And the attackers seem to sneak those passing shots by them on a seemingly regular basis.

[…]

Back when I was on the vendor side, I’d joke about how 800 security companies chased 1,000 customers — meaning most of the effort was focused on the 1,000 largest customers in the world. But I wasn’t joking. Every VP of sales talks about how it takes the same amount of work to sell to a Fortune-class enterprise as it does to sell into the midmarket. They aren’t wrong, and it leaves a huge gap in the applicable solutions for the midmarket.

[…]

To be clear, folks in security no-man’s land don’t go to the RSA Conference, probably don’t read security pubs, or follow the security echo chamber on Twitter. They are too busy fighting fires and trying to keep things operational. And that’s fine. But all of the industry gatherings just remind me that the industry’s machinery is geared toward the large enterprise, not the unfortunate 5 million other companies in the world that really need the help.

I’ve seen this trend, and I think it’s a result of the increasing sophistication of the IT industry. Today, it’s increasingly rare for organizations to have bespoke security, just as it’s increasingly rare for them to have bespoke IT. Only the larger organizations can afford it. Everyone else is increasingly outsourcing their IT to cloud providers. These providers are taking care of security — although we can certainly argue about how good a job they’re doing — so that the organizations themselves don’t have to. A company whose email consists entirely of Gmail accounts, whose payroll is entirely outsourced to Paychex, whose customer tracking system is entirely on Salesforce.com, and so on — and who increasingly accesses those systems using specialized devices like iPads and Android tablets — simply doesn’t have any IT infrastructure to secure anymore.

To be sure, I think we’re a long way off from this future being a secure one, but it’s the one the industry is headed toward. Yes, vendors at the RSA Conference are only selling to the largest organizations. And, as I wrote back in 2008, soon they will only be selling to IT outsourcing companies (the term “cloud provider" hadn’t been invented yet):

For a while now I have predicted the death of the security industry. Not the death of information security as a vital requirement, of course, but the death of the end-user security industry that gathers at the RSA Conference. When something becomes infrastructure — power, water, cleaning service, tax preparation — customers care less about details and more about results. Technological innovations become something the infrastructure providers pay attention to, and they package it for their customers.

[…]

The RSA Conference won’t die, of course. Security is too important for that. There will still be new technologies, new products and new startups. But it will become inward-facing, slowly turning into an industry conference. It’ll be security companies selling to the companies who sell to corporate and home users — and will no longer be a 17,000-person user conference.

(Via Schneier on Security)