Or: reorientating how engineers think about privacy.

This blog is extracted from my chapter "Blending the Practices of Privacy and Information Security to Navigate Contemporary Data Protection Challenges" in the forthcoming book "Trans-Atlantic Data Privacy Relations as a Challenge for Democracy", Kloza & Svantesson (editors).

One of the leading efforts to inculcate privacy into engineering practice has been the "Privacy by Design" movement, or "PbD": a set of guidelines developed in the 1990s by Ann Cavoukian, then the privacy commissioner of Ontario. The movement seeks to embed privacy "into the design specifications of technologies, business practices, and physical infrastructures". PbD is basically the same good idea as building in security, or building in quality, because retrofitting these things late in the design lifecycle leads to higher costs* and compromised, sub-optimal outcomes.

Privacy by Design attempts to orientate technologists to privacy through a set of simple maxims:

  1. Proactive not Reactive; Preventative not Remedial
  2. Privacy as the Default Setting
  3. Privacy Embedded into Design
  4. Full Functionality - Positive-Sum, not Zero-Sum
  5. End-to-End Security - Full Lifecycle Protection
  6. Visibility and Transparency - Keep it Open
  7. Respect for User Privacy - Keep it User-Centric

PbD is a well-meaning effort, and yet its language comes from a culture quite different from engineering. PbD's maxims rework classic privacy principles without providing much that's tangible to working systems designers. The first three principles are commonplace generalisations. Nos. 5 and 6 simply reword the standard privacy principles of security and openness. And user centricity (no. 7) is problematic in the era of Big Data and the Internet of Things, where the vast majority of Personal Information is collected or synthesised behind our backs, beyond our control. "User centric" is hollow as a call to action.

PbD principle no. 4 exemplifies the most problematic aspect of Privacy by Design -- its idealism. Politically, PbD is partly a response to the cynicism of national security zealots and the like, who tend to see privacy as quaint or threatening. Infamously, NSA security consultant Ed Giorgio was quoted in "The New Yorker" of 21 January 2008 as saying "privacy and security are a zero-sum game". Most privacy advocates (including me) find that proposition truly chilling; privacy should not be traded mindlessly for security. And yet PbD's response is frankly too cute, with its slogan that privacy is a "positive-sum game".

The truth is that privacy is full of contradictions and competing interests, and we ought not sugar-coat it. For starters, the Collection Limitation principle -- which I take to be the cornerstone of privacy -- can contradict the security or legal instinct to retain as much data as possible, in case it proves useful one day. Disclosure Limitation can conflict with usability, because Personal Information may become siloed for privacy's sake and less freely available to other applications. And above all, Use Limitation can restrict the revenue opportunities that digital entrepreneurs might otherwise see in all the raw material they are privileged to have gathered.
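
To make this concrete, here is a minimal sketch (in Python, with entirely hypothetical field, purpose and service names) of how those same principles might be rendered as testable rules in a data-handling layer. The point is not the particular rules but that each one visibly forecloses some other objective -- retention, reuse, cross-application sharing -- so the trade-offs become explicit design decisions rather than slogans.

    from dataclasses import dataclass
    from datetime import datetime, timedelta

    # Hypothetical illustration only: privacy principles expressed as
    # enforceable rules rather than abstract maxims.

    @dataclass
    class Record:
        field: str              # e.g. "email_address" (made-up name)
        purpose: str            # purpose stated at collection time
        collected_at: datetime

    class PrivacyPolicy:
        # Collection Limitation: only fields with a declared purpose are stored.
        ALLOWED = {"email_address": "account_recovery"}
        # Use & Disclosure Limitation: data flows only to consumers
        # registered against the original purpose of collection.
        CONSUMERS = {"account_recovery": {"password_reset_service"}}
        # Retention limit: cuts against the instinct to keep data forever.
        RETENTION = timedelta(days=365)

        def may_collect(self, field: str, purpose: str) -> bool:
            return self.ALLOWED.get(field) == purpose

        def may_disclose(self, record: Record, consumer: str) -> bool:
            return consumer in self.CONSUMERS.get(record.purpose, set())

        def must_delete(self, record: Record, now: datetime) -> bool:
            return now - record.collected_at > self.RETENTION

Each constant in a sketch like this is a pressure point: relaxing RETENTION serves the keep-everything instinct, and widening CONSUMERS serves reuse and revenue, each at privacy's expense. That is exactly the kind of negotiation engineers are trained to conduct.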

Now, in highlighting these tensions, I do not for a moment suggest that arbitrary interests should override privacy. But I do say it is naive to flatly assert that privacy can be maximised along with any other system objective. It is better that IT designers be made aware of the many trade-offs privacy can entail, and that they be equipped to deal with the real-world compromises it implies, just as they do with other design requirements. For this is what engineering is all about: resolving conflicting requirements in real-world systems.

So a more sophisticated approach than "Privacy by Design" is privacy engineering, in which privacy takes its place within information systems design alongside all the other practical considerations that IT professionals weigh up every day, including usability, security, efficiency, profitability, and cost.

See also my "Getting Started Guide: Privacy Engineering" from Constellation Research.

*Footnote

Not unrelatedly, I wonder if we should re-examine the claim that retrofitting privacy, security and/or quality after a system has been designed and realised leads to greater cost. Cold hard experience might suggest otherwise. Clearly, a great many organisations persist with bolting on these sorts of features late in the day -- or else advocates wouldn't have to keep telling them not to. And the Minimum Viable Product movement is almost a licence to defer quality and other non-essential considerations. All businesses are cost-conscious, right? So, averaged across a great many projects over the long term, could it be that businesses have in fact settled on the most cost-effective timing of security engineering, and it's not as politically correct as we'd like?!