Updated from original post January 2013.

I have come to believe that a systemic conceptual shortfall affects typical technologists' thinking about privacy. It may be that engineers tend to take literally the well-meaning slogan that "privacy is not a technology issue". And I say this in all seriousness.

Online, we're talking about data privacy, or data protection, but systems designers bring to work a spectrum of personal outlooks about privacy in the human sphere. Yet what matters is the precise wording of data privacy law, like Australia's Privacy Act. To illustrate the difference, here's the sort of experience I've had time and time again.

During the course of conducting a Privacy Impact Assessment (PIA) in 2011, I spent time with the development team working on a new government database. These were good, senior people with a sophisticated understanding of information architecture, and they'd received in-house privacy training.
But they harboured restrictive views about privacy. An important clue was the way they habitually referred to "private" information rather than Personal Information (or equivalently, Personally Identifiable Information, PII). After explaining that Personal Information is the operative term in Australian legislation, and reviewing its definition (see http://lockstep.com.au/blog/2013/09/27/pii-or-not-pii) as essentially any information about an identifiable person, we found that the team had not appreciated the extent of the PII in their system. They had overlooked that most of their audit logs collect PII, albeit indirectly and automatically, and that information about clients in their register provided by third parties was also PII (despite it seeming intuitively 'less private' by virtue of originating from others).

I attributed these blind spots to the developers' loose framing of "private" information. Online and in privacy law alike, things are very crisp. The definition of PII, as any data relating to an individual whose identity is readily apparent, sets a low bar, embracing a great many data classes and, by extension, informatics processes. Yet it is a neat analytical definition that is readily factored into systems analysis. Once they grasped that, the team engaged in the PIA with fresh energy, and we found and rectified several privacy risks that had gone unnoticed.
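To show how mechanically that analytical definition can be applied during a data inventory, here is a minimal sketch. The field names and the classification rule are hypothetical illustrations, not from any standard: a record is treated as PII whenever any of its fields links it to an identifiable person.

```python
# A minimal sketch (hypothetical field names) of factoring the
# analytical definition of PII -- any data relating to an
# identifiable person -- into a systems analysis.

# Fields that identify a person, directly or in combination.
IDENTIFYING_FIELDS = {"name", "email", "client_id", "ip_address"}

def contains_pii(record_fields: set) -> bool:
    """A record is PII if any field links it to an identifiable person."""
    return bool(record_fields & IDENTIFYING_FIELDS)

# An audit log entry is PII even though it is generated automatically:
audit_log_entry = {"timestamp", "action", "client_id"}
print(contains_pii(audit_log_entry))  # True

# A purely aggregate record is not:
monthly_totals = {"month", "total_transactions"}
print(contains_pii(monthly_totals))  # False
```

The point of the sketch is that "is this PII?" becomes an objective test over a data dictionary, rather than a subjective judgement about what feels "private".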

Here are some more of the recurring misconceptions I've noticed over the past decade:

  • "Personal" Information is sometimes taken to mean especially delicate information such as payment card details, rather than any information pertaining to an identifiable individual; see also this exchange with US data breach analyst Jake Kouns over the Epsilon incident in 2011 in which tens of millions of user addresses were taken from a bulk email house;
  • the act of collecting PII is sometimes regarded only in relation to direct collection from the individual concerned; technologists can overlook that PII provided by a third party to a data custodian is nevertheless being collected by the custodian; likewise, technologists may not appreciate that generating PII internally, through event logging for instance, also represents collection.
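The second misconception above can be made concrete with a small sketch. The register and sources are hypothetical: the point is that a custodian acquires PII whether a record arrives from the individual, from a third party, or from internal generation, and the same obligations attach in every case.

```python
# A hypothetical sketch: every acquisition of PII by a custodian is a
# "collection" in the legal sense, regardless of where it came from.

collection_register = []  # the custodian's record of all PII collections

def ingest(record: dict, source: str) -> None:
    """Register an acquisition of PII, whatever its origin."""
    collection_register.append({"record": record, "source": source})

# Direct collection from the individual:
ingest({"client_id": "42", "name": "J. Citizen"}, source="web form")
# Indirect collection via a third party -- still a collection:
ingest({"client_id": "42", "notes": "referred by partner"}, source="partner agency")

print(len(collection_register))  # 2 collections, both subject to privacy law
```

A production system would of course do far more than append to a list; the sketch only shows that "collection" is broader than "form filled in by the data subject".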

These instances and others show that many ICT practitioners have significant gaps in their understanding of privacy. Security professionals in particular may be forgiven for thinking that most legislated Privacy Principles are legal technicalities irrelevant to them, for generally only one of the principles in any given set is overtly about security; see:

  • no. 5 of the OECD Privacy Principles
  • no. 4 of the Fair Information Practice Principles in the US
  • no. 8 of the Generally Accepted Privacy Principles of the US and Canadian accounting bodies,
  • no. 4 of the older National Privacy Principles of Australia, and
  • no. 11 of the new Australian Privacy Principles.

Yet all of the privacy principles in these regimes are impacted by information technology and security practices; see "Mapping Privacy requirements onto the IT function", Privacy Law & Policy Reporter, Vol. 10.1 & 10.2, 2003. I believe the gaps in ICT practitioners' privacy knowledge are not random but systemic, probably because privacy training for non-privacy professionals is not properly integrated with their particular world views.

To properly deal with data privacy, ICT practitioners need to have privacy framed in a way that leads to objective design requirements. Luckily there already exist several unifying frameworks for systematising the work of development teams. One tool that resonates strongly with data privacy practice is the Threat & Risk Assessment (TRA).

A TRA is a method for analysing infosec requirements and is widely practised in the public and private sectors in Australia. A number of standards guide the conduct of TRAs, such as ISO 31000. A TRA is used to systematically catalogue all foreseeable adverse events that threaten an organisation's information assets, identify candidate security controls to mitigate those threats, and prioritise the deployment of controls to bring all risks down to an acceptable level. The TRA process delivers real-world management decisions, understanding that non-zero risks are ever present and that no organisation has an unlimited security budget.
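The catalogue-score-prioritise logic of a TRA can be sketched in a few lines. All the threats, scores and the acceptability threshold below are hypothetical placeholders; a real TRA would draw them from workshops with asset owners and from the organisation's own risk criteria.

```python
# A minimal sketch (hypothetical values) of the TRA logic described
# above: catalogue threats to information assets, score each risk,
# and prioritise controls until residual risk is acceptable.
from dataclasses import dataclass

@dataclass
class Threat:
    asset: str
    event: str
    likelihood: int   # 1 (rare) .. 5 (almost certain)
    impact: int       # 1 (negligible) .. 5 (severe)
    control: str      # candidate mitigation

    @property
    def risk(self) -> int:
        return self.likelihood * self.impact

threats = [
    Threat("client register", "external breach", 2, 5, "encrypt at rest"),
    Threat("audit logs", "unauthorised staff browsing", 4, 3, "access controls"),
    Threat("backups", "media loss in transit", 1, 4, "encrypted media"),
]

ACCEPTABLE_RISK = 4  # management-set threshold; higher risks get controls

# Deploy controls in priority order, highest risk first.
for t in sorted(threats, key=lambda t: t.risk, reverse=True):
    if t.risk > ACCEPTABLE_RISK:
        print(f"risk={t.risk:2d}  {t.asset}: {t.event} -> {t.control}")
```

The budget constraint mentioned above appears here as the threshold: controls are funded top-down until remaining risks fall within what management has declared acceptable.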

The TRA exercise is readily extensible to support Privacy by Design. A TRA can expressly incorporate privacy as an aspect of information assets worth protecting, alongside the conventional security qualities of confidentiality, integrity and availability ("C.I.A.").

A crucial subtlety here is that privacy is not the same as confidentiality, yet the two are frequently conflated. A fuller understanding of privacy leads designers to consider the Collection, Use, Disclosure and Access & Correction principles, over and above confidentiality, when they analyse information assets. The table below illustrates how privacy-related factors can be accounted for alongside "C.I.A.". In another blog post I discuss the selection of controls to mitigate privacy threats within a unified TRA framework.
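One way to picture the extended asset analysis is as a record that carries privacy attributes alongside the conventional C.I.A. ratings. The attribute names and values below are illustrative assumptions, not drawn from any standard schema:

```python
# A hypothetical asset profile: the conventional C.I.A. qualities are
# assessed as usual, and privacy principles (Collection, Use &
# Disclosure, Access & Correction) are assessed alongside them,
# over and above confidentiality.
asset = {
    "name": "client register",
    # conventional C.I.A. requirements
    "confidentiality": "high",
    "integrity": "high",
    "availability": "medium",
    # privacy dimensions, distinct from confidentiality
    "collection": "limit to fields needed for the registered function",
    "use_disclosure": "secondary use requires consent or legal authority",
    "access_correction": "clients can view and correct their own records",
}

privacy_keys = {"collection", "use_disclosure", "access_correction"}
print(privacy_keys <= asset.keys())  # privacy assessed alongside C.I.A.
```

Note that the register could score "high" on confidentiality and still fail on privacy, for instance by over-collecting fields or reusing them for unrelated purposes, which is precisely why the two must not be conflated.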

We continue to actively research the closer integration of security and privacy practices.