The relationship between privacy regulators and technologists can seem increasingly fraught. A string of adverse (and sometimes counterintuitive) privacy findings against digital businesses – including the “Right to be Forgotten”, and bans on biometric-powered photo tag suggestions – has left some wondering if privacy and IT are fundamentally at odds. Technologists may be confused by these regulatory developments and, as a result, uncertain about their professional role in privacy management.

Several efforts are underway to improve technologists’ contribution to privacy. Most prominent is the “Privacy by Design” (PbD) movement, while a newer discipline of ‘privacy engineering’ is also striving to emerge. Yet a wide gap still separates the worlds of data privacy regulation and systems design, and privacy is still rarely framed in a way that engineers can relate to. PbD’s pat generalisations overlook essential differences between security and privacy, and at the same time fail to pick up on the substantive common ground, like the ‘Need to Know’ and the principle of Least Privilege.
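That common ground can be made concrete. The sketch below is a minimal illustration, in Python, of the security habit of need-to-know access applied to personal data; the roles, attribute names and record layout are assumptions of mine, not drawn from any particular system or standard.

```python
# A minimal sketch of Least Privilege / Need-to-Know applied to personal data.
# The roles, attributes and record layout are illustrative assumptions.

CUSTOMER_RECORD = {
    "name": "Jane Citizen",
    "email": "jane@example.com",
    "date_of_birth": "1980-04-01",
    "credit_card": "4111 **** **** 1111",
    "shipping_address": "1 Example St, Springfield",
}

# Each role sees only the attributes it needs to do its job.
NEED_TO_KNOW = {
    "marketing": {"name", "email"},
    "shipping": {"name", "shipping_address"},
    "billing": {"name", "credit_card"},
}

def view_for(role: str, record: dict) -> dict:
    """Return only the personal attributes the role needs, and nothing more."""
    allowed = NEED_TO_KNOW.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}

print(view_for("shipping", CUSTOMER_RECORD))
# {'name': 'Jane Citizen', 'shipping_address': '1 Example St, Springfield'}
```

What a security engineer would call access control is, from a privacy standpoint, a use limitation put into code – exactly the kind of translation the PbD movement could be making.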

There appears to be a systematic shortfall in the collective understanding that technologists and engineers have of information privacy. IT professionals now routinely receive privacy training, yet time and again they misinterpret basic privacy principles – for example, by exploiting personal information found in the ‘public domain’ as if data privacy principles do not apply there, or by creating new personal information through Big Data processes, evidently with little or no restraint.

See also ‘Google's wifi misadventure, and the gulf between IT and Privacy’, and ‘What stops Target telling you you’re pregnant?’.

The challenge of engaging technologists in privacy is exacerbated by the many mixed messages that circulate about privacy, its relative importance, and purported social trends towards promiscuity, or what journalist Jeff Jarvis calls ‘publicness’. For decades, mass media headlines have regularly announced the death of privacy. When US legal scholars Samuel Warren and Louis Brandeis developed some of the world’s first privacy jurisprudence in 1890, the social fabric was under threat from the new technologies of photography and the telegraph. In time, computers became the big concern. The cover of Newsweek magazine on 27 July 1970 featured a cartoon couple cowering before mainframe computers and communications technology, under the urgent upper-case headline, ‘IS PRIVACY DEAD?’. Of course it’s a rhetorical question. And after more than a hundred years, the answer is still no.

In my new paper, published as a chapter of the book “Trans-Atlantic Data Privacy Relations as a Challenge for Democracy”, I review how engineers collectively tend to regard privacy and explore how to make privacy more accessible to technologists. As a result, difficult privacy territory like social networking and Big Data may become clearer to non-lawyers, and the transatlantic compliance challenges might yield to data protection designs that are more fundamentally compatible with both the digital ethos of Silicon Valley and the privacy activism of Europe.

Privacy is contentious today. There are legitimate debates about whether the information age has really changed privacy norms or not. Regardless, with so much personal information leaking through breaches, accidents, or digital business practices, it’s often said that ‘the genie is out of the bottle’, meaning privacy has become hopeless. Yet in Europe and many other jurisdictions, privacy rights attach to Personal Information no matter where it comes from. The threshold for data counting as Personal Information (or, equivalently in the US, ‘Personally Identifiable Information’) is low: any data about a person whose identity is readily apparent constitutes Personal Information in most places, regardless of where or how it originated, and without any reference to who might be said to ‘own’ the data. This is not obvious to engineers without legal training, who have formed a more casual understanding of what ‘private’ means. So it may strike them as paradoxical that the terms ‘public’ and ‘private’ don’t even figure in laws like Australia’s Privacy Act.
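To see how low and origin-blind that threshold is, here is a toy sketch in Python of the test as an engineer might encode it. It is not legal advice; the field names and example records are made up – the point is simply that nothing in the test asks where the data came from or who ‘owns’ it.

```python
# A toy illustration: the test for Personal Information looks only at whether
# the identity of the individual is reasonably apparent from the data.
# Field names below are illustrative assumptions.

IDENTIFYING_FIELDS = {"name", "email", "passport_number", "face_template"}

def is_personal_information(record: dict) -> bool:
    """True if the record relates to an identifiable individual."""
    return any(field in record for field in IDENTIFYING_FIELDS)

scraped_from_public_forum = {"name": "Jane Citizen", "opinion": "loves privacy"}
inferred_by_analytics = {"email": "jane@example.com", "predicted_pregnancy": 0.87}

# Both count as Personal Information, despite one coming from the 'public
# domain' and the other never having been collected from the individual at all.
assert is_personal_information(scraped_from_public_forum)
assert is_personal_information(inferred_by_analytics)
```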

Probably the most distracting message for engineers is the well-intended suggestion ‘Privacy is not a Technology Issue’. In 2000, IBM chair Lou Gerstner was one of the first high-profile technologists to isolate privacy as a policy issue. The same trope (that such-and-such ‘is not a technology issue’) is widespread in online discourse. It usually means that multiple disciplines must be brought to bear on certain complex outcomes, such as safety, security or privacy. Unfortunately, engineers can take it to mean that privacy is covered by other departments, such as legal, and has nothing to do with technology at all.

In fact, all of our traditional privacy principles are impacted by system design decisions and practices, and are therefore apt for engagement by information technologists. For instance, IT professionals are liable to think of ‘collection’ as a direct activity that solicits Personal Information, whereas under technology-neutral privacy principles, the indirect collection of identifiable audit logs or database backups should also count.
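As a concrete illustration of indirect collection, the hedged sketch below uses Python’s standard logging module, with an illustrative pattern of my own, to show how routine audit logging quietly accumulates Personal Information – and how a single design decision at the logging layer can bring it back under control.

```python
import logging
import re

# A minimal sketch: email addresses stand in for any personal identifier;
# the pattern and the redaction policy are illustrative assumptions.
EMAIL_PATTERN = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

class RedactPersonalInfo(logging.Filter):
    """Redact identifiers before they are written to the audit log, so the
    log itself does not become an unmanaged store of Personal Information."""
    def filter(self, record: logging.LogRecord) -> bool:
        record.msg = EMAIL_PATTERN.sub("[REDACTED]", str(record.msg))
        return True

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("audit")
logger.addFilter(RedactPersonalInfo())

logger.info("Password reset requested for jane@example.com")
# INFO:audit:Password reset requested for [REDACTED]
```

Redaction is only one option; the point is that the log file is itself a collection of Personal Information, and so falls within the scope of the privacy principles whether or not anyone set out to ‘collect’ anything.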

The most damaging thing that technologists hear about privacy could be the cynical idea that ‘Technology outpaces the Law’. While we should not underestimate how cyberspace will affect society and its many laws born in earlier ages, in practical day-to-day terms it is the law that challenges technology, not the other way round. The claim that the law cannot keep up with technology is often a rhetorical device used to embolden developers and entrepreneurs. New technologies can make it easier to break old laws, but the legal principles in most cases still stand. If privacy is the fundamental ‘right to be let alone’, then there is nothing intrinsic to technology that supersedes that right. It turns out that technology-neutral privacy laws framed over 30 years ago are powerful against very modern trespasses, like wi-fi snooping by Google and over-zealous use of biometrics by Facebook. So technology in general might only outpace policing.

We tend to sugar-coat privacy. Advocates try to reassure harried managers that ‘privacy is good for business’, but the same sort of naïve slogan only undermined the quality movement in the 1990s. In truth, what’s good for business is peculiar to each business. It is plainly the case that some businesses thrive without paying much attention to privacy, or even by mocking it.

Let’s not shrink from the reality that privacy creates tensions with other objectives of complex information systems. Engineering is all about resolving competing requirements. If we’re serious about ‘Privacy by Design’ and ‘Privacy Engineering’, we need to acknowledge the inherent tensions, and equip designers with the tools and the understanding to optimise privacy alongside all the other complexities of modern information systems.

A better appreciation of the nature of Personal Information and of technology-neutral data privacy rules should help to demystify European privacy rulings on matters such as facial recognition and the Right to be Forgotten. The treatment of privacy can then be lifted from a defensive compliance exercise to a properly balanced discussion of what organisations are seeking to get out of the data they have at their disposal.