Privacy by design is a long-established concept in systems engineering, and its general meaning is intuitive.
Wikipedia describes it as “not about data protection” but rather “designing so data doesn’t need protection,” with the “root principle based on enabling service without data control transfer from the citizen to the system” (i.e. the citizen is not identifiable or recognizable).
In systems engineering this is demonstrated by GPS, for example, where your mobile can detect its geographical location without giving away that location or your identity.
What does privacy by design mean in the context of the GDPR?
Before we dive in, I should confess that to liven up this article, I have peppered it with crappy stock photos that represent privacy or data breaches. I hope you enjoy.
First off, it’s worth saying that privacy by design is a new part of EU regulations, contained in the GDPR. The EU Data Protection Directive does not refer to the concept, which means that until the GDPR comes into force in May 2018, data controllers simply have to take appropriate measures in order to protect personal data (not to design so it doesn’t need protection).
So what does the GDPR state? It’s worth reading paragraphs 1 and 2 of article 25, which I have reproduced here (skip them if you’re in a hurry):
- Taking into account the state of the art, the cost of implementation and the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for rights and freedoms of natural persons posed by the processing, the controller shall, both at the time of the determination of the means for processing and at the time of the processing itself, implement appropriate technical and organisational measures, such as pseudonymisation, which are designed to implement data-protection principles, such as data minimisation, in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of this Regulation and protect the rights of data subjects.
- The controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed. That obligation applies to the amount of personal data collected, the extent of their processing, the period of their storage and their accessibility. In particular, such measures shall ensure that by default personal data are not made accessible without the individual’s intervention to an indefinite number of natural persons.
In short, the GDPR requires:
- data protection by design: data controllers must put in place technical and organisational measures, such as pseudonymisation, to minimise personal data processing.
- data protection by default: data controllers must only process data that are necessary, to an extent that is necessary, and must only store data as long as necessary.
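To make the pseudonymisation requirement concrete, here is a minimal sketch of one common approach: replacing a direct identifier with a keyed hash. The secret key, field names and values are all hypothetical; the point is that the pseudonym is only reversible with extra information (the key) held separately, which is what the GDPR's definition of pseudonymisation requires.

```python
import hmac
import hashlib

# Hypothetical secret key, held separately from the dataset (e.g. in a
# key vault). Without it the pseudonyms cannot be linked back to the
# original identifiers, which distinguishes this from a plain hash.
SECRET_KEY = b"stored-in-a-separate-key-vault"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier (e.g. an email address) with a
    stable pseudonym, so records can still be linked for processing."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"email": "alice@example.com", "page_views": 42}
# The processed record carries the pseudonym, not the email address.
safe_record = {"user": pseudonymise(record["email"]),
               "page_views": record["page_views"]}
```

Because the function is deterministic, the same person maps to the same pseudonym across records, so analytics still work without exposing who the person is.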
Organisations that do not implement privacy by design may show a disparity between their privacy policies and their privacy controls in practice. A 2013 paper from Lancaster University showed this to be the case with social networks. This is perhaps the central tension of social networks where, as John Lanchester writes, “you are the product”.
In recent times, WhatsApp arguably showed such a disparity when updating its T&Cs in 2016. Users had to tap to agree when asked to share their personal data with Facebook companies, and many will not have noticed an opt-out ‘hidden’ in a concertina which referred to the sharing of their WhatsApp data to improve ‘Facebook ad targeting and products experiences’.
What are the principles of privacy by design?
The ICO gives us a nice initial summary encouraging “organisations to ensure that privacy and data protection is a key consideration in the early stages of any project, and then throughout its lifecycle. For example when:
- building new IT systems for storing or accessing personal data;
- developing legislation, policy or strategies that have privacy implications;
- embarking on a data sharing initiative; or
- using data for new purposes.”
But we can go further and investigate privacy by design in more detail. Specifically, by looking at the seven foundational principles listed by Ontario, Canada’s Information and Privacy Commissioner Ann Cavoukian in 2011, and based on Kim Cameron’s seven ‘Laws of Identity’.
These principles are not detailed in the GDPR, but they echo a lot of what the GDPR is endeavouring to encourage among data controllers.
1. Proactive not reactive; preventative not remedial
Cavoukian writes about the need for “a clear commitment, at the highest levels, to set and enforce high standards of privacy − generally higher than the standards set out by global laws and regulation.”
This is undoubtedly a key point and something that should be central to a marketer’s commitment not just to comply with the GDPR, but to present a user experience which people will understand and trust. This commitment to privacy should create a “culture of continuous improvement”, with poor design recognised and anticipated, so that corrections can be made before negative impacts are realised.
2. Privacy as default
We’ve already discussed this concept as it is referred to in the GDPR. Important points include:
- purpose specification – explaining to users how personal data is collected, processed, retained and disclosed.
- collection limitation – fair, lawful and limited to that which is necessary (also applies to data processing, retention and disclosure).
- data minimisation – non-identifiable interactions and transactions as default. Wherever possible, identifiability of personal information should be minimised.
It should be said that the GDPR takes a flexible approach to privacy as default, which according to law firm Taylor Wessing, “gives data controllers the ability to determine their level of compliance based on the privacy risks involved.” This means taking into account the context, nature and purposes of data processing.
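The “by default” idea above can be sketched in code: rather than handing a whole user profile to every process, each processing purpose sees only the fields it strictly needs. The purposes and field names here are hypothetical, but the pattern (an explicit allow-list, with unknown purposes getting nothing) is a straightforward way to make minimisation the default rather than an afterthought.

```python
from typing import Any

# Hypothetical mapping from processing purpose to the fields it may use.
ALLOWED_FIELDS = {
    "newsletter": {"email"},
    "shipping": {"name", "address"},
}

def minimise(profile: dict, purpose: str) -> dict:
    """Return only the personal data necessary for the stated purpose.
    Unknown purposes get an empty dict: nothing is shared by default."""
    allowed = ALLOWED_FIELDS.get(purpose, set())
    return {k: v for k, v in profile.items() if k in allowed}

profile = {"name": "Alice", "email": "alice@example.com",
           "address": "1 Main St", "date_of_birth": "1990-01-01"}
newsletter_view = minimise(profile, "newsletter")  # only the email survives
```

Note the design choice: fields are opted *in* per purpose, so adding a new field to the profile never silently widens what any existing process can see.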
3. Privacy embedded into design
The GDPR is flexible, but there still have to be assessments of privacy that are in some way objective and can be applied to design.
To that end, privacy impact assessments (PIAs) should be carried out. These PIAs, a framework for which has been developed by the ICO, should reduce the risks of harm to individuals through the misuse of their personal information, and can be integrated into existing project management policy.
4. Full functionality – positive-sum, not zero-sum
This principle rejects the idea that privacy must compete with other interests, such as design objectives and technical capabilities. Privacy should not impair functionality.
As Cavoukian puts it, “objectives must be clearly documented, desired functions articulated, metrics agreed upon and applied, and trade-offs rejected as often being unnecessary, in favour of finding a solution that enables multi-functionality.”
5. End-to-end security – lifecycle protection
Data controllers have responsibility for the security of personal information throughout its entire lifecycle – that includes “methods of secure destruction, appropriate encryption, and strong access control and logging methods.”
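One established technique that covers both “appropriate encryption” and “secure destruction” is crypto-shredding: encrypt each subject’s data under its own key, so destroying the data only requires destroying the key. The sketch below uses a dependency-free XOR one-time pad purely for illustration; a real system would use a vetted library (for example, AES-GCM via Python’s `cryptography` package) rather than this toy cipher.

```python
import secrets

def encrypt(data: bytes, key: bytes) -> bytes:
    """Toy one-time-pad encryption: XOR the data with a random key of
    equal length. Illustrative only; use a vetted cipher in practice."""
    assert len(key) == len(data)
    return bytes(d ^ k for d, k in zip(data, key))

decrypt = encrypt  # XOR is its own inverse

data = b"alice@example.com"
key = secrets.token_bytes(len(data))   # per-subject key, stored separately
ciphertext = encrypt(data, key)
assert decrypt(ciphertext, key) == data

# "Secure destruction": discard the key. The ciphertext that remains on
# disks and backups is now unrecoverable, wherever copies of it live.
del key
```

The appeal of this pattern for lifecycle protection is that deleting one small key is far easier to do reliably than scrubbing every copy of the data from backups and replicas.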
6. Visibility and transparency
This, to me, is one of the more fascinating parts of privacy by design. It is central to what it is to be a progressive organisation when working with data and designing the user experience.
Data subjects should also be clear on “complaint and redress mechanisms”. This is relevant to the GDPR, which gives subjects a number of rights, including:
- a right to prevent processing for direct marketing;
- a right to object to decisions being taken by automated means;
- a right to claim compensation for damage caused by a breach of data protection law.
An interesting experiment by Vitale et al. looks at the effect of a transparent UX on human-computer interaction. The researchers reasonably hypothesise that “a less transparent method of information collection from people might impose some privacy concerns.”
To test this, the researchers designed a sign-up process for a bank account which used a facial recognition system. In the transparent treatment, the system explained to users how the machine learning system works – only storing spatial coordinates (not the full photograph of the face) – using an annotated picture of a celebrity to demonstrate. This information explained how the system reduced the risks for user privacy.
The result was that transparency of the system significantly increased the number of users giving consent to store their facial data.
What was even more interesting was that when transparency was combined with embodiment (a human-like robot), the number of users releasing additional information about their social network accounts increased, compared with the disembodied system.
7. Respect for user privacy
The last principle of privacy by design is all about consent, a big part of the GDPR. Specific consent is required for personal data processing and consent may be withdrawn. As we have detailed previously in looking at best practice UX for obtaining marketing consent, requests must be:
- unbundled from other terms and conditions;
- without pre-ticked boxes – i.e. the user must actively opt-in;
- granular – with separate consent for different types of processing;
- named – your organisation and any third parties who will be relying on consent should be named;
- reversible – tell people they have the right to withdraw and detail how to do it.
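The checklist above maps quite directly onto how a consent record could be structured. Here is a minimal sketch of such a record: granular (one record per processing purpose), named (controller and third parties stored explicitly), and reversible (withdrawal is recorded, after which the consent is no longer active). All field names and the `ConsentRecord` class itself are assumptions for illustration, not a prescribed GDPR schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class ConsentRecord:
    subject_id: str
    purpose: str                 # granular: one record per processing type
    controller: str              # named: your organisation
    third_parties: List[str] = field(default_factory=list)  # named: others relying on it
    given_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    withdrawn_at: Optional[datetime] = None

    @property
    def active(self) -> bool:
        return self.withdrawn_at is None

    def withdraw(self) -> None:
        """Reversible: record the withdrawal rather than deleting the
        record, so there is an audit trail of what was consented to."""
        self.withdrawn_at = datetime.now(timezone.utc)

consent = ConsentRecord("user-123", "email marketing", "Example Ltd",
                        third_parties=["Analytics Co"])
consent.withdraw()
# Once withdrawn, processing for this purpose must stop.
```

Keeping the withdrawn record (rather than deleting it) also helps the controller demonstrate, if challenged, exactly what consent existed and when it ended.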
This principle of respect for user privacy also dictates that personal information should be accurate and up-to-date, and that individuals should have access to it, as well as be informed of its uses and disclosures.
How is privacy by design enforced?
The GDPR says that voluntary and transparent certification will be available through an appropriate certification body. I’m not sure who that will be as the text doesn’t make for easy reading.
Though privacy by design is a nebulous concept, and at first glance may seem less important than some of the more specific parts of the GDPR, it’s clear to me that a commitment to privacy by design is what’s needed from organisations. This commitment will show that a transition is in place, and that the organisation is working towards full compliance.
Privacy by design should not come as a shock, or seem too complicated, to most companies. In essence its principles sum up a lot of what the GDPR is trying to achieve – not only protecting consumers but enabling them to forge a better relationship with companies. Ultimately, it’s a winner for both parties.
Note that this article represents the views of the author solely, and is not intended to constitute legal advice.