For 'Significantly Better Products', UX design should balance Privacy & Personalisation

October 10, 2014 · Opinion, Privacy

In this day and age, user experience design has been extended to address all aspects of a product or service as perceived by its users. How you handle personal data, and whether you can be trusted with it, is definitely part of that perception. Trust has always been essential for a positive user experience. In 2004, when Peter Morville first published the User Experience Honeycomb model, being Credible was about the information you presented to your users. In 2014, smartphones and ubiquitous computing are ever-present, and user experience has been redefined to encompass all aspects of the end-user's interaction with the company, its services, and its products. Being credible is more important than ever, and includes taking responsibility for keeping safe the information users give to you.

Plenty of companies still fail to think this way. “In fact, most people just think of UX as closely related to UI affordances and the simplicity and delight of a product, but it’s so much bigger. It’s time to change this thinking,” Satyender Mahajan writes on Wired. “Companies can deliver both a spectacular user experience and extraordinary value by safeguarding their users’ privacy and security. In order to achieve this, they must start with the idea that privacy is a fundamental component of the experience.”

As UX designers, we do not have to become infosec experts, or know the world's collection of privacy regulations by heart. Yet with personal data privacy being a concern for around 75% of consumers, and with a stricter European law coming up, we owe it to both our users and the people who hire us to actively think about privacy – and how to implement it in the flows and designs we create – from start to finish.

Luckily, caring about Privacy and caring about UX aren't that different. There are a few basic concepts that apply to both:


1. Understand the end goal – the ‘why?’


Work with product management, marketing, IT, research or any other department needing to collect personal information, to discover what their goals are. The more specific the better. Try to get beyond high-level goals such as "increase sales" or "we want to understand our customer better", to specifics such as:

  • We want to send out a 50%-off promotion on people’s birthdays
  • We need to track IP for legal identification reasons

In UX design, we would definitely ask the user the same question – what do you want to achieve? And with the Privacy Impact Assessment (PIA), something we have been doing for ages in the digital field – asking consumers about their expectations and opinions on the matter – has come into vogue for privacy too.


2. There are rules and guidelines. These change.


Contrary to what some might think, User Experience is not a 'gut feeling' science. It is years of looking at best-and-worst-practice examples, trying out different approaches to user research, keeping up to date with the latest and best patterns for information and navigation design, and, before starting a mobile project, refreshing your knowledge of the iOS Human Interface Guidelines, the Android Material Design Guidelines, or whatever is the latest. All whilst keeping an eye on which trends are emerging and what will come next.

Privacy is quite the same.

Mind you, just like UX, this is not an exact science. Rules can be interpreted differently, and although they don't bend, they might give way a little when you take the right approach.

There is the law and the upcoming law

For Belgium, you need a working knowledge of the Privacy Act and the Royal Decrees, and have to know if and how the telecommunication law applies to the data to be processed. Regardless of whether you are a European or American company, if you have European customers, you had best think ahead and plan for the European General Data Protection Regulation to be adopted.

Know the tools

Just as you need to grow accustomed to using a certain tool – be it OmniGraffle, Illustrator or even Fireworks – for mapping flows and creating wireframes, you need to familiarise yourself with the tools of the trade: data processing notices, risk assessments, the privacy impact assessment (PIA) framework, information classification, and so on.

Whilst doing all this, you still need to keep an eye on the latest innovations and state-of-the-art techniques for keeping data secure.

That's why, just as with getting the entire 'UX scope' right, …


3. You can’t do it alone


“The first requirement for an exemplary user experience is to meet the exact needs of the customer, without fuss or bother. Next comes simplicity and elegance that produce products that are a joy to own, a joy to use. True user experience goes far beyond giving customers what they say they want, or providing checklist features. In order to achieve high-quality user experience in a company’s offerings there must be a seamless merging of the services of multiple disciplines, including engineering, marketing, graphical and industrial design, and interface design.” – (Nielsen Norman Group)

.. and it is OK to ask for assistance and cooperation! UX has you working on everything from objectives and needs to detailed UI, interaction and graphic design, coordinated with the product story and in close collaboration with research, development and marketing. Privacy will have you working across product design, legal, security, HR and tech.

Plenty of people involved

You'll need the product owners to involve you from day one (which data will be gathered, which data is essential). Speak to legal about contractually enforcing the standards that need to be upheld by contractors and consultants, and about how you formulate the ToS. Human resources will be your go-to department for confidentiality agreements, device and communication policies, and raising awareness of the importance of information privacy. Network and data security are to be discussed with, and explained by, IT. You will need a clear enough mandate to actually get things done, which means C-level access and confirmation.

Prioritising privacy?

Which means there is a continuous need to explain its significance in the overall product or company picture – and the consequences of not doing it right. You end up with everybody accepting it as somewhat to highly important, but nobody treating it as their first priority. Delivering on time, on budget, the marketing plan, .. those are.

Have privacy as part of the project plan – and the estimates – as soon as possible, so it is a deliverable and doesn't need to be added after the fact, which would lead to both delays and soaring costs.

Hence, stand your ground and..


4. Less is More


Think about privacy-by-design from the start, and suggest workable privacy-respecting alternatives where possible. We are already well-practised at suggesting alternative solutions: design, interaction or navigation alternatives that give the user less hassle, enabling them to get to where they want (or need) to be with as little friction, and as much enjoyment, as possible. Privacy concerns – which are most likely to come up during install or registration – are friction too. Can we prevent them?

Enter proportionality, which is part of your rules and guidelines set. Proportionality (see the Wikipedia explanation) brings us back to 1. You need to know the why. Kindly interrogate all stakeholders about what they really want to achieve – the purposes for which personal data is collected and processed, and what data they think they will need for that. Compare collection to purpose. Is the personal information collected irrelevant to, or excessive for, the purpose? Then see if you can achieve the same end goals with fewer or less detailed data.

Aim for fewer data, better experience

A very basic example of fewer data, and thus lower identifiability: for emailing me a discount voucher on my birthday, you do not need to know my year of birth, or gender. Day and month of birth will do. (More on date of birth and identifiability at the Data Privacy Lab.)
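
As a minimal sketch – the field names are hypothetical, not from any particular product – the signup model only needs to keep the day and the month:

```typescript
// Hypothetical signup model for the birthday-voucher scenario:
// keep only the day and month of birth, nothing more.
interface VoucherSignup {
  email: string;      // needed to deliver the voucher
  birthDay: number;   // 1–31
  birthMonth: number; // 1–12
  // No birth year, no gender: neither is needed for the promotion,
  // and both increase the identifiability of the record.
}

function isBirthdayToday(user: VoucherSignup, today: Date = new Date()): boolean {
  return user.birthDay === today.getDate() &&
         user.birthMonth === today.getMonth() + 1; // getMonth() is 0-based
}
```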

Now imagine you want to add either a free scarf (for the ladies) or a free tie (for the men) to everybody's order. You can either ask me about my gender and add either the scarf or the tie to my cart, or you can just let me choose between the gifts myself.

Letting me choose the gift requires an extra step, so it is not the shortest path. There are two reasons, though, why it is a better option than asking for gender and auto-deciding my free gift: 1. this approach does not require you to ask me about my sex, and 2. I get the option to choose the tie and gift it to my boyfriend.

However, deducing gender from first name would not be an alternative, as then you are processing gender information nonetheless, and – if you label a person wrongly – even processing inaccurate information.
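
In data-model terms, the difference is simply which field you store – a hypothetical contrast:

```typescript
// Hypothetical checkout models contrasting the two approaches.

// Option A: infer the gift from gender – you now collect and protect gender data.
interface CheckoutAskingGender {
  gender: 'female' | 'male';
}

// Option B: ask for the gift directly – no gender data at all,
// and the customer is free to pick a tie as a present for someone else.
interface CheckoutAskingGift {
  freeGift: 'scarf' | 'tie';
}
```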

The issue at hand isn't only the personal information you gather directly from users, but all the personal data you are collecting, deducing from behaviour and using at any point – from A/S/L, to IP addresses, to your users' profile pictures.

Beware location

Notable here is that the General Data Protection Regulation, as it is currently presented, treats "location data" as special data, often mentioned on the same level as "data on children or employees".

Collecting and processing location data will require extra safety measures, privacy impact and risk assessments. So think ahead, and save yourself a lot of hassle by avoiding location data where possible, or at least opting for 'coarse' rather than precise location.
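
For a web product, one way to honour 'coarse rather than precise' is to skip high accuracy in the standard Geolocation API and round coordinates before storing anything. A sketch, with an arbitrary rounding choice and a hypothetical showLocalContent function:

```typescript
// Sketch: ask for coarse location and degrade precision before using it.
// Two decimal places is roughly kilometre-level precision – an illustrative choice.
function showLocalContent(loc: { lat: number; lon: number }): void {
  console.log('Showing content near', loc); // placeholder for the real feature
}

navigator.geolocation.getCurrentPosition(
  (position) => {
    const coarse = {
      lat: Math.round(position.coords.latitude * 100) / 100,
      lon: Math.round(position.coords.longitude * 100) / 100,
    };
    showLocalContent(coarse);
  },
  (error) => console.warn('Location unavailable', error.message),
  { enableHighAccuracy: false } // do not request GPS-grade precision
);
```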

Fewer is often better, and taking shortcuts is dangerous – they will bite you in the ass.


5. The devil is in the details


For UX, every detail matters. The message, the story, the graphics: they are part of a larger whole. One tiny dissonant note can diminish the entire experience. You have the most beautiful design and the most optimal flow, but a change left the login broken for a few hours? Some people will never return. It is hard when you mess up UX and need to come back from a heap of bad app ratings, are flooded with customer complaints, or have to convince people to give you a second chance after you have disappointed them. But it is even worse when you get privacy wrong.

Depending on legislation, the financial damage when there is a data breach, or when bad practices come to light, can amount to huge sums in compensation and fines. And that is not even taking into account the financial and reputational cost of the taint on your brand image or product.

And just as with UX, a minor oversight or a small, well-intended action might have big consequences. From the start, you have to go into great detail so decisions can be made early and potential problems can be identified. When a project is ongoing, you need to constantly review operations and processes, and have to be aware of the things that are not optimal and could go wrong – enter risk assessments and "3. You can't do it alone".

That, and you need a good strategy for when things do go wrong…


6. Communicate honestly


Crisis Communication

Back to our out-of-service login example. I'm assuming you have a system in place to detect that, all of a sudden, there are no logins anymore, and that will notify the person who needs to coordinate repairs. You will also want to communicate to your customers. For example:

We are currently experiencing a problem with customer login. The entire online team and everybody else available is working on fixing this. There is also a dedicated team working on activating guest checkout (no login needed). We estimate to be able to take orders again in less than two hours (by 3pm UTC). Making sure you have the gifts for the people you love is important to us. As an exception, we will offer free next-day delivery on all orders placed before 9pm, guaranteeing you receive all your gifts in time for Christmas Eve.

If you want, we can notify you by email the moment you can order again:

This downtime message acknowledges the issue, communicates what is being done about the problem, and says when it will be remedied. It also clearly recognises how the customer is affected, and shows the effort to minimise – if not compensate for – the negative effect.

Of course, a short outage is an enjoyable event compared to a data breach, but the rules of communication remain the same:

  • What happened? (Tell what you know at that time.)
  • What is being done *now*? (investigating, taking the system offline, ..)
  • How does this affect your customers?
  • What are you doing to minimise this risk? What can they do?
  • How can people get more information or updates?
  • What are you doing to prevent this from happening again? (Possibly in follow-up communications.)

Visa has must-read (read it ahead of time!) advice in their Responding to a Data Breach – Communications Guidelines for Merchants document.

Of course, your data breach communication obligations depend on local legislation, but just as we should have materials and procedures in place for when something goes technically wrong, you need to be prepared to explain when your customers' privacy might be in peril. Knowing ahead of time how, what and to whom to communicate avoids mismanaging the incident.

To get off on the right foot, start this transparent and honest communication from the moment you first meet..

Getting it right from the start

A second thing you need to communicate honestly – transparently – about is that you are gathering data in the first place. What is now best practice will be required once the new European privacy regulation comes into force: you need informed, explicit consent.

‘The data subject’s consent’ means any freely given specific, informed and explicit indication of his or her wishes by which the data subject, either by a statement or by a clear affirmative action, signifies agreement to personal data relating to them being processed;

(Principles relating to personal data processing, Art 5 – General Data Protection Regulation Draft)

Specific and informed means that you, using plain language, stipulate which data you will be gathering, and what you will be using this data for, in a clear and concise manner. Clear affirmative action could include ticking a box, or another action which, in this context, clearly indicates the user's acceptance of the proposed processing of their personal data. Silence, mere use of a service or inactivity does not constitute consent.
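
A minimal sketch of recording such an affirmative action – the record shape and field names are assumptions, not a prescribed format:

```typescript
// Sketch: only record consent after an explicit, unchecked-by-default action.
// ConsentRecord and its fields are illustrative assumptions.
interface ConsentRecord {
  userId: string;
  purpose: string; // the plain-language purpose that was shown to the user
  givenAt: Date;
}

function recordConsent(userId: string, purpose: string, boxTicked: boolean): ConsentRecord | null {
  // Silence, inactivity or a pre-ticked box does not count as consent.
  if (!boxTicked) {
    return null;
  }
  return { userId, purpose, givenAt: new Date() };
}
```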

Doing this by the book will be much easier on iOS (where you first install an application, and from within that application then ask for e.g. location access permissions) than on Android, where you have to request permissions – access to personal information such as location data and text messages – before an application even gets installed.

(Google urgently needs to improve upon the Android model for app permissions, and move from the "take it or leave it" approach to a more granular requesting of phone permissions. It is ridiculous that you have to grant a news app permission to access your location before you can even install it, just in case you'd like to check the local weather. For this, I have my hopes set on Android L.)

Standardisation of Privacy Communications Design?

In an effort to standardise the communication of privacy information essential to the user, the General Data Protection Regulation proposal includes the idea of presenting the essentials in an "aligned tabular format, using text and symbols." Not quite what is usually meant by 'privacy-by-design'.


I’m no fan of these symbols, and believe it is not the European Parliament’s job to define grid layouts or Pantone colours. Regarding the quality of these icons, I urge you (anybody!) to read John Sören Pettersson’s ‘A brief evaluation of icons suggested for use in standardised information’.

What is well considered, though, is the stipulation that these should also be machine readable, making 'automated audits' possible.
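
The proposal does not prescribe a data format, but a machine-readable notice could be as simple as a structured document sitting next to the human-readable text. A purely hypothetical sketch; none of these field names come from the regulation:

```typescript
// Hypothetical structure for a machine-readable privacy notice.
// The draft regulation does not define a concrete schema; this is illustrative only.
interface MachineReadableNotice {
  controller: string;            // who is responsible for the data
  purposes: string[];            // why the data is processed
  dataCategories: string[];      // e.g. "email address", "day and month of birth"
  retentionDays: number | null;  // null = kept until account deletion
  sharedWithThirdParties: boolean;
  contact: string;               // where to ask questions or complain
}

const exampleNotice: MachineReadableNotice = {
  controller: 'Example Shop',
  purposes: ['order fulfilment', 'birthday promotion'],
  dataCategories: ['email address', 'day and month of birth'],
  retentionDays: null,
  sharedWithThirdParties: false,
  contact: 'privacy@example.com',
};
```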


Practically: Privacy & UX, start here..

As a User Experience Designer, you might not be able to influence technical infosec measures, or employee confidentiality agreements. However, there are some practical easy-to-do ways to work on the ‘privacy’ aspect of credibility:

  1. Feature vs personal data check
    Does your app or website request permissions or personal information – directly from the user, from the smartphone sensors, or from a third party (e.g. Facebook) – that is actually superfluous, or not really necessary? Is there an alternative that gets the same or a better result using fewer data?
  2. Explicit & informed consent
    In sign-up flows, add a data page. In plain words and images, explain what types of data you will be collecting, and what you will be doing with this information.
  3. Be transparent about interaction and content decisions based on data and behaviour (aka personalisation)
    Where feasible, consider adding 'because..'. You are seeing this, because it is a news item near your current location. We took you directly to this page, because you always go for the sports news. Etc. (See the sketch after this list.)
  4. Show people the picture their data paints
    In their profile or account pages, make these decision factors accessible and correctable. “We believe you are interested in ‘Design (x), Analog photography (x), Augmented Reality (x), History (x), Linguistics (x) and Cupcakes (x).’ Or, “Based on your viewing behaviour we prioritise ‘Cat videos (x), TEDx talks (x) and Cooking tips (x).'”
  5. Allow people to leave. With their data.
    Please do not put a "Do you want to delete your account?" button on every screen, but do make the option usable and discoverable for people who feel the need to remove their account or data. If people request deletion of their account, they mean deletion of any identifiable data as well – unless they explicitly allow you to keep it. If you fear losing users by making this too easy, have you ever considered offering them a 'reset' option, where they can erase all data and profiling but keep their account?
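
For points 3 and 4, the underlying idea is to keep the reason next to the recommendation and the inferred profile editable. A hypothetical sketch:

```typescript
// Hypothetical sketch for points 3 and 4: attach a plain-language reason to every
// personalised item, and keep the inferred interests visible and removable.
interface PersonalisedItem {
  title: string;
  because: string; // shown to the user, e.g. "because you always go for the sports news"
}

interface InferredInterest {
  label: string;      // e.g. "Analog photography"
  removable: boolean; // the user can take it off their profile at any time
}

function explainRecommendation(title: string, reason: string): PersonalisedItem {
  return { title, because: `because ${reason}` };
}

// Usage: explainRecommendation('Local news digest', 'it is a news item near your current location')
```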


Balancing Privacy and Personalisation to create ‘Significantly Better Products’

At SXSW, Snowden rightly challenged startups to combine exceptional user experience with privacy at the inception of a product, not afterwards. UX designers should take responsibility for making sure privacy is not a legal footnote forfeited in the ToS.

I wholeheartedly agree with Cole Peters' Privacy versus user experience is a false dichotomy, where he rejects Dustin Curtis' proposition that "the truth is that collecting information about people allows you to make significantly better products, and the more information you collect, the better products you can build."

With respecting privacy now an important part of being Credible, and consumers and politicians – finally – becoming more critical concerning how their data is being handled, a ‘significantly better product’ should find the right balance between privacy and personalisation.


Written after an intense week of all things Privacy – from Belgian law and the upcoming European Data Protection Regulation, to encryption, triple-wrap and SSL certificates, to the ISO 27***-standards and risk management – I ended up with well-filled pages of notes and a heap of ready-to-apply knowledge. It was great to be able to fully focus on this one important topic for a week, and to figure out how Privacy might or might not fit into the everyday UX job. My conclusion? Privacy and User Experience Design are very well matched!

