Designing for Privacy

Envisioning New Horizons

A critical look at UX design practice

A column by Silvia Podesta
October 7, 2024

Data privacy has transformed from a niche concern to a global imperative. Recent trends reveal a significant shift in consumer attitudes toward data protection, particularly in the wake of the global COVID-19 pandemic.

The Cisco 2022 Consumer Privacy Survey [1] paints a compelling picture: an overwhelming 81% of respondents expressed apprehension about corporate data practices. Moreover, 64% of respondents reported avoiding businesses because of data-security concerns, while a striking 76% stated that they would stop purchasing products and services from companies that mishandle their personal information.

These statistics underscore a growing trend in which users prioritize privacy in their digital interactions. In response to this evolving landscape, experts have introduced the concept of the privacy experience (PX), which advocates for the seamless integration of data privacy and UX design. This approach places users’ needs at the forefront, empowering them with granular control over their personal data throughout their digital journeys. [2]


Why Care About Designing for Privacy?

The need for UX design that prioritizes privacy goes beyond mere customer retention; companies must now navigate an increasingly stringent regulatory landscape that champions consumer privacy. Well-crafted privacy experiences can, therefore, shield organizations from reputational harm, legal challenges, and financial repercussions that are associated with privacy violations.

In the European Union, the General Data Protection Regulation (GDPR) mandates that applications collecting users’ personal data must clearly inform them of the intended purpose of collecting that data. Moreover, users must provide explicit consent for data collection, and the company collecting the data must document their consent. These requirements have necessitated a redesign of application user interfaces to comply with the law. Similarly robust legal frameworks prioritizing individual rights and data protection are also in place in countries such as Switzerland and Canada.
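
To make this requirement concrete, consider what documenting consent might involve. The following TypeScript sketch shows one possible shape for a consent record; every name and field here is an illustrative assumption, not part of the GDPR’s text or any particular framework.

  // A minimal sketch of a record documenting when, how, and for what
  // purpose a user granted consent. All names are illustrative.
  interface ConsentRecord {
    userId: string;
    purpose: string;       // the specific purpose disclosed to the user
    policyVersion: string; // which version of the privacy policy the user saw
    grantedAt: Date;       // when the user gave explicit consent
    withdrawnAt?: Date;    // set if the user later withdraws consent
    mechanism: "checkbox" | "button" | "settings-page"; // how consent was captured
  }

  // Create a record only when the user actively opts in; because an
  // unticked-by-default control is assumed, no record means no consent.
  function recordConsent(userId: string, purpose: string, policyVersion: string): ConsentRecord {
    return {
      userId,
      purpose,
      policyVersion,
      grantedAt: new Date(),
      mechanism: "checkbox",
    };
  }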

In contrast, the United States’ regulatory environment remains fragmented. However, many states have enacted their own comprehensive privacy laws, and significant regulations are on the horizon. States such as Florida, Oregon, and Texas have implemented new privacy laws in 2024. Plus, there are ongoing discussions about passing a federal act similar to GDPR—the American Data Privacy and Protection Act.

As the privacy landscape rapidly evolves and users become increasingly aware of their rights, it is only a matter of time before businesses in the US need to take substantial steps to ensure privacy compliance.

Building Blocks of the Privacy Experience

Effective privacy-focused design requires a holistic view of the digital experience. Designing for privacy encompasses understanding information and data flows, assessing the risks that are associated with personal data collection and handling, and identifying users’ friction points. The goal is to create an experience that respects users’ data rights while enabling them to accomplish their tasks seamlessly.

Research indicates that the user experience significantly influences three key areas of security and privacy:

  1. Perception of risk
  2. Experience of harm
  3. Mitigation practices

In essence, the design of a digital experience provides users with implicit cues about how an application handles—or mishandles—their data. While poorly designed experiences could lead to personal data breaches, thoughtfully crafted user experiences can proactively mitigate these risks and empower users to maintain control over their data and privacy choices.

To address data-privacy concerns and create more rewarding user experiences, UX designers should build upon three fundamental principles:

  1. Transparency—UX design should aim to reduce users’ perception of risk through clear, accessible communications and by avoiding deceptive dark patterns.
  2. Integrity—This principle encompasses both the elimination of harmful dark patterns and the incorporation of practices that guide users toward making informed decisions about their data.
  3. Control—The user experience should provide simple, easy-to-understand methods for users to modify their privacy preferences and request corrective actions from companies that handle their data in ways for which they haven’t provided their consent.

These three pillars form the foundation for privacy-centric design. I’ll explore them in greater detail in the following sections.

Designing for Transparency

Transparency extends beyond merely informing users about the data-collection practices that the GDPR mandates. It requires an organization to convey its strong commitment to privacy through effective communications and thoughtful user-interface design. Let’s consider a couple of examples.

Cookie consent forms, while ubiquitous, often fall short of true transparency. They typically direct users to dense policy documents, contradicting the principles of seamless UX design and easy information access. The challenge lies in conveying crucial privacy information without turning a Web-site visit into a cumbersome contractual experience. While consent forms are necessary, the quality of their content is paramount. Avoid verbose, jargon-heavy text. Instead, focus on presenting the most relevant information concisely, as follows; the sketch after this list shows one way of modeling such a notice:

  • what personal data the site collects
  • which third parties receive the data
  • practical implications and potential risks for the user
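
Modeling this content explicitly helps keep a consent notice focused. Here is a brief TypeScript sketch of such a model; the field names and sample values are hypothetical rather than drawn from any real site.

  // Illustrative content model for a concise consent notice, covering
  // the three points above.
  interface ConsentNotice {
    dataCollected: string[]; // what personal data the site collects
    thirdParties: string[];  // which third parties receive the data
    implications: string;    // plain-language risks and consequences
  }

  // Example: what a small ecommerce site might render in place of a
  // link to a dense policy document.
  const notice: ConsentNotice = {
    dataCollected: ["email address", "browsing history on this site"],
    thirdParties: ["analytics provider", "advertising network"],
    implications:
      "If you consent, these partners can link your browsing on this " +
      "site to your profile and use it to target ads elsewhere.",
  };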

Figure 1 illustrates a common issue regarding consent: vagueness about data usage and its implications. While this etailer explains the purpose of gathering data—that is, personalized shopping—the Privacy Policy page contains vague statements about the implications of users’ choices regarding their data. Plus, it lacks specificity about both the data the site collects and the consequences of users’ refusing to consent.

Figure 1—Vague information on an etailer’s Privacy Policy page

Figure 2—A Privacy Policy page

Conversely, Figure 2 showcases a more transparent approach. The site offers a breakdown of third-party cookies, complete with ToolTips that explain their intended purpose and individual opt-out toggles. This informative user interface fosters trust by providing knowledge in an accessible, easy-to-read format.

Figure 3—A breakdown of the types of cookies a Web site uses
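
In code, granular controls such as these can amount to little more than a list of per-party settings. The following TypeScript sketch is illustrative; the party names and ToolTip text are invented for the example.

  // Each third party gets a plain-language explanation, shown as a
  // ToolTip, plus its own opt-out toggle.
  interface CookieControl {
    party: string;   // the third party setting the cookie
    purpose: string; // ToolTip text explaining why the cookie exists
    enabled: boolean;
  }

  const controls: CookieControl[] = [
    { party: "Analytics Co.", purpose: "Measures which pages you visit so we can improve the site.", enabled: false },
    { party: "Ad Network", purpose: "Builds an interest profile to show you targeted ads.", enabled: false },
  ];

  // Toggling one party never affects the others: granular control
  // rather than all-or-nothing consent.
  function setConsent(all: CookieControl[], party: string, enabled: boolean): CookieControl[] {
    return all.map((c) => (c.party === party ? { ...c, enabled } : c));
  }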

Be mindful of the crucial role user-interface design plays in shaping users’ trust in a company’s privacy practices. Poorly designed privacy settings could inadvertently or even intentionally encourage users to share more data than they intend, often by burying options in complex menus or making them difficult to understand. Users’ discovery of such design choices can lead to a significant erosion of trust and increased suspicion in future interactions.

The privacy settings on LinkedIn provide a notable example: Users were surprised to find that images of posts they had liked on the platform appeared in Google search results for their names. The remedy is counterintuitive: users must disable the Visibility off LinkedIn setting. However, this setting also affects the visibility of users’ own published content. Users might have initially enabled it to keep their own content visible on the Web, unaware that it would also publicize their likes. This scenario illustrates a subtle form of privacy infringement.

The placement and timing of privacy-related user-interface elements such as consent forms are equally important in maintaining users’ trust. For example, some Web sites display cookie consent forms on their Privacy Policy pages, as shown in Figure 1. This practice can be counterproductive because the users visiting these pages are likely seeking more information after initially dismissing the consent form. The unexpected reappearance of the consent form might lead users to hastily click Accept all out of frustration or confusion.
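
A simple safeguard is to suppress the consent banner on the very pages whose purpose is to explain it. Here is a minimal TypeScript sketch, assuming hypothetical route names.

  // Never interrupt users who are reading about their privacy options.
  const BANNER_FREE_ROUTES = ["/privacy-policy", "/cookie-settings"];

  function shouldShowConsentBanner(pathname: string, alreadyAnswered: boolean): boolean {
    if (alreadyAnswered) return false; // respect the user's earlier choice
    return !BANNER_FREE_ROUTES.some((route) => pathname.startsWith(route));
  }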

These examples highlight how seemingly minor design decisions can have significant implications for user privacy and trust. Whether intentional or not, such practices are common online and can cause considerable distress to users, ultimately damaging their trust in organizations. To build and maintain users’ trust, companies must prioritize transparent, user-friendly design in their privacy user interfaces.

Offering Integrity: Preventing Harm and Promoting Informed Choices

Essentially, good privacy-experience design aims to prevent harm to users from privacy violations and breaches. Clear communications foster transparency and trust, as I discussed earlier, but they are equally crucial in guiding users toward making informed decisions about their data. Overly complex consent forms often lead users to hastily click Accept all without fully understanding the implications for their personal data.

To ensure integrity in privacy-experience design, UX designers must avoid certain design practices, while actively implementing others. Bad practices to avoid include the following; the sketch after this list shows a privacy-respecting alternative to manipulative defaults:

  • hidden data collection—These undisclosed mechanisms track users’ activity or harvest their personal information without their explicit consent.
  • manipulative default settings—Preselecting checkboxes in consent forms elicits more data sharing than users actually intend.
  • social-proof techniques—Claims that a majority of users have already consented to data sharing can create pressure on users to conform or give them a false sense of safety.
  • arduous opt-out mechanisms—Complex processes for withdrawing their consent deter users from opting out and erode their trust.
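
A privacy-respecting alternative to manipulative defaults starts every optional data-sharing category switched off, so any sharing reflects an active choice. A minimal TypeScript sketch, with invented category names:

  // Every optional category starts off; sharing happens only when the
  // user actively opts in.
  type SharingCategory = "personalization" | "analytics" | "marketing";

  function defaultPreferences(): Record<SharingCategory, boolean> {
    return {
      personalization: false, // never preselected
      analytics: false,
      marketing: false,
    };
  }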

The GDPR framework provides an excellent foundation by establishing user rights that user experiences should incorporate. Good practices to implement support the following rights for users; the sketch after this list shows how a backend might represent such requests:

  • right to restrict data processing—Let users limit their data usage, as the consent form in Figure 2 exemplifies.
  • right to data portability—Provide data-portability options that let users transfer their personal data between service providers.
  • right to object—Provide objection mechanisms that let users object to the processing of their personal data for specific reasons. This right is particularly relevant in direct-marketing contexts.
  • right to govern automated decision-making—Ensure that decisions that are based solely on automated processing cannot significantly impact users without recourse to human review.
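
One way a backend might represent these rights is as a small set of typed, trackable requests. The following TypeScript sketch is a hypothetical model, not a reference to any actual API.

  // Request types mirroring the four rights above; names are illustrative.
  type RightsRequest =
    | { kind: "restrict-processing"; userId: string; purposes: string[] }
    | { kind: "data-portability"; userId: string; format: "json" | "csv" }
    | { kind: "object"; userId: string; reason: string }
    | { kind: "human-review"; userId: string; decisionId: string };

  // Every request receives a trackable ID, so users can follow up on
  // its status.
  function acknowledge(request: RightsRequest): { requestId: string; receivedAt: Date } {
    return {
      requestId: `${request.kind}-${request.userId}-${Date.now()}`,
      receivedAt: new Date(),
    };
  }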

By adhering to these design practices, UX designers can create user experiences that not only comply with privacy regulations but also empower users to make informed choices about their personal data.

Ensuring User Control for Eventual Mitigation

Again, the GDPR’s list of individual rights serves as a valuable framework for empowering users to take control of their data within digital experiences. The following features support these rights; the sketch after this list illustrates a confirm-then-apply flow:

  • privacy-setting shortcuts—Often, organizations bury privacy controls within lengthy, complex menus, making them difficult to find. Providing easily accessible privacy-setting shortcuts enhances user engagement and trust.
  • one-click data deletion and the rights of access and erasure—Always offer a straightforward method for users to request access to any personal data that an organization holds, as well as the deletion of that data.
  • data-correction options and the right to rectification—Allow users to request corrections to inaccurate personal data or let them edit their data directly.
  • confirmation of changes—Implement a system that lets users confirm any changes to their data, fostering transparency and reinforcing users’ trust.
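
A confirm-then-apply flow captures the last two items: changes are staged, applied only after explicit confirmation, and acknowledged to the user. Here is a hedged TypeScript sketch; all of the names are hypothetical.

  // A staged change to a user's data that takes effect only after the
  // user confirms it.
  interface PendingChange {
    id: string;
    userId: string;
    field: string;    // for example, "email"
    newValue: string;
    confirmed: boolean;
  }

  function requestChange(userId: string, field: string, newValue: string): PendingChange {
    // Stage the change; nothing is written to the user's profile yet.
    return { id: `chg-${Date.now()}`, userId, field, newValue, confirmed: false };
  }

  function confirmChange(change: PendingChange): PendingChange {
    // Apply only after the user explicitly confirms; this is also the
    // point at which to send a confirmation message.
    return { ...change, confirmed: true };
  }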

It is crucial for a Web site to clearly explain the implications of such user interactions so users can understand any potential effects on their data and privacy.

Should Users Pay for Privacy?

Now, let’s consider the controversial practice of Web sites’ requiring users to pay for content unless they consent to data collection. This raises significant ethical questions regarding the balance between monetization and user autonomy. While this model might appear to offer a straightforward trade-off—access to content in exchange for personal data—it can create a sense of coercion among users, ultimately undermining trust and transparency.

From a UX design perspective, such practices often employ dark patterns that blur the line between persuasion and manipulation. Users might feel pressured to consent without fully understanding the implications for their privacy, leading them to make uninformed decisions about their data. In contrast, reputable media outlets typically employ a freemium model that is based on paid subscriptions.

Forcing users to choose between making a cash payment or providing their personal data raises questions about how a company determines the value of personal information and whether such pricing is fair. This not only complicates the ethical landscape but also challenges UX designers to create experiences that respect user autonomy while still achieving business objectives.

Designing for Privacy: A Cheat Sheet

In summary, UX professionals should incorporate the following guidelines into their design solutions. These guidelines encompass the three conceptual building blocks of the privacy experience (PX) that I discussed earlier: transparency, integrity, and control.

  • information and data flows:
    • Map out how user data moves through the system.
    • Identify collection points, storage, and sharing of personal information.
  • risk assessment:
    • Evaluate potential privacy risks at each stage of the user journey.
    • Consider both perceived and actual risks to the user’s data.
  • friction points:
    • Pinpoint areas where privacy concerns could impede the user’s tasks.
    • Address these friction points, balancing compliance with usability.
  • transparency mechanisms:
    • Implement clear, accessible communications about data practices.
    • Avoid dark patterns that obscure privacy information.
  • integrity safeguards:
    • Eliminate any deceptive design patterns.
    • Guide users toward making informed privacy decisions.
  • user-control features:
    • Provide easy-to-use tools for managing privacy preferences.
    • Enable easy access to users’ data rights—for example, information-deletion requests.

By focusing on these core principles, UX designers can create experiences that protect users’ privacy while maintaining functionality and trust. The goal is to empower users throughout the digital experience by ensuring transparency, integrity, and control over their personal data. 

References

[1] Cisco. “Data Transparency’s Essential Role in Building Customer Trust: Cisco 2022 Consumer Privacy Survey.” Cisco, 2022. Retrieved September 15, 2024.

[2] George Chalhoub. “The UX of Things: Exploring UX Principles to Inform Security and Privacy Design in the Smart Home.” In CHI EA ’20: Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, April 2020.

[3] Phil Mennie, Richard Chudzynski, and Assaad Khater. “Privacy UX: Designing Tomorrow’s Experience.” pwc.com, July 4, 2024. Retrieved September 15, 2024.

[4] Fernando Almeida and José Augusto Monteiro. “Exploring the Effects of GDPR on the User Experience.” Journal of Information Systems Engineering and Management, Vol. 6, No. 3, January 2021. Retrieved September 1, 2024.

[5] F. Paul Pittman, Hope Anderson, and Abdul M. Hafiz. “What to Expect in U.S. Privacy for 2024.” White & Case, December 22, 2023. Retrieved September 1, 2024.

[6] Information Commissioner’s Office. “Guide to the General Data Protection Regulation (GDPR).” ICO.org. Retrieved September 10, 2024.

[7] Lorena Sánchez Chamorro, Kerstin Bongard-Blanchy, and Vincent Koenig. “Ethical Tensions in UX Design Practice: Exploring the Fine Line Between Persuasion and Manipulation in Online Interfaces.” In Designing Interactive Systems Conference (DIS ’23), July 10–14, 2023, Pittsburgh, Pennsylvania, USA.

Innovation Designer at IBM

Copenhagen, Denmark

Silvia Podesta is a strategic designer and UX specialist at IBM who helps enterprises pursue human-centered innovation by leveraging new technologies and creating compelling user experiences. She facilitates research, synthesizes product insights, and designs minimum-viable products (MVPs) that capture the potential of these technologies in addressing both user and business needs. Silvia is a passionate, independent UX researcher who focuses on the topics of digital humanism, change management, and service design.
