Legal Design: a multidisciplinary perspective involving User Experience, legislation and psychology to develop responsible, transparent & privacy-compliant products

Experience Design Academy
May 15, 2022


The demand for transparency has risen in recent years, driven by a combination of legislative and cultural shifts. Though usually underrated, transparency is a complex and necessary aspect of the user experience, and one that needs to be deliberately designed.

“Legal Design aims to make a better legal system that people can use to protect their rights, resolve their problems, and improve their communities. It leverages human-centred design, agile technology development, and empirical research methods to create meaningful new interventions in the justice system” — The Stanford Legal Design Lab

During the 27th UX Talk, we discussed real case studies with companies, consultants and designers who had to join forces to find a shared solution:

Elisabetta Olgiati, Experience Design Manager, TeamSystem
Anna Paola Lenzi, Group Data Protection Officer, TeamSystem
Alessandro Carelli, Sr. Service and UX Designer, Tangity

Transparency, data and concerns: Design Thinking and Legal Design for the new privacy policy — Elisabetta Olgiati and Anna Paola Lenzi

TeamSystem is a company that designs and develops products and services for the public. The showcased project applied Design Thinking processes to Legal Design to make legal concepts more straightforward and less verbose. Legal Design's mission is to democratise the knowledge and comprehensibility of contracts and agreements, which are often written by lawyers for other lawyers, understood by only a few, and unintelligible to average users.

A contextual analysis of data management, privacy policies and the company-user relationship

From the Legal Design perspective, the designer's role is to make legal concepts clearer and less long-winded. In recent years, our lives have been deeply shaped by the data society: technology has affected every sector and aspect of human life. During the pandemic, the centrality of technology and the amount of data grew exorbitantly, and both are still increasing; by some estimates, the amount of data will have risen fivefold by 2025. In this unprecedented scenario, the European Union is trying to systematise privacy and data processing. The EU is committed to exploiting data and technology while keeping the user at the centre and respecting human rights and democracy, and it chose to regulate this technocracy through the GDPR (General Data Protection Regulation). The GDPR's primary aim is to enhance individuals' control and rights over their personal data and to simplify the regulatory environment for international business.

A striking case of how privacy-related issues can influence a company's reputation is WhatsApp. The company had always promised a secure messaging platform, but that promise collapsed at the beginning of 2021, when a new data-sharing policy drew the fury of its users and squandered their trust. Overnight, in January 2021, WhatsApp users received a push notification, with no prior warning, that they had to accept if they wanted to keep using the application. The new terms introduced data sharing between WhatsApp and Facebook, allowing businesses to use shared resources across the two platforms to enable e-commerce and payment transactions. Unfortunately, users misinterpreted the privacy updates, assuming that Facebook would be able to read their private messages and listen to their calls, violating their privacy. After this misunderstanding, WhatsApp lost millions of users while competitors such as Telegram gained adopters (see Rana Foroohar, 2017, on privacy as a competitive advantage).

Establishing a relationship of trust with users is fundamental to making them feel safe while using the product; maintaining that trust is even harder.

The starting point for the TeamSystem project was understanding how the company wanted to be seen and perceived by its customers on privacy and data accessibility issues.

What are the most relevant personal data for TeamSystem users? A user-first research approach

The project followed the classic Design Thinking approach, starting with exhaustive user research, carried out through interviews and surveys, to understand users' priorities among privacy policy issues and which information is most critical to clients.
The results were analysed through sentiment analysis, which showed that users perceive privacy as unclear and expressed an unmistakable need for a more transparent system when it comes to their data. At the same time, the results surfaced positive words associated with data, such as "trust", which is fundamental to a deep bond with the company. This approach proved a valid way to understand what users think about privacy, highlighting the aspects that require further improvement on the company's part.
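As a purely illustrative sketch of this kind of analysis (the tooling, the tiny English lexicon and the sample answers below are our assumptions; the talk does not disclose how the sentiment analysis was actually run), tagging open survey answers against a sentiment lexicon could look like this:

```python
# Minimal, illustrative sentiment tagging of survey answers.
# The lexicon and the responses are invented for the example.
from collections import Counter

POSITIVE = {"trust", "clear", "safe", "transparent"}
NEGATIVE = {"unclear", "confusing", "worried", "opaque"}

def tag_response(text: str) -> str:
    """Label an answer positive/negative/neutral by lexicon hits."""
    words = set(text.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

responses = [
    "I trust the company but the policy is unclear",
    "The terms feel opaque and confusing",
]
print(Counter(tag_response(r) for r in responses))
# Counter({'neutral': 1, 'negative': 1})
```

In practice a more robust, language-aware pipeline would be used, but the output is the same in spirit: a distribution of positive and negative perceptions, plus the recurring words (like "trust") behind them.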

Last but not least, a quantitative method was used to estimate the complexity of the privacy policy: the Gulpease index, a text readability indicator calibrated for the Italian language. It estimates the lexical complexity of a text from the number of letters, words and sentences it contains. The results confirmed the lack of clarity, showing that only people with a higher educational background could actually comprehend the meaning of the contract.
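For a rough idea of how such a score is computed, here is a minimal sketch (Python and the naive tokenisation are assumptions on our part; the formula itself is the published Gulpease index):

```python
# Minimal Gulpease readability check for Italian text.
import re

def gulpease(text: str) -> float:
    """Gulpease index: 89 + (300 * sentences - 10 * letters) / words."""
    words = re.findall(r"[a-zA-ZàèéìòùÀÈÉÌÒÙ']+", text)
    letters = sum(len(w) for w in words)
    sentences = max(1, len(re.findall(r"[.!?]+", text)))  # rough count
    return 89 + (300 * sentences - 10 * letters) / len(words)

print(round(gulpease("Trattiamo i tuoi dati solo per erogare il servizio."), 1))
# 75.7
```

Scores run from 0 (very hard) to 100 (very easy): texts below 80 are conventionally considered difficult for readers with only a primary education, below 60 for readers with a middle-school education, and below 40 even for readers with a high-school diploma, which is consistent with the finding above.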

A workshop within the company to define the project requirements for data management

What is the company's mission in data management and privacy policy? What are the metrics of success and failure? Does the company understand the clients' needs and the constraints, requirements and points of resistance of the situation?

The workshop aimed to build consensus on these topics and to surface each department's sensitivities around privacy policy issues, defining the company's objectives and collecting the points of view of an interdisciplinary team. The meeting resulted in six pillars shared across all the company's departments.

The six key pillars for data management, built on the company's beliefs

When it comes to privacy, being clear, concise, transparent and comprehensive is not just a marketing trend to capture naive users and collect a few more consents: it is a legal requirement under the GDPR for any company.

The legal field should be concerned with how information can be turned into something readable and intuitive. Designers, marketers and cognitive psychologists are needed to understand how humans think and how to convey legal information as clearly and intelligibly as possible.

“Contracts are not written to be understood by those who use them, but in highly technical and polished language for other lawyers. So the information becomes usable only by technicians of the field. Instead, the designer should organise and visually arrange the information so that it is communicated intuitively in the digital context. This is what Legal Design is about.” — Anna Paola Lenzi

TeamSystem’s project requirements on data management

1. The new privacy policy should be simple, accessible and transparent.
2. It needed to apply to several privacy policy agreements within the company, following the principles of scalability and replicability while respecting the brand identity.
3. The legal meaning must not be changed.

Who are we, what data do we save, who do we share it with, and what can you, the users, do about it?

Indeed, the main problems of the as-is privacy policy stemmed from complex technical language, which was overcome by using tooltips, organising the text into chapters and adding visual maps. Further legal complexity was resolved by including practical examples in the policy, giving users a more tangible sense of their actions.
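To make the idea concrete, a layered policy of this kind could be modelled roughly as in the sketch below (entirely our own illustration; the chapter, example and glossary entry are invented, and this is not TeamSystem's actual implementation):

```python
# Illustrative model of a layered privacy policy: plain-language chapters,
# practical examples, and a glossary a UI can render as tooltips.
from dataclasses import dataclass, field

@dataclass
class Chapter:
    title: str         # e.g. "Who do we share your data with?"
    body: str          # plain-language text of the clause
    example: str = ""  # practical example that makes the clause tangible

@dataclass
class PrivacyPolicy:
    chapters: list[Chapter] = field(default_factory=list)
    glossary: dict[str, str] = field(default_factory=dict)  # term -> tooltip

policy = PrivacyPolicy(
    chapters=[Chapter(
        title="Who do we share your data with?",
        body="We share billing data with our payment provider.",
        example="When you pay an invoice, your IBAN is sent to the bank.",
    )],
    glossary={"payment provider": "The external company that processes "
                                  "card and bank transfers on our behalf."},
)
```

A user interface can then render each glossary term as a tooltip wherever it appears in a chapter body, layering plain-language explanations on top of the text without altering its legal meaning (requirement 3 above).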

TeamSystem updated its data agreements in a human-centric way thanks to the adoption of Design Thinking processes and Legal Design practices, and saw concrete improvements in client satisfaction.

Could this process be repeated systematically by other companies? Are there guidelines for user-friendly privacy policies? How does users' trust impact companies?

Watch the full keynote (in Italian)

Privacy as a fundamental human need — Alessandro Carelli

There is a massive asymmetry of power between a user and a service provider: every time we interact with digital technology, we are not fully aware of what is happening to our sensitive information. A meaningful example is the smartphone: we all have one in our pocket, and yet very few of us know that it is sharing our position with third parties through the GPS signal.

“As designers of any system that collects and processes individuals' personal information, you will get a disproportionate power over users, which implies a duty of care towards them.” — Alessandro Carelli

Privacy changes and mutates over time, depending on the context we live in and the people around us.

Before considering the digital context, Altman (1975) defined privacy as a dynamic regulatory process involving adjustments of self-boundaries ("invisible" boundaries surrounding the physical space between the self and others) to achieve a desired level of privacy: a subjective statement of an ideal level of interaction with others.

When someone crosses a personal space boundary, anxiety, stress, or even conflict and aggression can result.

As an illustration, consider an ordinary social situation, like a conference, where people can decide where to sit, closer to or further from the speaker. Depending on personal preferences, attendees can also choose their distance from the rest of the people in the room. In short, in real life we have discernible and effective control over our privacy and over what we share with the people around us.

This means that individuals have the capacity, through their biological senses, to manage the desire to be closer to or more distant from others, and to choose carefully with whom and what to share in a group. In the digital context, these biological senses fail entirely: we cannot manage our digital proximity, and we have no awareness of whether something is violating our privacy, as can happen with data leaks from companies we trust.

Indeed, in the context of information technology, individuals perceive technology-mediated privacy intrusions as issues of interpersonal privacy. However, the individual's ability to rely on the same psychological mechanisms as in face-to-face relationships to regulate privacy is disrupted by the mediation of technology (Palen and Dourish, 2003; Nissenbaum, 2011).

If privacy is violated, humans experience negative feelings

As we have already established, our sense of privacy depends on the context we experience and perceive. The way we present ourselves, and what we decide to share with others, changes radically depending on the context we find ourselves in. Returning to the conference example, speakers perceive the physical audience they are presenting to; however, they do not sense the people attending remotely, since those attendees are not perceivable by human senses. In practice, the speaker perceives only the participants for whom there is tangible, bodily and emotional evidence.

When in person, if we feel that something or someone is violating our privacy, we experience a natural sense of infraction, provoking anger, frustration and anxiety. For example, at a conference, speakers know and expect that someone will take a picture of them; on the contrary, that would be unacceptable if the speaker were in a private moment. The reaction to this violation is proof that privacy is a fundamental human need. Online, however, the perception of such a violation is far less evident. We might be aware that we are sharing our position with third-party services, but we do not really know what that means. Rationally, we would probably never share that information with anyone, yet we are not really aware of what we have given our consent to. In that moment, we are exchanging our privacy for a service.

How symmetrical is the balance of power between service providers and users in this exchange of information?

Indeed, in the digital world privacy is perceived as a secondary concern because of its intangibility to human senses.

Several studies have concluded that users treat the privacy and security dimensions of digital technologies as secondary to the desire to own and use the functionality of a specific application (Balebako, Jung et al., 2013; Balebako and Cranor, 2014; Whitten and Tygar, 1999).

But is this trade-off clear and honest? Are users aware that they are actually exchanging their privacy for the service? Is there a way to understand how much people value their privacy and data compared with the product or service they are using? How could privacy violations be made more tangible to users in the digital world?

Watch the full keynote (in Italian)

These lectures were held during the 27th UX Talk, organised for the students of the Master in User Experience Psychology by Politecnico di Milano & Università Cattolica del Sacro Cuore and of the Higher Education Course in User Experience Design by POLI.design. Follow us on LinkedIn and Facebook to stay updated about upcoming UX Talks, which are always open to the public.

Curated by Alice Paracolli
