Data as a new design material: ethical implications on User Experience

Experience Design Academy
11 min read · Nov 20, 2022


User Experience is necessarily intertwined with the Data-Driven approach: data can generate an understanding of context, but it must be appropriately collected, interpreted, and ultimately used to inform and evaluate the design.

During our 29th UX Talk, we investigated the ethical implications of the pervasive adoption of data from the perspectives of both the designer and the end user, aiming to establish transparency and long-term value for both points of view.

Being a UX designer today involves an ethical dimension that becomes even more important when we work with data. Data can be generated consciously or unconsciously by users, it can be used positively or coercively, and so on. We invited Patrizia Marti — Professor @ Università di Siena — to introduce her current research on the relationship between design, ethics and data. She was followed by two speakers with real stories from the corporate world: Nicolò Volpato from Tangible presented their corporate approach to designing with data, and Stefano Basile from Sisal presented ways to help users counter potential gambling addiction thanks to AI and user-behaviour data. All of this has UX implications:

What is hiding in the data? What opportunities and risks are there in designing with data?

Designing with data: an introduction — Patrizia Marti

The most intuitive way to describe the relationship between design and data is through its theoretical framework. The book Designing with Data (King et al., 2017) analyses what it means to design with data, pointing out different approaches. The authors distinguish three broad categories: “Data-driven design,” “Data-informed design,” and “Data-aware design.”

  1. Data-driven Design means using data to make precise and localized decisions. The classic example is A/B testing: when developing an interface, a doubt may arise about colour, style, or tone of voice, and an A/B test is launched to settle it. This typically involves a lot of users, who often do not realize they are part of an experiment, expressing their preference for one version over another. In Data-driven Design, the answer is given by quantitative data (see the sketch after this list).
  2. Data-informed Design works with qualitative and quantitative data, such as data from interviews or probes. The data are interpreted and create a narrative, so it is not a purely scientific experiment but a more qualitative exercise.
  3. Data-aware Design is a multidisciplinary approach in which designers, data scientists, developers, clients, and business managers actively design systems that directly inform future strategies. A data-aware mindset means that the designer knows what data could be relevant and how it can help solve the problem. It is therefore a mindset, or a ‘philosophy about Design’, rather than a concrete approach.
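
To make the “quantitative answer” of Data-driven Design concrete, here is a minimal sketch of how the outcome of an A/B test on two interface variants could be evaluated. The counts, variant names and the 5% significance threshold are illustrative assumptions, not data from any experiment described in the talk.

```python
from scipy.stats import chi2_contingency

# Hypothetical A/B test on two button styles (illustrative counts only).
clicks_a, views_a = 420, 10_000   # variant A: 4.20% click-through
clicks_b, views_b = 505, 10_000   # variant B: 5.05% click-through

# Contingency table: clicks vs. non-clicks for each variant.
table = [
    [clicks_a, views_a - clicks_a],
    [clicks_b, views_b - clicks_b],
]

chi2, p_value, _, _ = chi2_contingency(table)
print(f"click-through A: {clicks_a / views_a:.2%}")
print(f"click-through B: {clicks_b / views_b:.2%}")
print(f"p-value: {p_value:.4f}")

# With a conventional 5% threshold, a small p-value suggests the
# difference between the two variants is unlikely to be chance alone.
if p_value < 0.05:
    print("Evidence that the variants perform differently.")
else:
    print("No strong evidence of a difference; keep testing or pick either.")
```

The point of the sketch is that the decision rule is statistical rather than interpretive: the designer asks a narrow question and the data answer it.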

In the book, King et al. (2017) use a metaphor to describe these three approaches: the train metaphor. The designer needs to make a journey and reach a destination. In Data-driven Design, the designer already knows all the trip specifics: the train’s platform, the time of departure, and the destination, with no complications and no problem-solving involved. In Data-informed Design, the designer has to decide where to go and which train to take, so there is some level of creativity based on the context and on what the designer wants. Data-aware Design, finally, is an almost philosophical approach in which the designer thinks about the transportation system itself and how to change the world. The metaphor helps make the three approaches and the differences between them more tangible.

Illustration from Rochelle King, Elizabeth F. Churchill and Caitlin Tan, “Designing with Data: Improving the User Experience with A/B Testing”, O’Reilly Media, 2017

Value and Biases

The use of data to collect information about people’s behaviours stretches the concept of “user-centred design”: users are remotely monitored, observed and profiled. In this paradigm, users are considered sources of information, and their participation in the design process is limited to the role of data generators. Designers often do not know the context in which these data were collected or the goals for which they were collected, which frequently leads to biases. That is where Design comes in, because a purely engineering, mechanical, automated approach to interpreting data can create significant problems. Some products are designed from the perspective of a white man between 35 and 40 years old, taken as the “standard” user. One example is an automatic soap dispenser that does not work when a black hand is placed underneath it (fig. 1). Something similar happened with Flickr, whose algorithm labelled a photo of the train tracks leading into Auschwitz as a place to play sports (fig. 2).

Figure 1: The Racist soap dispenser
Figure 2: The train tracks leading into Auschwitz, which were labelled “sport” by Flickr’s algorithm. Photograph: Christopher Furlong/Getty Images

Additionally, smartphones are usually sized for a male hand rather than a female one. Many smartwatches do not work properly on dark skin, tattooed skin, or obese people, because the technologies are calibrated on “standard” bodies.

Patrizia concludes by highlighting that no one currently has the solution to data use and bias formation. Still, it is essential to be aware of it, both as human beings and as designers. In addition, she leaves us with two considerations:

“First: user orientation is vital; designers have to make systems that really “resonate” with varied people. Design deals with real people, and we must always be willing to question the solution found, iterating the process. We must anchor Design on a study of behaviour that cannot be based on cold data alone. We must learn to have a socio-cultural reading of the data by empathizing with the real user.

Second: abandon the “problem solving” approach. Given the complexity of the projects we make and the lives we live, Design would actually be more useful if, instead of serving up solutions, it helped us unpack problems to create room for dialogue.” —Patrizia Marti

Watch the stream of the event to find out more (in Italian)

Participatory Design tools to avoid misuse of data in enterprises — Nicolò Volpato

Article 3 of Tangible SRL SB, company bylaws: “Introduce, promote and spread design, implementation and validation processes and tools, for digital products or services, that respect all people, based on principles of ethics, accessibility, equity and multiculturalism”.

Tangible deals with digital applications, platforms and services, so it needs to break away from the philosophical layer and get into everyday practice. Ethical Design considerations in the company must take place in a participatory way at the ideation and design stage. Eighty per cent of the decisions that determine the final impact of a service are defined at the beginning, in the concept phase dedicated to planning and shaping the idea. During project execution, decisions tend to follow the course already set, and there is less room to ask questions that could change the fate of the service.

How can Tangible contribute to Ethical Design? How could it be possible to readjust the tools we use and the mindset with which we do projects?

In the last two years, Tangible has begun to use frameworks that raise speculative questions about possible design solutions, steering services and products in a more ethical direction. Tangible is in the consulting business, so it comes into contact with many different figures within the client company. The first step is to engage all project stakeholders in participatory workshops and canvas-based design activities: structured brainstorming exercises that create a moment of discussion by surfacing sensitive issues within client companies. Examples of questions might be:

How do we create, or undermine, a sense of trust? How do we foster self-determination and inclusion? How can data and algorithms damage people’s sense of agency and self-determination?

The workshop (partially visible in the image) is what Tangible has been using for several months in client companies and at events, with the purely educational purpose of spreading knowledge and awareness of ethical operating prerequisites among companies worldwide. Sharing the process matters because such a tool only has value if it is widely adopted: its goal is to raise questions in the early stages of a project, when there is still room to ask them and to seek agreement with the wider team and the client on these issues. Addressing the sensitive topics of Ethical Design requires a mindset that establishes a conversation and active participation by everyone involved in the project. Indeed, it is not enough to address only the company’s designers: everyone who makes decisions and can change the Design needs to be involved. The tool collects the ethical risks of the project at the same stage in which information is gathered to build personas and journeys, so that the team adheres to the business goals while staying aware of, and attentive to, what has been pinned down from a business-ethics perspective.

Everyday work challenges us to be more ethical while designing: these reflections should not remain philosophical speculation but should find fertile ground in companies.

Watch the video to learn more about the talk and discover the examples (in Italian)

Responsible gaming through evidence-based design — Stefano Basile

Sisal, founded in 1946, has a long tradition of product innovation with nationally popular products such as “Totocalcio” or “SuperEnalotto”. In more recent times, Sisal has focused on evolving its digital experience and building its global presence by becoming an international group. Since Sisal has a diverse audience, it offers many different products; accordingly, the users’ data change depending on the context. Although it may seem contradictory, since Sisal is a company generating profit from gambling, its approach is strongly based on a conscious and socially sustainable vision of the market, promoting responsible gaming behaviours.

We follow an evidence-based approach: we base our design and communication choices on insights gathered from mixed sources, both qualitative and quantitative in nature. We look at data and talk to users.
Stefano Basile, Sisal for Responsible Gaming

When it comes to data, Sisal has quite an unusual relationship with it. As a State licensee, it has many responsibilities towards users and specific laws to follow, such as the “Dignity Decree”, under which the company cannot communicate through traditional advertising channels. Therefore, all personal and transactional customer data are used for internal analysis only and are not exchanged at any level with any other party. Because of this strict regulatory context, the data given to Sisal are among the most secure; still, its users are increasingly sceptical about giving consent to data processing, a trend observable across all digital markets. This new awareness of the value of data has been triggered by a series of regulatory innovations and a more intense conversation on the subject.

User Behaviour and Data Retrieval: a difficult relationship

Although Sisal has an extremely differentiated set of customers, the majority are generally wary of surrendering their data to the company. This lack of data is one of the many challenges designers face when designing ethical features and customer journeys.

A clear illustration of the distrust gamblers show in sharing data dates back to 1 January 2020. From that day, to play a slot machine at a point of sale, users have had to enter their personal tax code to verify that they are of age; the check is performed solely by the machine. Although Sisal had prepared its users for this change well in advance and informed them that there would be no data exchange, on 2 January point-of-sale slot play dropped by a very large percentage. Overnight, players showed a strong fear of sharing their personal code in response to what was, on the face of it, a simple request for data.

This is a design problem: the designer must seek and create explicit solutions that help users trust the release of their data. Without permission to use data, companies are at a standstill, both from a business point of view and in terms of innovation and technological development through data analysis.

Artificial Intelligence to foster responsibility in users

On top of its licensee obligations on transparency and customer protection, Sisal wants to raise the bar and lead the market towards a higher standard in managing problematic gaming. Years ago, Sisal initiated a “responsible gaming program”: the vision is to be an international leader in responsible gaming, promoting healthy gaming behaviours through internal analysis of user data and digital innovation. The mission is to offer the best responsible gaming experience by generating value for society and people.

Why is this not contradictory? Sisal’s business has the potential to create problems for society, and the company wants to prevent that: facing the problematic-gaming issue is pivotal for long-term company profitability. Ludopathy, as a social issue, creates many problems with the authorities and a difficult legal framework for the business. Even if gambling addicts are a tiny percentage of the company’s customer base, such a fragile source of revenue is a direct threat to business continuity, and it is a pressing company interest to identify these players immediately and help them return to healthy play.

Sisal: responsible gaming features

Sisal has created several tools based on this data:

  1. Cross-provider self-exclusion: customers who realise they are developing unhealthy behaviours can, from within the Sisal site, exclude themselves from all gaming providers, so that they can no longer play digitally.
  2. Temporary self-exclusion: the same mechanism, limited to a defined period.
  3. Self-imposed time limits: users declare how much they want to play, e.g. only one hour per day.
  4. Spending-limit suggestions: the Sisal systems suggest a limit on the money to play with, which the user can accept or decline.

Last but not least, many tools are able to identify problematic players through data analysis and artificial intelligence, using predictive algorithms that look at online user behaviour. The algorithms first analyse gaming behaviour together with call-centre records of reported critical situations, and then propose a course of action. The algorithm does not make decisions: it identifies behaviour patterns and provides suggestions for action, and the decision-makers are always human beings (see the sketch below).
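
To illustrate the “suggestions, not decisions” principle, here is a minimal sketch of a behaviour-flagging step. The feature names, weights and threshold are hypothetical assumptions for illustration; Sisal’s actual system relies on predictive algorithms trained on real behavioural and call-centre data, which this toy rule-based score does not reproduce.

```python
from dataclasses import dataclass

@dataclass
class PlayerActivity:
    player_id: str
    hours_played_last_week: float
    deposits_last_week_eur: float
    night_sessions_last_week: int    # sessions started between 1am and 6am
    reported_to_call_centre: bool    # a critical situation was reported

def risk_score(a: PlayerActivity) -> float:
    """Combine behavioural signals into a 0..1 score (toy weighting)."""
    score = 0.0
    score += min(a.hours_played_last_week / 40.0, 1.0) * 0.35
    score += min(a.deposits_last_week_eur / 1000.0, 1.0) * 0.30
    score += min(a.night_sessions_last_week / 7.0, 1.0) * 0.20
    score += 0.15 if a.reported_to_call_centre else 0.0
    return score

def suggestions_for_review(players: list[PlayerActivity],
                           threshold: float = 0.6) -> list[tuple[str, float]]:
    """Return (player_id, score) pairs to hand to a human operator.

    The system only suggests accounts for review; it never blocks anyone.
    """
    scored = [(p.player_id, risk_score(p)) for p in players]
    return [(pid, s) for pid, s in scored if s >= threshold]

if __name__ == "__main__":
    sample = [
        PlayerActivity("u001", 3, 50, 0, False),
        PlayerActivity("u002", 35, 900, 5, True),
    ]
    for pid, score in suggestions_for_review(sample):
        print(f"suggest human review of {pid} (risk score {score:.2f})")
```

The design choice worth noting is the last step: the output is a review queue for a human operator, mirroring the principle that the algorithm surfaces patterns while people make the decisions.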

Watch the stream of the event to find out more (in Italian)

These lectures were held during the 29th UX Talk organised for the students of the Master in User Experience Psychology by Politecnico di Milano & Università Cattolica del Sacro Cuore and of the Higher Education Course in User Experience Design by POLI.design. Follow us on LinkedIn and Instagram to be updated about the upcoming UX Talks, always open to the public.

Curated by Alice Paracolli

Written by Experience Design Academy

A polytechnic centre of excellence dedicated to User Experience, by POLI.design.
