Dark patterns: Manipulation in Internet services



The term dark patterns refers to user interfaces and user experience implementations intended to influence people's behaviour and decisions when interacting with websites, apps and social networks, so that they make decisions that are potentially detrimental to the protection of their personal data.

These techniques are not new: they have long been used in other areas, and on the Internet such disrespectful practices can likewise be used to manipulate people's interactions.


The AEPD's Guidelines for Data Protection by Default deal with dark patterns, specifically in sections VI and VIII. In application of the principle of fairness set out in Article 5(1)(a), data controllers must ensure that no dark patterns are used, at least in relation to decisions regarding the processing of users' personal data.

Recently, the European Data Protection Board (EDPB) adopted for public consultation its "Guidelines on Dark patterns in social media platform interfaces: How to recognise and avoid them". These guidelines, like the AEPD guidelines, take Article 5(1)(a) of the GDPR as the starting point for assessing when a design pattern in a user interface constitutes a dark pattern. The guidelines provide a number of examples, as well as best-practice recommendations for avoiding dark patterns.

Dark patterns may be presented to the user in various processing operations, such as during registration or sign-up on a social network, when logging in, as well as in other scenarios: in the configuration of privacy options, in cookie banners, during the process of exercising rights, in the content of a communication informing about a personal data breach, or even when trying to unsubscribe from the platform.

According to the EDPB Guidelines, dark patterns may be classified into the following categories:

  • Overloading: Presenting users with too many options or requests, generating a fatigue that leads them to share more personal data than intended. The most common techniques are asking the same questions repeatedly, creating privacy mazes and offering too many options.
  • Skipping: Designing the user interface or user experience in such a way that the user overlooks, or even forgets, certain data protection aspects.
  • Stirring: Appealing to users' emotions, or using visual nudges and effects, to influence their decisions.
  • Hindering: Creating obstacles so that the user cannot easily carry out certain actions, for example by placing privacy settings in hard-to-reach areas, making them very difficult to access, or providing misleading information about the effects of certain actions.
  • Inconsistency (Fickle): Giving the interface an unstable, inconsistent design that prevents the user from performing the desired actions.
  • Left in the dark: Hiding information or privacy settings, or presenting them unclearly through inconsistent language or contradictory or ambiguous information.
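As a concrete illustration of the "Hindering" category, consider a cookie banner where accepting takes one click on a prominent button while rejecting is buried behind a settings maze. The sketch below models this asymmetry; all names (`BannerConfig`, `isFairBanner`, the click counts) are hypothetical and for illustration only, not part of any real consent-management API:

```typescript
// Illustrative model of a cookie banner's consent choices.
interface ConsentButton {
  label: string;
  visibleOnFirstLayer: boolean; // shown on the banner's first layer?
  clicksToReach: number;        // clicks needed before the choice can be made
}

interface BannerConfig {
  accept: ConsentButton;
  reject: ConsentButton;
}

// A fair design offers accepting and rejecting with symmetric effort
// and visibility; any asymmetry in favour of "accept" is a warning sign.
function isFairBanner(b: BannerConfig): boolean {
  return (
    b.accept.visibleOnFirstLayer === b.reject.visibleOnFirstLayer &&
    b.accept.clicksToReach === b.reject.clicksToReach
  );
}

// Dark pattern ("Hindering"): rejecting requires a trip through settings.
const darkBanner: BannerConfig = {
  accept: { label: "Accept all", visibleOnFirstLayer: true, clicksToReach: 1 },
  reject: { label: "Manage options", visibleOnFirstLayer: false, clicksToReach: 4 },
};

// Fair alternative: both choices on the first layer, one click each.
const fairBanner: BannerConfig = {
  accept: { label: "Accept all", visibleOnFirstLayer: true, clicksToReach: 1 },
  reject: { label: "Reject all", visibleOnFirstLayer: true, clicksToReach: 1 },
};

console.log(isFairBanner(darkBanner)); // false
console.log(isFairBanner(fairBanner)); // true
```

A symmetry check of this kind is only a sketch of the idea; in practice the assessment is a legal one, made case by case against the fairness principle of Article 5(1)(a).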

Other data protection principles that play a key role in the assessment of dark patterns are transparency, data minimisation and accountability with regard to privacy by default. In some cases, the purpose limitation principle, the conditions for obtaining consent and the transparency of the information provided for the exercise of rights are also relevant. In any case, the principle of data protection by design and by default should be applied from the moment user interfaces and user experiences are conceived, prior to launch, to guarantee the fundamental rights and freedoms of individuals as well as regulatory compliance.

The EDPB Guidelines include a checklist in the Annex with examples of each category, together with the GDPR articles that are relevant and may be violated in each case. Where such violations occur, data protection authorities may sanction the use of dark patterns.

Likewise, the AEPD's Guidelines for Data Protection by Default highlight in section VIII that compliance with the principle of data protection by default must be one of the elements reviewed in any GDPR compliance audit. The guidelines set out a non-exhaustive list of control elements, one of which is checking that dark patterns are not used to manipulate the user's choice process or to covertly influence the user's decisions regarding the scope of processing.

Further information on data protection and privacy on the Internet can be found on the Agency's Innovation and Technology website, as well as on our blog.
