Dark Patterns in Social Media Interfaces | Denton

Unfortunately, it’s all too common in the online world for someone to try to manipulate you into making decisions you would not have made otherwise. One example of this is the use of “dark patterns”. They can be found in almost every online environment, especially on social media. In an attempt to address this issue, the European Data Protection Board has published new guidance on dark patterns in the context of the GDPR1 (the “Guidelines”). Below we have prepared a summary of the key points.

What exactly are dark patterns?

The Guidelines define dark patterns as user interfaces and experiences implemented on social media platforms that lead users to make unintended, unwilling and potentially harmful decisions regarding the processing of their personal data. In other words, dark patterns are user experience techniques that exploit psychology to steer users toward choices about their personal data that are unfavorable to them. Since dark patterns can amount to violations of the GDPR, they should be avoided when designing user experiences.

Types of Dark Patterns

Dark patterns can take various forms. The guidelines describe the following types:

1. Overload occurs when social media users are faced with a large number of requests or choices, or are bombarded with information. The purpose of this practice is to induce users to share more data, or to unintentionally allow personal data to be processed contrary to the data subject’s expectations. It takes the following forms:

  1. Continuous prompting – Repeatedly asking users to provide data or consent can exhaust users to the point that they give in and agree to provide access to data that is not necessary for the purpose of the processing, just so they can use the platform without interruption. Example: A social media platform asks a user for their phone number each time they log into their account. The prompt appears even though the user declined to provide a phone number during registration.
  2. Privacy Maze – This type of overload can be seen when information, such as a privacy notice, is not easily accessible and users are forced to navigate through multiple pages of the website to find it.
  3. Too many options – Users of a social media platform should be able to check their privacy settings easily. For this purpose, the platform provider should gather this information in one easily visible place. Multiple tabs dealing with data protection or data privacy choices can overwhelm the user. Faced with too many options, the user is either unable to make a choice or will overlook some settings.

2. Skipping can occur when the interface has been designed in such a way that users forget, or fail to consider, some or all data protection aspects. Two practices can be recognized as skipping:

  1. Deceptive snugness – Privacy by design should be reflected in default data privacy settings. If the social media provider defaults to the most intrusive privacy settings possible, we are dealing with the deceptive snugness dark pattern. Example: When registering an account and providing a date of birth, users are asked to indicate the group of recipients with whom this information will be shared. By default, “all platform users” is selected.
  2. Look over there – This type of skipping involves introducing competing or irrelevant elements into a document dealing with data protection. This can affect how users perceive the information and distract them into ignoring or skipping privacy options in favor of the competing item. Example: When providing information about cookies, platform operators often exploit the ambiguity of the word “cookies”, for instance by asking for consent in a humorous way with references to the edible kind.
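The privacy-by-default idea behind the deceptive snugness example above can be sketched in a few lines of code. This is only an illustration under our own assumptions (the class and field names are hypothetical, not from the Guidelines): the least intrusive audience is preselected, so widening it requires an active choice by the user.

```python
from dataclasses import dataclass
from enum import Enum

class Audience(Enum):
    ONLY_ME = "only_me"
    FRIENDS = "friends"
    ALL_USERS = "all_users"

@dataclass
class ProfileFieldSettings:
    # Privacy by default (GDPR Art. 25): the most restrictive
    # audience is preselected; sharing more widely requires an
    # explicit, active choice by the user.
    birthday_audience: Audience = Audience.ONLY_ME

# A freshly created account starts with the restrictive default:
settings = ProfileFieldSettings()
print(settings.birthday_audience)  # Audience.ONLY_ME
```

A deceptively snug interface would instead preselect `Audience.ALL_USERS`, counting on users never revisiting the setting.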

3. Emotional steering uses vocabulary, grammar and visual elements to convey information in a way that evokes a specific emotion in users in order to manipulate their behavior. Example: A social media platform words its prompts to encourage sharing more personal data than necessary: “Tell us about your amazing personality! We can’t wait, so come right now and let us know!”

4. Hidden in plain sight uses visuals to prompt the user to choose less restrictive privacy settings. This can be done by using small fonts or a font color that blends into the background. Structuring the message in this way may cause the user to overlook important information from a data protection point of view.
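The “font color that blends into the background” variant can be made concrete with a small sketch of how a design review might flag it, using the WCAG relative-luminance contrast formula. The thresholds and function names here are our own illustration, not part of the Guidelines:

```python
def _channel(c: int) -> float:
    # Convert one sRGB channel (0-255) to linear light (WCAG 2.x).
    v = c / 255.0
    return v / 12.92 if v <= 0.03928 else ((v + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple) -> float:
    r, g, b = (_channel(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple, bg: tuple) -> float:
    # WCAG contrast ratio: (L_lighter + 0.05) / (L_darker + 0.05),
    # ranging from 1:1 (identical) to 21:1 (black on white).
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)),
                    reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Light-grey text on a white background barely stands out —
# well below the WCAG AA minimum of 4.5:1 for normal-size text:
print(round(contrast_ratio((200, 200, 200), (255, 255, 255)), 2))
```

A check like this in a design pipeline would catch privacy notices rendered in colors that blend into the background, though it cannot detect every “hidden in plain sight” trick (e.g. tiny fonts or misleading placement).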

5. Obstruction is a tactic by which users are delayed in, or prevented from, becoming familiar with relevant privacy information. Two practices fall under this dark pattern:

  1. Dead end – Users usually define the level of personal data protection when they first create their account on a social media platform. If during this process the user cannot access information about the processing of personal data, it is a dead end.
  2. Longer than necessary – This type of obstruction is not always contrary to the provisions of the GDPR, so the use of this dark pattern should be assessed on a case-by-case basis. What matters is that users are not pushed toward activating options that are more intrusive for data sharing. Example: Withdrawing consent to the processing of data for targeted advertising requires several steps, even though giving consent (opt-in) took only one click.

6. Fickle refers to an unreadable and unclear interface design that makes it difficult for users to understand the purposes of data processing. Lack of hierarchy and decontextualization are examples of this pattern:

  1. Lack of hierarchy – This dark pattern occurs when the same data protection information appears multiple times but is presented differently. This confuses users, who are then unable to understand how their data is processed. Example: Information on data subjects’ rights can be found in several places in the privacy policy. Most of the rights are explained in the “Your rights” section; however, the right to lodge a complaint with the supervisory authority, as well as the authority’s contact details, are provided in various other sections relating to other aspects of privacy protection.
  2. Decontextualization – Important privacy information is located on a page removed from its context, so users have a hard time finding it.

7. Left in the dark is a method of user communication designed to conceal information or data protection controls, or to leave users uncertain about how their data is processed. Examples of these practices include:

  1. Linguistic discontinuity – The service is offered to users in their own language, while privacy information is not provided in the official language(s) of the country where the users reside.
  2. Conflicting information – Because they receive conflicting information, users are unable to determine the consequences of their actions and become confused. In the end, users keep the default privacy settings.
  3. Ambiguous wording or information – The terminology used is deliberately vague and ambiguous so that users do not know how their data is processed or how to control it.

Dark Patterns and the GDPR

It should be emphasized that dark patterns can, but do not have to, violate the provisions of the GDPR. The EDPB Guidelines are recommendations only. Their release should be seen as a positive measure aimed at strengthening the protection of natural persons. They provide important guidance to controllers, allowing them to assess their practices from a GDPR compliance perspective. For data subjects, the Guidelines can serve to raise awareness of the questionable and often controversial practices of data controllers that may be used against them.

Moreover, as the EDPB points out, dark patterns may not only constitute an unlawful interference with the privacy sphere of social media users, but may also violate consumer protection regulations. This is why it is so essential for companies operating in the e-commerce sector to analyze their activities and determine whether any of them could be assessed as deceptively or manipulatively influencing data subjects into making decisions that are more intrusive to their privacy. By remaining vigilant in identifying dark patterns and taking the necessary corrective measures, companies can avoid costly legal and administrative proceedings, not only before data protection authorities but also before consumer protection authorities.

  1. Guidelines 3/2022 on dark patterns in social media platform interfaces: how to recognize and avoid them, version 1.0 adopted on March 14, 2022.