Norway: New Draft EDPB Guidelines on "Dark Patterns" in Social Media

On 14 March 2022, the European Data Protection Board ("EDPB") issued draft guidelines titled "Dark patterns in social media platform interfaces: How to recognize and avoid them." The guidelines provide practical advice and recommendations for developers and users on how to assess and avoid social media interfaces that may be considered manipulative and thereby infringe GDPR requirements.

By Inge Kristian Brodersen and Sondre Arora Aaserud

The design of a social media platform can affect how users act and what choices they make. The provider of a social media platform is normally the data controller and responsible for GDPR compliance. As such, the provider is responsible for ensuring that the rights and freedoms of data subjects are safeguarded.

According to the EDPB's guidelines, "dark patterns are considered as interfaces and user experiences implemented on social media platforms that lead users into making unintended, unwilling and potentially harmful decisions regarding the processing of their personal data". The guidelines divide "dark patterns" into six categories:

  • Overloading: Users are presented with, e.g., too much information or too many options, prompting them to share more data than they initially intended or were aware of.
  • Skipping: Design that distracts users so that they overlook or forget about the protection of their data.
  • Stirring: Design that appeals to users' emotions in order to steer the choices they make.
  • Hindering: Design that prevents or blocks users from becoming adequately informed of the processing activities.
  • Fickle: Design that is unclear or inconsistent, making it difficult for the user to understand, e.g., the purpose of the data processing.
  • Left in the dark: Design that hides or obscures relevant privacy information, leaving users unsure about the protection of their data and the rights that accompany it.

According to the guidelines, developers and providers should avoid designs that fall within one or more of these categories, as they may constitute an infringement of the GDPR and work to the detriment of the purposes of data protection. As a starting point, and in order to avoid "dark patterns", companies should always evaluate the design of their digital platforms against the core privacy principles in GDPR Article 5, including transparency, purpose limitation, and data minimization. Further, to ensure that data subjects give "informed" consent, cf. Article 4(11), companies need to ensure that consent is collected and used in accordance with the conditions for consent in Article 7 and the data subject's right under Article 12 to receive clear and unambiguous information. In addition, Article 25 on data protection by design and by default may assist providers of digital platforms in avoiding "dark patterns".

The guidelines are quite specific and provide several examples, each followed by best practice recommendations, making it easier for companies to identify which use case applies to their situation and what the EDPB regards as best practice in that specific example. In addition, the guidelines include an appendix containing a checklist for evaluating whether a design falls within any of the categories of dark patterns. The appendix also lists the GDPR provisions which, in the view of the EDPB, are most affected by the particular "dark pattern" types, increasing the relevance of the document for legal practitioners.

The EU's drive against "dark patterns" has been on the horizon for a while, and the EDPB's guidelines both reference and appear to build on the Norwegian Consumer Council's report of 27 June 2018 titled "Deceived by design: How tech companies use dark patterns to discourage us from exercising our rights to privacy". The EDPB's guidelines can be seen as part of the wider EU initiative to bolster consumer rights in digital markets and restrain the power of large technology companies, as embodied by the forthcoming Digital Services Act and Digital Markets Act. The main purpose of the proposed Digital Services Act is to create a "safe, predictable and trusted online environment", cf. its Article 1(2)(b), in which the fundamental rights of all users of digital services are protected, including rights derived from the GDPR that may be infringed by "dark patterns". The draft guidelines are open for public comment until 2 May 2022, after which they will be revised and published in final form.

This article is intended to be a general summary of the law and does not constitute legal advice. Consult with counsel to determine applicable legal requirements in a specific situation.