Norway: Content-Sharing Services

By David S. Brambani and Mats Hole

Under the DSM Directive, providers of online content-sharing services are in principle liable for content uploaded by their users if such content infringes third parties' copyright or related rights. After Poland brought an action seeking annulment of Article 17, the Court of Justice of the European Union (CJEU) upheld the provision, finding that Article 17 contains appropriate safeguards for freedom of expression. The judgment gives important guidance on how the new liability regime will apply in practice.

The Directive on Copyright in the Digital Single Market (the DSM Directive) has been highly debated. The debate has centred on the balance between copyright and freedom of expression. In essence, an overly strict regulation of users' dissemination via various platforms may amount to a disproportionate intrusion into the freedom of expression, while an overly liberal regulation interferes with the rights of copyright holders.

On 24 May 2019, the Republic of Poland brought an action (C-401/19) seeking the annulment of Article 17 on the ground that it conflicts with the right to freedom of expression and information in the EU Charter of Fundamental Rights. After nearly three years, the CJEU's judgment finally arrived in late April 2022.

The judgment confirms that Article 17 represents a major shift in the liability framework for hosting platforms. Previously, platform providers could rely on the safe harbour principle in the E-Commerce Directive and thereby avoid liability for user-uploaded content, provided they had no positive knowledge of illegal activity or information stored on the platform. Under the new Article 17, service providers are in principle directly responsible for the communication of the content they host, even in the absence of positive knowledge of infringement or illegal activity on their platform.

Certain exemptions apply. A main criterion for exemption is that the content-sharing service provider must make "best efforts" to ensure the unavailability of, and to prevent future uploads of, specific works and other subject matter for which a rightsholder has provided the service provider with the relevant and necessary information.

The "best efforts" requirement that the service providers must comply with has been the center of attention in discussions about Article 17 over the previous years. This was also the topic at hand when the CJEU delivered the C-401/19 judgment.

At issue in C-401/19 was the preventive filtering and blocking of content. The Republic of Poland argued that the wording of Article 17 effectively requires platform providers, in order to avoid liability, to carry out automatic filtering of content uploaded by users, and that such filtering, or preventive control mechanisms, would undermine the essence of the right to freedom of expression and information. The CJEU did not agree.

A few key takeaways from the judgment:

  • Firstly, the CJEU confirmed that Article 17 indeed establishes a new liability regime. This is an important clarification, as it closes the door to retroactive action by rightsholders.
  • Secondly, and perhaps most importantly, the preventive blocking or filtering provided for in Article 17 may lawfully be implemented into the national laws of the EU member states, as long as the appropriate safeguards for users are implemented correctly. The users' rights relating to the lawful use of content, set out in the article itself, are a key reason why preventive blocking is permissible, and transposing the blocking or filtering obligations into national law is lawful only if those rights are transposed with them.
  • Thirdly, the CJEU emphasized that filtering systems which cannot distinguish between lawful and unlawful content are not permissible, as they would fail to respect the fair balance between the right to freedom of expression and the right to intellectual property. Such filters are already in use on the major platforms of this kind, but to satisfy the criteria of Article 17 the filtering technology must operate "in accordance with high industry standards of professional diligence".

It will be interesting to see what the future holds for the use of upload filters and for the quality of such filters. In any case, the C-401/19 judgment will certainly be a point of reference when assessing national implementations of the DSM Directive and in future cases before national courts applying those national laws to disputes.

This article is intended to be a general summary of the law and does not constitute legal advice. Consult with counsel to determine applicable legal requirements in a specific situation.