Court Decision

Delfi v. Estonia

This case concerned an online news site's liability for threats and anti-Semitic slurs posted by users in the site's comments section. Estonian courts held, and the European Court of Human Rights (ECtHR) Grand Chamber affirmed, that the platform could be liable for those comments, even before it knew about them. The ECtHR reviewed the Estonian ruling solely for compliance with the European Convention on Human Rights; it did not consider specific laws, such as the EU's eCommerce Directive, except as necessary for the human rights assessment. The defendant news site, Delfi, had both proactive and reactive measures in place to bar inappropriate user comments, and it removed the comments at issue in the case when it learned about them. It argued that strict liability for user comments, and the de facto monitoring obligation...
Legislation

Directive (EU) 2018/1808 of the European Parliament and of the Council of 14 November 2018 amending Directive 2010/13/EU on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) in view of changing market realities

Audiovisual Media Services Directive (AVMSD)
The revised EU Audiovisual Media Services Directive (AVMSD) was adopted in November 2018, aiming to better reflect the digital age and to create a more level playing field between traditional television and newer on-demand and video-sharing services. The Directive imposes a series of duties on so-called video-sharing platforms (VSPs) concerning the prevention and moderation of content that constitutes hate speech or child pornography, impairs children's physical and mental development, violates obligations in the area of commercial communications, or can be considered terrorist content. National authorities (mainly independent media regulatory bodies) are given the responsibility of verifying that VSPs have adopted "appropriate measures" to properly deal with the types of content mentioned above (alongside other undesirable...
Self-Regulation/Voluntary Agreement/Code of Conduct

Code of Conduct on Countering Illegal Hate Speech Online

The European Commission agreed with all major online hosting providers—including Facebook, Twitter, YouTube, and Microsoft—on a code of conduct that includes a series of commitments to combat the spread of illegal hate speech online in Europe.
Policy Document

European Commission, Communication, Online Platforms and the Digital Single Market: Opportunities and Challenges for Europe, COM(2016) 288 Final, May 25, 2016

The Communication appears to endorse maintaining the existing intermediary liability regime. However, the Commission stresses that "a number of specific issues relating to illegal and harmful content and activities online have been identified that need to be addressed." In this regard, the Commission plans to launch a "sectorial legislation … and problem-driven approach." This sectorial action will apparently target copyright-protected content, the protection of minors from harmful content, and incitement to hatred. This should happen through a mix of legislative interventions—updating the audiovisual and copyright regulations—and the promotion of voluntary self-regulatory actions. The Communication puts forward the idea that "the responsibility of online platforms is a key and cross-cutting issue."
Policy Document

European Commission, Communication, A Digital Single Market Strategy for Europe, COM(2015) 192 final
The Strategy plans the introduction of enhanced obligations for websites and other Internet intermediaries in dealing with unlawful third-party content, and discusses what regulations should apply to a subset of those intermediaries deemed "internet platforms." The Commission would like to discuss "whether to require intermediaries to exercise greater responsibility and due diligence in the way they manage their networks and systems – a duty of care" (emphasis added). The Commission refers to enhanced responsibilities for dealing with illegal content, such as child pornography, terrorist materials, and content that infringes upon intellectual property rights. (See § 3.3)