Explore

Court Decision

Delfi v. Estonia

This case concerned an online news site’s liability for threats and anti-Semitic slurs posted by users in the site’s comments section. Estonian courts held, and the European Court of Human Rights (ECtHR) Grand Chamber affirmed, that the platform could be liable for those comments – even before it knew about them. The ECtHR reviewed the Estonian ruling solely for compliance with the European Convention on Human Rights; it did not consider specific laws, such as the EU’s eCommerce Directive, except as necessary for the human rights assessment. The defendant news site, Delfi, had both proactive and reactive measures in place to bar inappropriate user comments, and it removed the comments at issue when it learned about them. It argued that strict liability for user comments, and the de facto monitoring obligation...
Legislation

Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 April 2021 on addressing the dissemination of terrorist content online

Regulation on Terrorist Content Online
The Regulation aims to ensure the smooth functioning of the digital single market by addressing the misuse of hosting services for terrorist purposes. The Regulation establishes a definition of ‘terrorist content’ for preventative purposes, in connection with the definitions of relevant offences under Directive (EU) 2017/541 of the European Parliament and of the Council. It also includes an exception regarding material disseminated for educational, journalistic, artistic or research purposes or for awareness-raising purposes against terrorist activity. The Regulation applies to all providers of relevant services offered in the Union, irrespective of the country of their main establishment. A hosting service provider should be considered offering services in the Union if it enables natural or legal persons in one or...
Legislation

Directive (EU) 2018/1808 of the European Parliament and of the Council of 14 November 2018 amending Directive 2010/13/EU on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) in view of changing market realities

Audiovisual Media Services Directive (AVMSD)
The EU Audiovisual Media Services Directive (AVMSD) was adopted in November 2018, aiming to better reflect the digital age and create a more level playing field between traditional television and newer on-demand and video-sharing services. The Directive imposes a series of duties on so-called video-sharing platforms (VSPs) concerning the prevention and moderation of content that constitutes hate speech or child pornography, impairs children’s physical and mental development, violates obligations in the area of commercial communications, or can be considered terrorist content. National authorities (mainly independent media regulatory bodies) are given the responsibility of verifying that VSPs have adopted “appropriate measures” to properly deal with the types of content mentioned above (alongside other undesirable...
Policy Document

Commission Recommendation on measures to effectively tackle illegal content online

The European Commission issued guidelines encouraging Member States and hosting service providers "to take effective, appropriate and proportionate measures to tackle illegal content online". Chapter I of the document provides general recommendations, including the creation of easy-to-access and user-friendly mechanisms for submitting notices regarding the existence of illegal content. These mechanisms should encourage the notice provider to submit "sufficiently precise and adequately substantiated" notices "to enable the hosting provider concerned to take an informed and diligent decision in respect of the content to which the notice relates" (paragraphs 5-8). Hosting providers should inform content providers about the notice, allowing for counter-notice procedures. However, the recommendation...
Policy Document

A Digital Single Market Strategy for Europe, COM(2015) 192 final.

European Commission, Communication
The Strategy proposes enhanced obligations for websites and other Internet intermediaries in dealing with unlawful third-party content and discusses what regulations should apply to a subset of those intermediaries deemed “internet platforms.” The Commission would like to discuss “whether to require intermediaries to exercise greater responsibility and due diligence in the way they manage their networks and systems – a duty of care” (emphasis added). The Commission refers to enhanced responsibilities for dealing with illegal content, such as child pornography, terrorist materials, and content that infringes upon intellectual property rights. (See § 3.3)
Legislation

Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019

Amends the Criminal Code Act 1995
Following the Christchurch terrorist attack on 15 March 2019, the Australian Government passed an amendment to the Criminal Code to require social media platforms to expeditiously remove abhorrent violent material and, where relevant, refer it to the Australian Federal Police. The law came into force on 6 April 2019. The law applies to social media services and hosting services, whether located within or outside Australia. It creates an offence for an internet service provider, or a person providing a social media or hosting service, to fail to refer to the Australian Federal Police material accessible on their service that they have reasonable grounds to believe records or streams abhorrent violent conduct occurring in Australia (s. 474.33). The maximum penalty is $168,000 for an individual or $840...