Commission Recommendation on measures to effectively tackle illegal content online

Document type
Policy Document
Luiz Fernando Marrey Moncau

The European Commission issued guidelines encouraging Member States and hosting service providers "to take effective, appropriate and proportionate measures to tackle illegal content online".

Chapter I of the document provides general recommendations. Those include:

  • The creation of easy-to-access and user-friendly mechanisms allowing the submission of notices regarding the existence of illegal content. These mechanisms should encourage the notice provider to submit "sufficiently precise and adequately substantiated" notices "to enable the hosting provider concerned to take an informed and diligent decision in respect of the content to which the notice relates" (paragraphs 5-8).
  • Hosting providers should inform content providers about the notice, allowing for counter-notice procedures. However, the recommendation establishes exceptions to the duty to inform the content provider, especially in cases of "serious criminal offences involving a threat to the life, or safety of persons" and when "a competent authority so requests for reasons of public policy and public security" (paragraphs 9-13).
  • Hosting providers are encouraged to facilitate "out-of-court settlements to resolve disputes" (paragraphs 14-15), to be transparent about the procedures established for the removal of content (paragraphs 16-17), and to take "proportionate and specific proactive measures in respect of illegal content", which may include automated detection of illegal content (paragraph 18).
  • The recommendation states that there should be "effective and appropriate safeguards to ensure that hosting service providers act in a diligent and proportionate manner in respect of content that they store". Where automated tools are used, safeguards should consist, in particular, of human oversight and verification, although the Commission also opines that human oversight is not necessary in some cases (paragraphs 19-20). Measures should be taken to prevent notices and counter-notices "that are submitted in bad faith and other forms of abusive behaviour" (paragraph 21).
  • Points of contact and fast-track procedures between hosting providers and Member States are encouraged. Member States are also encouraged "to establish legal obligations for hosting service providers to promptly inform law enforcement authorities" (paragraphs 22-24).
  • Cooperation among hosting service providers and with trusted flaggers is encouraged ("'trusted flagger' means an individual or entity which is considered by a hosting service provider to have particular expertise and responsibilities for the purposes of tackling illegal content online") (paragraphs 25-28).

Chapter II provides specific recommendations relating to terrorist content. Those include: 

  • The inclusion in hosting service providers' terms of service of a statement "that they will not store terrorist content" (paragraph 30).
  • Member States should ensure that their authorities "have the capability and sufficient resources to effectively detect and identify terrorist content and to submit referrals to the hosting service providers concerned". Hosting providers must confirm receipt of those referrals and inform the competent authorities of their decision in respect of the content (paragraphs 32-34).
  • "Hosting service providers should assess and, where appropriate, remove or disable access to content identified in referrals, as a general rule, within one hour from the moment at which they received the referral." (paragraph 35)
  • Cooperation between hosting providers to "prevent the dissemination of terrorist content across different hosting services", including the sharing and optimisation of technological tools (paragraphs 38-39). Cooperation between hosting service providers and Member States is also encouraged (paragraph 40).

The recommendation was criticized by civil society groups, including EDRi, which considered that the European Commission was "pushing 'voluntary' censorship to internet giants to avoid legislation that would be subject to democratic scrutiny and judicial challenge."


Topic, claim, or defense
General or Non-Specified
Dangerous Speech/Violent Extremism
Issuing entity
Transnational Organization (Includes Bilateral Agreement)
Type of service provider
Host (Including Social Networks)
Issues addressed
Notice Formalities
Trigger for OSP obligations
Procedural Protections for Users and Publishers
OSP obligation considered
Block or Remove
Monitor or Filter
General effect on immunity
Weakens Immunity
General intermediary liability model
Takedown/Act Upon Knowledge (Includes Notice and Takedown)
Takedown/Act Upon Administrative Request
Notice and Notice