Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 April 2021 on addressing the dissemination of terrorist content online

Regulation on Terrorist Content Online
Document type
Legislation

The Regulation aims to ensure the smooth functioning of the digital single market by addressing the misuse of hosting services for terrorist purposes. 

The Regulation establishes a definition of ‘terrorist content’ for preventative purposes, in connection with the definitions of relevant offences under Directive (EU) 2017/541 of the European Parliament and of the Council. It also includes an exception regarding material disseminated for educational, journalistic, artistic or research purposes or for awareness-raising purposes against terrorist activity. 

The Regulation applies to all providers of relevant services offered in the Union, irrespective of the country of their main establishment. A hosting service provider should be considered to be offering services in the Union if it enables natural or legal persons in one or more Member States to use its services and has a substantial connection to that Member State or those Member States.

The Regulation harmonises the procedures and obligations resulting from removal orders requiring hosting service providers to remove or disable access to terrorist content, following an assessment by the competent authorities. Given the speed at which terrorist content is disseminated across online services, the Regulation obliges hosting service providers to ensure that the terrorist content identified in the removal order is removed, or access to it is disabled, in all Member States within one hour of receipt of the removal order. Except in duly justified cases of emergency, the competent authority should provide the hosting service provider with information on procedures and applicable deadlines at least 12 hours before issuing a removal order to that hosting service provider for the first time. The removal order should contain a statement of reasons qualifying the material to be removed, or access to which is to be disabled, as terrorist content, and should provide sufficient information for locating that content, by indicating the exact URL and, where necessary, any additional information, such as a screenshot of the content in question.

Both the content provider and the hosting service provider should have the right to request that the competent authority determine whether the order seriously or manifestly infringes this Regulation or the fundamental rights enshrined in the Charter. Where such a request is made, that competent authority should adopt a decision on whether the removal order constitutes such an infringement. Where that decision finds such an infringement, the removal order should cease to have legal effects.

Hosting service providers that are exposed to terrorist content should, where they have terms and conditions, include therein provisions to address the misuse of their services for the dissemination to the public of terrorist content. They should apply those provisions in a diligent, transparent, proportionate and non-discriminatory manner.

Hosting service providers exposed to terrorist content should put in place specific measures, taking into account the risks and level of exposure to terrorist content as well as the effects on the rights of third parties and the public interest in information. Hosting service providers should determine what appropriate, effective and proportionate specific measures should be put in place to identify and remove terrorist content. Specific measures could include appropriate technical or operational measures or capacities, such as staffing or technical means to identify and expeditiously remove or disable access to terrorist content, mechanisms for users to report or flag alleged terrorist content, or any other measures the hosting service provider considers appropriate and effective to address the availability of terrorist content on its services.

Where the competent authority considers that the specific measures put in place are insufficient to address the risks, it should be able to require the adoption of additional appropriate, effective and proportionate specific measures. The requirement to implement such additional specific measures should not lead to a general obligation to monitor or to engage in active fact-finding within the meaning of Article 15(1) of Directive 2000/31/EC or to an obligation to use automated tools.

Hosting service providers that have taken action or were required to take action pursuant to this Regulation in a given calendar year should make publicly available annual transparency reports containing information about action taken in relation to the identification and removal of terrorist content. The competent authorities should publish annual transparency reports containing information on the number of removal orders, the number of cases where an order was not executed, the number of decisions concerning specific measures, the number of cases subject to administrative or judicial review proceedings and the number of decisions imposing penalties.

For the purposes of this Regulation, Member States should designate competent authorities.

Topic, claim, or defense
Dangerous Speech/Violent Extremism
Public Order (Includes National Security)
Freedom of Expression
Issuing entity
Legislative Branch
Type of service provider
Host (Including Social Networks)
Issues addressed
Notice Formalities
Trigger for OSP obligations
Procedural Protections for Users and Publishers
OSP obligation considered
Block or Remove
Monitor or Filter
Account Termination
Data Retention or Disclosure
General effect on immunity
Mixed/Neutral/Unclear
General intermediary liability model
Takedown/Act Upon Knowledge (Includes Notice and Takedown)