UN Freedom of Expression Report, Document No. A/HRC/17/27 (May 2011)

Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression (Frank La Rue)

This report summarizes the findings of the Special Rapporteur, gathered through a series of communications, meetings, seminars, and country visits. The report highlights that Article 19 of both the Universal Declaration of Human Rights (‘UDHR’) and the International Covenant on Civil and Political Rights (‘ICCPR’) was crafted broadly enough to encompass freedom of opinion and expression on the Internet and through other technological means. Categories of information that may be restricted include: child pornography, hate speech, defamation, direct and public incitement to commit genocide, and advocacy of national, racial, or religious hatred that constitutes incitement to discrimination, hostility, or violence.

Chapter III of the report summarizes the first principles of freedom of expression in general and on the Internet. It underlines the applicability of international human rights norms and standards on the right to freedom of opinion and expression to the Internet as a communication medium, and sets out the exceptional circumstances under which the dissemination of certain types of information may be restricted. Chapters IV and V address two dimensions of Internet access respectively: (a) access to content; and (b) access to the physical and technical infrastructure required to access the Internet in the first place. More specifically, chapter IV outlines some of the ways in which States are increasingly censoring information online, namely through: arbitrary blocking or filtering of content; criminalization of legitimate expression; imposition of intermediary liability; disconnecting users from Internet access, including on the basis of intellectual property rights law; cyber-attacks; and inadequate protection of the right to privacy and data protection. Chapter VI contains the Special Rapporteur’s conclusions and recommendations concerning the main subjects of the report. Chapters I, II, and V are not relevant for the purposes of this Report.

 

Arbitrary blocking or filtering of content (Paragraphs 9-10)

This section highlights the use of “just-in-time” blocking: censorship that prevents users from accessing or disseminating information during important political and social moments. The report notes that “States’ use of blocking or filtering technologies is frequently in violation of their obligation to guarantee the right to freedom of expression,” including because “blocking is not justified to pursue aims which are listed under article 19, paragraph 3, of the ICCPR, and blocking lists are generally kept secret, which makes it difficult to assess whether access to content is being restricted for a legitimate purpose,” and because blocks are “often not sufficiently targeted and render a wide range of content inaccessible beyond that which has been deemed illegal” (Paragraph 31).

 

Imposition of Intermediary Liability

The European Union’s E-Commerce Directive enables intermediaries to avoid liability for content if they do not have knowledge of illegal activity and remove the content once they become aware of it. In the United States, the Digital Millennium Copyright Act contains similar provisions (Paragraph 41).

The report expresses concern about notice-and-takedown regimes for two reasons. First, users whose content has been flagged for removal have little or no recourse. Second, intermediaries may err on the side of removal to avoid penalties, thereby censoring legitimate, lawful content (Paragraph 42). The “Protect, Respect and Remedy” framework rests on three pillars: (1) the State’s duty to protect against human rights abuses, (2) the corporate responsibility to respect human rights, and (3) the need for victims to have access to effective remedy (Paragraph 47). The Special Rapporteur also emphasizes that restrictions on Internet content must comply with the three-part test under article 19, paragraph 3, of the ICCPR (Paragraph 69).

The Special Rapporteur highlights that any blocking or filtering of content should be accompanied by an explanation to users and to the operators of the blocked websites. The Special Rapporteur also calls on States to decriminalize defamation. Intermediaries should only remove content pursuant to legal orders issued by a court or “a competent body” that is independent of commercial and political influence (Paragraph 70).

This summary is part of a report produced in the Stanford Law School Intermediary Liability and Human Rights Policy Practicum and is based on the work of Shane Seppinni. The full report, “The ‘Right to Be Forgotten’ and Blocking Orders under the American Convention: Emerging Issues in Intermediary Liability and Human Rights”, can be accessed here.

 

Year
2011
Topic, claim, or defense
General or Non-Specified
Copyright
E-Commerce
Freedom of Expression
Document type
Policy Document
Issuing entity
Transnational Organization (Includes Bilateral Agreement)
Issues addressed
Trigger for OSP obligations
Procedural Protections for Users and Publishers
Transparency
OSP obligation considered
Block or Remove
Monitor or Filter