6.1 Content platforms should publish transparency reports that provide specific information about all content restrictions adopted by the intermediary, including actions taken in response to government requests, court orders, and private requests, as well as on the implementation of their content restriction policies1.
6.2 Content platforms should issue periodic transparency reports on the application of their community rules that include at least:
a. Comprehensive data describing the categories of user content that are restricted (text, photo, or video; violence, nudity, copyright violations, etc.), as well as the number of pieces of content that were restricted or deleted in each category, broken down by country2.
b. Data on how many content moderation actions were initiated by a user report (flag), by a trusted-flagger program, or by proactive enforcement of community standards (for example, through a machine-learning algorithm)3.
c. Data on the number of decisions that were appealed or determined to have been made in error4.
d. Data reflecting whether the company conducts proactive audits of its unappealed moderation decisions, as well as the error rates those audits found5.
e. Aggregate data illustrating trends in the enforcement of community rules, together with examples of real cases or detailed hypothetical cases that clarify the nuances of interpreting and applying specific rules6.
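The breakdowns recommended in 6.2(a)–(c) can be sketched as simple aggregations over a platform's internal moderation log. The sketch below is illustrative only: the record fields (`category`, `country`, `source`, `appealed`, `reversed`) are hypothetical and not drawn from any platform's actual schema.

```python
from collections import Counter

# Hypothetical moderation-action records; field names are illustrative,
# not any platform's actual schema.
actions = [
    {"category": "nudity",    "country": "BR", "source": "user_flag",
     "appealed": True,  "reversed": True},
    {"category": "nudity",    "country": "BR", "source": "algorithm",
     "appealed": False, "reversed": False},
    {"category": "copyright", "country": "US", "source": "trusted_flagger",
     "appealed": True,  "reversed": False},
]

# 6.2(a): restrictions per content category, broken down by country.
by_category_country = Counter((a["category"], a["country"]) for a in actions)

# 6.2(b): how each action was initiated (user flag, trusted flagger, proactive).
by_source = Counter(a["source"] for a in actions)

# 6.2(c): decisions appealed, and of those, how many were reversed.
appealed = sum(a["appealed"] for a in actions)
reversed_on_appeal = sum(a["appealed"] and a["reversed"] for a in actions)

print(by_category_country[("nudity", "BR")])  # 2
print(by_source["user_flag"])                 # 1
print(appealed, reversed_on_appeal)           # 2 1
```

Publishing these counters per reporting period, with the category/country pairs disaggregated, would satisfy the minimum breakdowns described above.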
1 Manila Principles
2 Santa Clara Principles
3 Santa Clara Principles
4 Santa Clara Principles
5 Santa Clara Principles
6 Report on the Regulation of Content on the Internet, Special Rapporteurship on the Promotion and Protection of the Right to Freedom of Opinion and Expression, 2018