2. Service terms and conditions
2.1 The terms of service (TOS) of all content platforms, as well as complementary documents (such as guides or guidelines on the application of content rules), should be written in a clear, precise, intelligible and accessible way for all users. Large platforms should also present them in the user’s national language.
2.2 All content platforms should establish and implement TOS that are transparent, clear, accessible and in accordance with international human rights norms and principles, including the conditions under which interference with users’ right to freedom of expression or privacy may take place.[1] In particular, users should be informed of the conditions that may lead to termination of the contract (cancellation of the account, for example), as well as to the removal, de-indexing or significant reduction of the reach of their expressions and content as a result of unilateral modifications to curation algorithms.
2.3 No content platform should be able to unilaterally change its terms of service and conditions of use, or apply new terms, without clearly informing users of the justification and without giving them, with reasonable notice, the possibility of cancelling the contract free of economic or legal consequences deriving from that contract.[2]
2.4 Platforms should not establish abusive or asymmetric conditions on the use and ownership of the content generated and published by their users, and should respect users’ copyright, just as users should comply with the rules governing content generated by third parties. In this regard, restrictions arising from copyright protection should take into account the limitations and exceptions recognized in international treaties and national laws.
2.5 The terms of service should not grant the platforms unlimited and discretionary power to determine the suitability of user-generated content.[3] In particular, TOS provisions that could imply limitations on the exercise of users’ right to freedom of expression and access to information should not be formulated in a vague or broad way that allows arbitrary interpretation and application by the platform.
2.6 Regarding the curation/prioritization of the display of content generated by their users (in news feeds, search results, news aggregation services and other similar services), large platforms should:
a. Make transparent the criteria used by their algorithms to rank or target content, where possible explaining the effects to the user.
b. Not use arbitrary or discriminatory criteria that could illegitimately affect the freedom of expression and right to information of their users.
c. Provide customized filtering mechanisms that are clear, transparent, explicit, revocable/editable and under user control, so that users decide which content they want to prioritize and how (e.g. in chronological order).
d. Respect users’ right to know and define which of their personal data are collected and stored, and how they are used to target content, respecting the principle of informational self-determination.
2.7 If large platforms decide, on their own initiative, to incorporate into their TOS restrictions or even prohibitions on the publication of content or expressions generated by their users, they should do so only subject to the following limitations, so that the restrictions remain compatible with international human rights standards:
a. They may prohibit, even by automatic filtering, content that is clearly and manifestly illegal and that, at the same time, is recognized as a legitimate limitation on freedom of expression in international human rights declarations or treaties, such as: sexual abuse or exploitation of minors, or propaganda for war and any advocacy of national, racial or religious hatred that constitutes incitement to violence or any other similar illegal action against any person or group of persons on any grounds, including race, color, religion, language or national origin.[4]
b. They may restrict, as a non-definitive precautionary measure, content that, even if not recognized as illegal, causes serious, imminent and irreparable (or difficult to repair) harm to other persons, such as: unauthorized dissemination of sexual content, or acts of violence or explicit, excessive or aberrant cruelty. In these cases, the list and definitions of restricted content should be set out in the TOS in a restrictive, clear and precise manner; the analysis of the measure to be taken should consider the context of the published expression; and the restriction should not be applied to legitimate expressions (educational, informative or other content).
c. Content such as cyberbullying or explicit and abusive drug use may be restricted for specific audiences, such as children and adolescents.
d. For any other measure of prioritization of, or restriction on, expressions and other content generated by their users that the platform may consider, for commercial or other reasons, “offensive”, “inappropriate”, “indecent” or under similar vague or broad definitions that could illegitimately affect freedom of expression, large platforms should instead provide mechanisms and notices that allow other users, voluntarily and based on their moral, religious, cultural, political or other preferences, to decide which content they want access to and which they do not. Such content should not be prohibited, deleted or reduced in reach by default, as this would disproportionately affect the right to freedom of expression of the platform’s users.
——————————————–
[1] Freedom of Expression and the Internet, Office of the Special Rapporteur for Freedom of Expression, IACHR, 2013, para. 112.
[2] Agreement between the EU and Facebook, Google and Twitter (2018), “Better social media for European consumers”.
[3] Last sentence taken from the 2018 EU agreement with Facebook, Google and Twitter, “Better social media for European consumers”.
[4] American Convention on Human Rights, art. 13, para. 5.