4. Application of policies and due process
4.1 In the design and application of their community content management policies, platforms should ensure that any restriction arising from the application of their terms of service does not unlawfully or disproportionately restrict the right to freedom of expression.1 To that end, they must respect the requirements of pursuing an imperative purpose, as well as the necessity, suitability and proportionality of the measure to achieve the intended purpose.2
4.2 In order not to affect human rights, the criteria for making decisions should take into account the context, the wide variation of idiomatic nuances, and the meaning and the linguistic and cultural particularities of the content subject to a possible restriction.3
4.3 In addition, in the analysis of the content restriction measures applicable in each case, the principles of proportionality and progressivity should be respected, weighing the severity of the damage, the recurrence of the violations, and the impact that such restrictions could have on the Internet's capacity to guarantee and promote freedom of expression against the benefits that the restriction would bring to the protection of other rights.4
4.4 Users should always have the right to due process with respect to the platforms' own content restriction decisions, particularly when it comes to measures that could affect their right to freedom of expression. As a general principle, and except in duly justified exceptional cases, people affected by a restriction or interference measure adopted by the platforms and, where appropriate, the general public, must be notified in advance of the restriction measures that affect them.5
4.5 In view of the aforementioned principles of necessity and proportionality, in the event of possible breaches of the TOS, platforms should adopt measures less burdensome than removal or others with similar effects, opting for warning or notification mechanisms, flagging, linking to opposing information, or other alternatives.
4.6 The most drastic unilateral measures adopted without notice or prior due process, such as the takedown of accounts or profiles, removal of content, or other measures with a similar effect of exclusion from the possibility of participating in the platform, should be taken by large platforms only under the following conditions:
a. When dealing with non-arbitrary and non-discriminatory technical management interventions (such as spam, fake accounts, malicious bots or the like);
b. When dealing with duplicates or "raw" repetitions (not commented on or edited for journalistic, informative or other legitimate purposes) of other content and expressions of obvious illegality that have already been restricted after human evaluation following the aforementioned standards;
c. When the following situations are identified:
i. the grounds set forth in section 2.7, literal A;
ii. compliance with orders from competent authorities for immediate removal, and the commission of common crimes already defined in national legislation;
iii. serious, imminent and irreparable (or difficult to repair) damage to the rights of other persons, as in the cases listed in point 2.7, literals B and C.
In all these cases, except where orders from competent authorities are involved, the platform should provide immediate subsequent notification, with the possibility of appeal for review of the measure under the terms of chapter 5 of this document.
4.7 Filtering at the time of upload is legitimate and compatible with international human rights standards only when it involves child pornography6 or the first two situations described in the previous point. Otherwise, it should be considered an act of censorship, under the terms established by the American Convention on Human Rights.
4.8 For any other content or expression restriction measure that the platform intends to adopt in the event of a possible breach of the TOS or a third-party complaint (for example, regarding an alleged copyright infringement), the questioned content should remain published on the platform until a final decision arises from due process in which, after notification, a) the voluntary withdrawal of the questioned content is encouraged, and b) the user's right to defense is guaranteed, allowing a counter-notification with the filing of a response, before a decision is taken.
4.9 No content platform should bear legal responsibility for content generated by third parties, as long as it does not intervene by modifying or editing that content, and does not refuse to execute judicial orders, or orders from competent and independent official authorities, that comply with appropriate due process guarantees.
4.10 Content platforms can be held responsible, depending on their actions or negligence in the prioritization or active promotion of expressions that could affect the rights of third parties, only if they depart from the principles established in point 2.6. In these cases, any legal responsibility that may fall on the platforms for third-party expressions or curation activities should not be of the strict-liability type.
1 Freedom of expression and Internet, Office of the Special Rapporteur for Freedom of Expression IACHR, 2013, para. 112
2 Idem, para. 55.
3 Taken from the 2018 Report of David Kaye (the original refers to recommendations on transparency addressed to platforms).
4 Freedom of expression and Internet, Office of the Special Rapporteur for Freedom of Expression IACHR, 2013, para. 54.
5 Idem, para. 115.
6 American Convention on Human Rights, art. 13, para. 4.