25 standards for a more transparent Internet presented in Colombia

In an effort to achieve democratic governance of the digital public space, several Colombian social organizations are promoting a standards initiative to demand greater transparency and accountability from large digital platforms. This initiative seeks to guarantee the fundamental right to freedom of expression, counteract disinformation and promote a more inclusive and open Internet.

The lack of transparency and accountability in the content moderation processes of large digital platforms continues to be a global concern, with direct impacts on fundamental rights such as freedom of expression. In Colombia, this challenge prompted an alliance of organizations to work on developing standards to strengthen the governance of large digital platforms and the inclusive nature of the digital public space, and to enable a freer and more open Internet.

This effort takes place against the backdrop of a growing concentration of power in a few transnational corporations that operate as gatekeepers of access to online information, disproportionately affecting vulnerable groups such as women, indigenous peoples, Afro-descendants, and the LGBTI community. This concentration also restricts the work of journalists, independent media, and human rights defenders, who depend on an open digital environment.

The proposal, whose initial draft was prepared by OBSERVACOM and El Veinte, involved building progressive agreements on transparency standards through a series of consultations, virtual meetings, and in-person workshops with organizations and entities that participated in the Social Media 4 Peace (SM4P) project in Colombia.

The development of this series of standards, based on principles set out by UNESCO and the Inter-American Commission on Human Rights' (IACHR) Special Rapporteur for Freedom of Expression, aims to set a benchmark for moving toward democratic regulation that demands greater transparency from large platforms. With an approach tailored to the Colombian context, the proposed standards seek to empower users to fully exercise their fundamental rights online (including freedom of expression), counteract the impact of disinformation and hate speech, and promote a freer, more equitable, and more accessible Internet for all people.


The main standards of the proposal:

Transparency in the rules of the game: Content platforms must ensure that their terms of service and community standards are written in a clear, precise, and accessible manner in the national language, so that users can understand them. It is also essential that the types of prohibited content and the activities that could lead to sanctions, such as content removal, reach restrictions, or account suspension, are detailed in a visible and centralized manner. Finally, under this standard, platforms are required to provide information about the human moderators in charge of addressing local issues, specifying their location, experience in the national context, and training in human rights, thus ensuring culturally and linguistically appropriate moderation processes.

Algorithmic transparency: Digital platforms must guarantee transparency in the use of algorithms for the moderation and curation of content, explaining the criteria that affect content visibility and distinguishing those that users can control. In addition, they must clearly label promoted, sponsored, or political content, identifying those responsible and providing information on the associated metadata. Regarding personal data, they must report how it is collected, used, and processed for algorithmic decisions, specifying the handling of sensitive data. Finally, they must facilitate access to non-personal and pseudonymous data for universities and researchers, enabling detailed analysis of the impact of their automated systems on processes such as ranking, targeting, and recommendation, along with the degree of human supervision involved.

Empowerment of users: Platforms must promptly notify users when their content is removed, their accounts are suspended, or other measures are adopted that affect their participation, clearly explaining the reasons. They must also provide accessible information in the user's language about the available appeal processes, including details of the procedure, estimated timelines, and how such appeals are evaluated. These guarantees must allow users to exercise their right to defense, ensuring a fairer and more transparent interaction with the platforms' decisions.

Transparency of electoral advertising: During electoral periods, platforms must ensure the transparency of political ads by publishing key information: the amount spent, the author, the funders, and the targeted reach of such ads. This information must be kept in an online database accessible to any user and clearly visible to recipients of the message. In addition, platforms must report on the use of personal data for ad targeting and appropriately label related AI-generated content, ensuring a positive impact on civic space and reducing the risk of disinformation or hate speech.

Accountability for their actions: Platforms must be held accountable by publishing periodic reports with detailed, disaggregated information on content restrictions applied in the country. These reports must include data on removals, reach reductions, suspensions, and account blocks, specifying whether these actions were taken in response to state requests, court decisions, private-party requests, or the platform's own policies. They should also include information on requests made by state actors, such as government agencies and regulators, outlining the reasons given, the decisions taken, and the results obtained. It is also essential to detail the impact of automated and human moderation, as well as the complaints and appeals filed by users and their outcomes.

State transparency: States have the responsibility to be transparent about their involvement in content moderation decisions, reporting on requests or demands made to platforms, including the legal basis for each case. In addition, they must ensure that companies are not prevented from publishing detailed information about these requests, unless there is a clear legal basis and the restriction is strictly necessary and proportionate to achieve a legitimate objective. These measures seek to ensure that States do not obstruct the transparency of platforms, but rather promote adequate accountability for their role in the governance of the digital environment.

The full document can be consulted HERE.


RELATED LINKS:

Colombian organizations form alliance to protect the digital public space
