
Brazilian court redefines exceptions to the principle of non-liability of digital platforms

Brazil’s Supreme Federal Court declared Article 19 of the Civil Rights Framework for the Internet, usually referred to in Portuguese as the Marco Civil, partially unconstitutional and expanded the exceptions to the principle of non-liability of platforms for third-party content. Although platforms already actively moderate under their own rules (including removing millions of posts), the decision introduces state-defined criteria, in addition to those already established by Article 21 of the Marco Civil, that hold companies liable if they fail to act on illegal content, whether after an extrajudicial notification or, in certain cases, on their own initiative. This represents a shift from current regulatory paradigms in the region.
Photo: Fellipe Sampaio / Supreme Federal Court (STF)

In a historic ruling, the Supreme Federal Court (STF) decided by 8 votes to 3 to expand the exceptions to the principle of non-liability established in Article 19 of the Marco Civil da Internet (MCI), establishing that digital platforms can be held civilly liable for illegal content published by their users.

Until now, the Marco Civil allowed companies to be held liable for third-party content in only two circumstances: when they failed to comply with a specific court order to remove a given publication from their networks (art. 19), or when (without the need for a prior court order) they failed to “diligently promote, within the scope and technical limits of their service, the unavailability” (i.e., the removal or blocking) of non-consensual publications of intimate material, defined as “the violation of privacy arising from the disclosure, without authorization of its participants, of images, videos or other materials containing scenes of nudity or sexual acts of a private nature” (art. 21).

Following the Supreme Court’s ruling, this criterion has been expanded: until new legislation is enacted, platforms will also be subject to civil liability if they fail to act on complaints (extrajudicial notifications) of crimes or unlawful acts in general. The exception is crimes against honor and reputation, such as slander, defamation, or libel, for which a court order will continue to be required to compel platforms to act without being held liable.

In the case of serious illegal content, the Supreme Court established a reinforced duty of care for the companies themselves, which must act even without an external complaint. Platforms will be liable if they do not immediately remove such content, without waiting for prior notification from a judicial or state authority or from the victim. These “serious situations” refer, specifically and as anchored in legal norms currently in force in Brazil, to the following types of crimes: antidemocratic acts provided for in the Penal Code; terrorism and preparatory acts of terrorism; inducement, instigation, or aiding of suicide or self-mutilation; incitement to discrimination based on race, color, ethnicity, religion, national origin, sexual orientation, or gender identity; gender-based violence against women, including content that spreads hatred or aversion; sexual crimes against vulnerable people, child pornography, and other serious crimes against children and adolescents; and human trafficking. In these cases, platforms are presumed to have knowledge of the content due to its severity and the notoriety of the harm.

The court’s President, Luís Roberto Barroso, proposed the intermediate model that was adopted by the majority: “There is no constitutional basis for a regime that incentivizes platforms to remain inactive after becoming aware of clear violations of criminal law,” he argued.

Another highlight of the ruling is the presumption of liability for content that is sponsored or widely promoted by bots, situations in which platforms are deemed to have prior knowledge. They may also be held liable automatically if a systemic failure in their moderation mechanisms is proven, that is, when they fail to adopt effective technical standards to moderate harmful content.

Private messaging services such as WhatsApp and email will continue to require a court order to establish liability. Meanwhile, marketplaces will continue to be governed by the Consumer Protection Code.

Furthermore, the ruling imposes new structural obligations: platforms must have headquarters or legal representation in Brazil, offer accessible reporting channels, and publish periodic transparency reports. It also establishes the obligation to implement self-regulatory mechanisms that include notification systems, due process, and annual reports on promoted ads and content.

Regarding this legal redefinition, João Brant, Brazil’s Secretary of Digital Policies, considered it a balanced model that marks a new regulatory paradigm. In his view, it is a modern regime, “partly aligned with the European Union and the United Kingdom, partly more cautious, and partly more protective of rights,” which makes it “appropriate for the Brazilian reality.”

The new criteria will apply only to future cases, preserving proceedings initiated before June 26, 2025. While the court’s decision does not mark the beginning of content moderation, it does introduce a legal framework that more precisely defines platforms’ responsibilities regarding harmful content, setting a relevant precedent for the regulatory debate in the region.

Although the platforms already actively exercised moderation functions using their own criteria, the Supreme Court’s decision introduces legal definitions that set out in which cases and under what conditions the companies must act, given that their decisions could lead to legal liability. In this regard, some civil society actors warn that it could pose risks for freedom of expression.


RELATED LINKS:

The new responsibilities of platforms after the Brazilian court’s ruling

Understand the STF’s decision on the liability of social networks

STF: Social networks are liable for posts even without a court order; see the thesis
