By María Capurro Robles, OBSERVACOM project coordinator.
A new law passed by Congress imposes moderation, reporting, transparency, and accountability obligations on content platforms to ensure children's safety, privacy, and well-being in the digital environment. It also prohibits data profiling and targeted advertising aimed at children, striking at the heart of a business model that grows voraciously at the expense of children's rights.

The Brazilian Congress has adopted a key instrument for the protection of children's and adolescents' rights in the digital environment by approving Bill PL 2628, known as the Digital ECA (after the Estatuto da Criança e do Adolescente, Brazil's Child and Adolescent Statute), which had been under debate since 2022.
In a context of fierce resistance from digital platforms and services to democratic, rights-based regulation of their activities, the passage of this law is the most significant advance in the region regarding the protection of children on the Internet.
The bill's approval is also the result of a process that combined intense social and political debate, dramatic circumstances, and decisions by various branches of government aimed at protecting children and, in doing so, at regulating some of the platforms' activities.
In 2023, a series of violent incidents in schools precipitated public concern over a trend that had been growing exponentially over the previous decade. These included premeditated and lethal attacks, bullying and intimidation, and interpersonal violence and self-harm.
These events, though originating from a multitude of causes, mobilized the Ministry of Justice and Public Security to adopt Ordinance 351/2023 with guidelines aimed at platforms to prevent the circulation of “illegal, harmful, and prejudicial content on social media platforms related to violent extremism that encourages attacks against schools and incites these crimes or their perpetrators.”
The following year, the National Council for the Rights of Children and Adolescents (CONANDA) published Resolution 245, based on existing regulations, which recognizes the duty of care of platforms and requires them to identify and mitigate risks to children, particularly those associated with online gambling.
The National Data Protection Authority (ANPD) took action against TikTok under the General Data Protection Law (LGPD), and a São Paulo court also ruled against TikTok under current labor regulations. The 2025 school year began with the implementation of a federal law limiting cell phone use in schools, an initiative previously adopted by several states.
These and other measures from various branches of government, coupled with the broader debate on democratic platform regulation that Brazil has been engaged in over recent years, underpinned a widespread social consensus on the need for, and urgency of, regulation. Bill 2628 advanced through Congress until it was approved by both chambers, defining the guidelines for a comprehensive framework for the protection of rights in the digital environment.
The approved law is fully aligned with the Convention on the Rights of the Child and, in particular, with General Comment No. 25 (2021) of the UN Committee on the Rights of the Child on children's rights in relation to the digital environment, which is the universal standard on this matter. The Comment is clear about States' obligation to establish and implement regulations and public policies to ensure that companies comply with a series of protection obligations. Any self-initiated, self-regulatory, or similar measures that companies may adopt are complementary and in no way displace or replace state regulation. The Comment, like earlier ones, is crystal clear on this point: States and companies must respect, protect, and fulfill the rights of children in all areas of their lives.
Main issues regulated by the law
First, the law establishes a general obligation for all such technological products and services to strictly guarantee children's safety, privacy, and well-being, guided by the best interests of the child (Articles 3 and 5.3). This applies both to the design and to the operation of their products and services, in order to prevent and mitigate the risks of access, exposure, recommendation, or facilitation of contact with the content, products, and practices detailed in Article 6, which include sexual exploitation and abuse, intimidation, pornography, harassment, promotion and marketing of gambling, drugs, and prohibited products, predatory and deceptive advertising practices, incitement to self-harm, and attacks against children.
The law then sets out a detailed series of actions that service providers must take to ensure safety, privacy, and well-being at the various stages of development of their products and services (Articles 7 and 8). In particular, it details the tools that must be deployed for access control and age verification (Articles 9 to 15) to protect children and adolescents from the risk scenarios described above.
In addition, digital services will have to offer “appropriate technical measures” that allow responsible adults to limit usage time; monitor interactions; and restrict both the contacts children can have and the actions they can take (purchases, converting virtual items into real money, etc.), as well as actions that services or unknown individuals can take affecting their online experience. These measures must also include alerts about potential risks.
Concern about consumerism and the impact of advertising on children is long-standing. Television regulations, even in the region's backward and regressive audiovisual legal frameworks, contain provisions that restrict advertising and its content. In a digital environment where advertising based on data mining and the capture of attention is the core business, however, this concern becomes even more pressing.
The Committee on the Rights of the Child's General Comment directs States to “prohibit by law the profiling or targeting of children of any age for commercial purposes through a digital record of their actual or inferred characteristics” as well as advertising practices that affect children's digital experience and their rights. The standard is very clear, and Brazilian law adopts it: the law prohibits data profiling for commercial purposes and advertising directed at minors under 18 that uses techniques such as emotion recognition or virtual and augmented reality, among others listed in Article 22.
In turn, Article 23 establishes the prohibition that gave rise to one of the law's popular names, the Anti-Adultization Law: it “prohibits Internet application providers from monetizing and promoting content that depicts children and adolescents in an eroticized or sexually suggestive manner, or in a specific context of the adult sexual universe.”
Perhaps the most contentious issue in the debate over regulation in this area is the obligation to remove content in order to protect children, and its compatibility with freedom of expression.
In this regard, Article 29 provides that service providers must “…remove content that violates [those] rights as soon as the victim, their representatives, the Public Prosecutor’s Office, or entities representing the defense of children’s rights inform them of the offensive nature of the publication, regardless of whether a court order exists.” To identify this content, it refers to the catalog in Article 6, already discussed.
The law also imposes on digital providers a series of transparency and periodic reporting obligations, under an asymmetric approach, that is, aimed at players with more than one million users in the market. These include reports on complaints, on the criteria and measures used for content moderation, and on their impacts and results, among other matters detailed in Article 31. Providers must likewise guarantee transparency and access to data for academic, scientific, or journalistic research into the impact of their services on children.
Finally, it should be noted that, in the event of noncompliance with the obligations established by the law, the State may impose sanctions ranging from fines to the temporary suspension of activities or, in extremely serious cases, a ban on operating in the country.
One point still awaiting more precise definition is the institutional framework. The law calls for the creation of an autonomous administrative authority for the protection of children's and adolescents' rights in the digital environment, but this body has yet to be established.
Final comments
In a regional context where the absence of protection for children in information and communication environments has been the norm for years, Brazil's initiative can and should spark debate, challenge inaction, and inspire political action to make the digital environment a space for development and for safe and meaningful online experiences for children.
Guaranteeing internet connectivity is not enough: States have an obligation to protect rights through regulations and public policies that redress a balance shamelessly tilted in favor of the economic and geopolitical interests of global private actors. Media and information literacy is another key component of an urgent strategy to prevent the digital environment from deepening exclusion in the most unequal region on the planet.
RELATED LINKS:
Ministry of Justice and Public Security of Brazil, Ordinance 351/2023

