Uruguay, Chile, and Mexico require ride-hailing apps to make their algorithms transparent

Three Latin American countries now require ride-hailing apps to explain how their algorithms work. These regulations advance transparency, establish clear obligations, and ensure state oversight, offering a relevant precedent for the debate on regulating other digital intermediaries whose automated decision-making affects people’s rights.

The latest labor-law reforms in Uruguay, Chile and Mexico show that algorithmic transparency is not a utopian goal, nor something that only the European Union can discuss and implement. These three Latin American countries now have laws requiring ride-hailing and delivery apps to explain how their systems assign trips, monitor performance, and make automated decisions.

In Chile, Law 21.431 of 2022 amended the Labor Code to regulate “the employment contracts of workers in digital service platform companies.” The law recognizes the labor relationship with people who provide transport or delivery services, and requires the apps to include in the employment contract detailed information about how the worker’s personal data will be processed, the criteria used to match workers to users, and the method for calculating remuneration.

The law includes a specific chapter on transparency and the right to information, establishing that workers can request access to data related to their performance and evaluation, find out how these metrics affect the assignment of tasks, and demand that these criteria be objective and non-discriminatory. The Labor Directorate (Dirección del Trabajo, the law’s enforcement authority) can request information about the algorithms in order to verify compliance, and can sanction companies that fail to meet these obligations.

Mexico moved in the same direction with the 2024 reform to the Federal Labor Law, which created Chapter IX Bis on “Work on Digital Platforms.” This chapter defines what constitutes a digital platform and who qualifies as a “platform worker,” and requires companies to sign contracts (which can be digital) detailing working conditions.

Article 291-J explicitly introduces algorithmic transparency: platforms must disclose the rules used to assign tasks “through algorithms or similar mechanisms” and produce a “policy document on algorithmic management of work.” This document, which forms part of the contract and must be authorized and registered by the Federal Center for Conciliation and Labor Registration, must explain the consequences of fulfilling or failing to fulfill orders, the effects of user ratings, incentives and penalties, and other criteria that influence the intensity, frequency, and pace of work. The law also expressly prohibits the use of algorithms in a discriminatory way and establishes significant fines for companies that fail to register their contract models or to meet transparency obligations.

Uruguay took an even more detailed step with Law 20.396 of 2025, which establishes “minimum levels of protection for workers who provide services through digital platforms.” Article 4 focuses on the “transparency of algorithms and monitoring systems” and requires companies to inform each worker about the existence of automated tracking, control, and evaluation systems, as well as automated decision-making systems that affect their working conditions: access to tasks, income, working hours, health, and safety.

The law requires that information about the logic, criteria, and parameters used by algorithms be understandable and respect the principles of equality and non-discrimination. Articles 6 and 7 reinforce this obligation with two key rights: access to information in durable form (including electronic records) and the right to an explanation of significant decisions, along with the possibility of contacting a human being to discuss and clarify the situation. The Ministry of Labor and Social Security, through the General Labor Inspectorate, is empowered to request this information and sanction non-compliance.

Despite differences in legislative design, the three laws share some common elements. One of them, central to the transparency requirement, is that they transform the “black box” of algorithms into a regulated object: they require companies to document how these systems work, inform the affected parties, and provide information to national authorities, in this case labor authorities.

None of the laws require the companies to publish source code or disclose commercially sensitive formulas to the general public. What they require is a clear and comprehensible explanation — in language accessible to workers — of which variables are considered, how they are combined, and what consequences their operation has for workers. They also empower national authorities to access this information for oversight. In other words, these countries have struck a balance between protecting certain business interests and guaranteeing labor rights, without accepting algorithmic opacity as an inevitable fact.

The examples from Uruguay, Chile and Mexico show that countries in the region — with varying economic and political clout and facing significant power imbalances with large global platforms — can impose clear rules on big tech when their algorithms affect fundamental rights. All three identified competent authorities, defined obligations, and established concrete sanctions for non-compliance.

Far from being mere statements of principle, these laws demonstrate that algorithmic transparency can be a legally enforceable obligation, mandated and overseen even by smaller states facing global tech giants. These precedents also open an unavoidable discussion for other platforms whose impact is not limited to the workplace, but directly affects people’s rights to information and freedom of expression.


RELATED LINKS: 

Law 21.431 (Chile) – Amends the Labor Code to regulate the contracts of workers in digital service platform companies

Federal Labor Law (Mexico)

Law 20.396 (Uruguay) – Establishes minimum levels of protection for workers who provide services through digital platforms
