JOTA – Opinion & Analysis — Published in March 2026

Addressing AI Bias Is a Legal Agenda, Not Just a Technological One

The rapid advancement of artificial intelligence has brought significant gains in efficiency and innovation, but it has also raised growing concerns about algorithmic bias and its impact on automated decisions affecting fundamental rights. AI systems trained on historical data may reproduce or amplify existing inequalities, a risk that demands attention not only from a technical perspective but also from a legal one.

In this context, there is increasing recognition that addressing these risks requires governance, regulation, and legal accountability—especially when algorithms are used in sensitive areas such as credit decisions, recruitment, healthcare, or judicial decision-making.

According to Juliana Sene Ikeda, partner at Campos Thomaz Advogados, combating algorithmic bias cannot be treated solely as an engineering or data science challenge. “Mitigating bias also requires a structured legal approach that includes transparency, auditing, and accountability mechanisms to ensure that AI systems do not reproduce discrimination or violate rights,” she notes.

Juliana also highlights that the global regulatory debate points to the need for more robust models of algorithmic governance, including requirements for documentation, monitoring, and human review of automated decisions. “The discussion around artificial intelligence must move beyond technology and incorporate legal principles that ensure fairness, accountability, and the protection of fundamental rights,” she states.

The debate on AI bias reinforces that technological development must advance alongside regulatory frameworks and oversight mechanisms capable of ensuring the responsible use of these tools.

