Algorithms: how to make them fairer

It was the 1930s. Robert Moses, the most powerful urban planner in New York's history, ordered the construction of parkway overpasses so low that buses could not pass beneath them. Buses were the transport of the poor and of the Black population. No law was needed: the bridge was exclusionary by design and by default.

In the digital version, the same thing happens: algorithms and language models become a moral architecture. When a homogeneous team decides how a facial recognition system works, when a company's financial logic determines which voices are amplified in a feed of information, or when an interface designed without regard for diversity leaves entire groups out, the gesture is identical to that of the low bridges. Every technical decision is, at the same time, a decision about who matters and what counts as "normal".


If we look beyond incompetence or malice, what we find is the unconscious sedimentation of a way of being in the world. The philosopher Peter-Paul Verbeek puts it precisely: engineers do ethics by other means. When they design a technology, they materialize a morality. They don't declare values; they embed them. Paradoxically, in the dominant technological imaginary, ethics is a last-minute layer: a committee that reviews, a box that gets ticked. But the crucial decisions, those that determine who can speak, who is heard, who is monitored, who is left out, crystallize before the first line of code. By the time the ethics committee arrives at the end of the process, the damage is already done: the architecture is in place, and changing it costs money, time, and political will, resources that almost never materialize.


The answer cannot be simply to diversify teams, although that is a necessary and urgent strategy. We must go further and treat digital rights as fundamental rights: the right not to be discriminated against by an algorithm, the right to understand automated decisions that affect you, the right to participate in the design of the systems that structure your life. Without a framework of rights, the victims of design have no name and no possible remedy.

Consider the documented cases of young people who have died by suicide after systematic exposure to harmful content recommended algorithmically. We have seen how platform executives hide behind computation: the algorithm only does mathematics; the content is generated by users; they merely provide the infrastructure. It is the same defense Moses could have made: I only built a bridge. But, as Langdon Winner argued, artifacts have politics.


Conway's Law, "any organization that designs a system will produce a design whose structure is a copy of the organization's communication structure", reminds us that systems speak of their creators. And on the other side of the mirror, what we find is not a worldview or a collective project: there are incentives to grow without pause, to serve up the most attractive audience possible to advertisers, to keep the advertising pipeline inflated at any cost. Moses' bridges took decades to come down. Algorithms could be changed today, if corporations understood that being architects of an ethic aligned with rights and justice is not "do-gooding": it is the best way to guarantee the business in the long term.