March 8th International Women's Day

Algorithms are also sexist: this is how they amplify discrimination against women

The systems that underlie browsers, social networks, and all applications reproduce and amplify society's gender biases, offering a very unequal view of the world.

Algorithms are a reflection of society

Barcelona. Did you know that algorithms decide what ads we see on social media? It might seem insignificant, but it's not. "If you're a young person about to decide on a university degree, it's very likely that if you're a guy, social media will show you degrees in engineering and computer science, and if you're a girl, degrees in education, nursing, and caregiving," says Liliana Arroyo Moliner, PhD in sociology and director of the Chair for Socially Responsible Digital Innovation. Milagros Sainz, a researcher and professor at the Open University of Catalonia (UOC), explains that the popular Google Maps app "uses a man's pace to calculate walking distances, which often makes women or people with mobility issues take longer." There is also discrimination when looking for work, in healthcare, and in the algorithms banks use to decide whether or not to grant a loan, because the mathematical models behind these algorithms are applied in very different fields, and many decisions are made on their basis.

A 2025 investigation reported by the newspaper The Times examined LinkedIn by running a test on the network with identical posts made by men and women. The result: women's posts reached only 0.6% of their followers, while men's could exceed 50%. Another frequently cited example is recruitment: 65% of companies already use AI during the personnel selection process. In 2018, alarm bells rang when it was revealed that the algorithm Amazon used for personnel selection discarded resumes with characteristics it interpreted as feminine. "The problem was that the data it was trained on came mostly from men, and the algorithm learned that the ideal candidates were male," notes Ana Freire, an engineer, PhD in computer science and professor at the UPF Barcelona School of Management (UPF-BSM).

A study conducted last year by University College London and the University of Kent emphasized how social media algorithms amplify misogynistic content among younger users, especially on TikTok. Image and voice recognition apps also identify and classify the faces of white men better, "because most of the teams training the technology are white men," says Freire. And audio analysis struggles to recognize higher-pitched voices, which primarily affects women when they interact with these systems. These are just a few practical examples of gender bias in internet algorithms and artificial intelligence (AI).

Some historical biases

Every scientific and technological advance throughout human history has been accompanied not only by surprise, controversy, and debate, but also by gender bias. "Throughout the history of science and technology, women have been made invisible in many aspects, and the same thing is happening now," explains Nadia Alonso, professor and researcher in the Department of Audiovisual Communication, Documentation, and Art History at the Polytechnic University of Valencia. She emphasizes that the gender biases present in the algorithms that power search engines and artificial intelligence "reflect the prejudices that exist in society."

Algorithms are not neutral because they learn from the real world, and the real world is unequal. When an artificial intelligence system is trained with data, patterns, and previous human decisions, it can reproduce—and even amplify—existing stereotypes and discrimination, such as the underrepresentation of women in many areas. "What algorithms do is reproduce existing biases and amplify them," notes Arroyo Moliner.

For AI to function, it must be fed data, and this data is inherently biased. "AI doesn't generate biases; rather, it perpetuates the biases of the data generated in our society; we are the ones who create these biases. Algorithms learn as a young child might, growing up and naturally acquiring these biases," adds Ana Freire. And it's not just the training data: user interactions, design decisions, and the feedback humans provide through their responses also contribute to perpetuating these inequalities.
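The mechanism the experts describe (a model inheriting the skew of its training data) can be sketched in a few lines. This is a deliberately simplified, hypothetical illustration, not any real company's system: the "model" scores a resume by how often its words appeared among past hires, so a vocabulary marker associated with women, underrepresented in the historical data, lowers an otherwise identical candidate's score.

```python
# Minimal, hypothetical sketch of how a model trained on skewed
# historical hiring data reproduces that skew. The "model" scores a
# resume by the average frequency of its words among past hires.
from collections import Counter

# Historical hires: mostly men, so their vocabulary dominates.
past_hires = [
    "chess club captain engineering",
    "engineering lead chess club",
    "engineering degree robotics",
    "women's chess club captain engineering",
]

word_counts = Counter(w for resume in past_hires for w in resume.split())

def score(resume: str) -> float:
    """Average frequency of the resume's words among past hires."""
    words = resume.split()
    return sum(word_counts[w] for w in words) / len(words)

# Two identical candidates; only one resume contains "women's".
neutral = "chess club captain engineering"
flagged = "women's chess club captain engineering"

print(score(neutral) > score(flagged))  # prints True: the marker lowers the score
```

No one programmed a rule against the word "women's"; the penalty emerges purely from the imbalance in the historical data, which is exactly the pattern uncovered in the 2018 Amazon case described above.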

Ana Albalat-Mascarell, a professor of applied linguistics, also at the Polytechnic University of Valencia, explains that "algorithms collect data from a context where everything feminine is perceived as inferior to the masculine, due to existing prejudices, and then categorize the feminine at a lower level." The fact that the algorithm is not neutral surprises many people, because being based on mathematical models gives it a certain presumption of objectivity. Data scientist Cathy O'Neil, author of the book Weapons of Math Destruction, is a specialist in this area and has explained on several occasions how mathematical algorithms are not neutral, how they can be skewed by biases (gender, origin, education, social class), and how this ultimately impacts our lives. "Anyone who thinks that algorithmic biases don't affect them is very mistaken. This affects everyone, and it's crucial to be aware of it," Alonso warns.

More women in creative teams

To correct these biases—whether based on gender, age, ethnicity, or class—a collective effort is necessary. Liliana Arroyo proposes spaces for co-creating technology, "and not a single company designing it."

Ana Freire offers a glimmer of hope, asserting that technology has tools to counteract these biases. "Humans are feeding the technology, and if we are aware of these biases, we can correct them," she says. But for this to happen, more women are needed on the teams that design the technology, validate the training data, and provide a critical perspective throughout the entire process. And that's not enough: the teams also need to be diverse (in gender, but also in age, background, education, and experience) and interdisciplinary, because, as Meritxell Beltrán, an expert on the impact of algorithms on gender equality and professor of economics and business studies at the UOC, says, "technology will be neutral if we work to make it so." Freire also proposes using "transparent" algorithms, those that let us see how they make decisions. "They are simpler algorithms, but they allow us to see the reasoning process behind a decision, such as hiring a person," Freire explains. A third measure to minimize these biases is to have an expert review the algorithm: "If there is an underrepresented group, it must be identified and its representation in the final decision must be guaranteed." This would avoid cases like the bank-loan algorithms that offered women less advantageous conditions because they relied on historical data that is no longer representative. And what can we do?
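The "transparent" algorithms Freire mentions can be illustrated with a toy interpretable scorer (all feature names and weights here are hypothetical, chosen only for the example): because the model is a simple weighted sum, an auditor can decompose any decision feature by feature and spot, for instance, a heavy penalty on a variable that acts as a proxy for gender.

```python
# Hypothetical toy example of a "transparent" (interpretable) model:
# a linear scorer whose decision can be broken down per feature,
# so a reviewer can see exactly what drove the outcome.
weights = {
    "years_experience": 0.6,
    "education_level": 0.3,
    "career_gap_years": -0.9,  # a penalty that often acts as a proxy for gender
}

def explain(applicant: dict) -> dict:
    """Return each feature's contribution to the final score."""
    return {f: weights[f] * applicant.get(f, 0) for f in weights}

applicant = {"years_experience": 5, "education_level": 4, "career_gap_years": 2}
contributions = explain(applicant)
score = sum(contributions.values())

for feature, c in contributions.items():
    print(f"{feature:>18}: {c:+.1f}")
print(f"{'total':>18}: {score:+.1f}")
```

A complex black-box model offers no such breakdown, which is the trade-off Freire points to: simpler algorithms sacrifice some predictive power in exchange for decisions that can be inspected and contested.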

As users, we are also responsible. Digital literacy is necessary to be proficient in using these tools, but above all, critical digital training is essential: "To understand what I'm using, what business models are behind it, and what kind of social narrative that result is showing me," points out Liliana Arroyo Moliner.

"We must educate society in general," warns Sainz. "We must get our act together and understand that from the moment we open a device or an AI tool, it can condition our way of thinking and acting. And we shouldn't trust the first thing it offers us; we must verify it and, if necessary, correct it," she adds. Alonso and Albalat-Mascarell urge the education of future generations: "Especially Generation Alpha, who no longer understand the world without technology, and must learn to use and understand this technology. We must eliminate biases of all kinds because otherwise they will have distorted views of reality." Liliana Arroyo makes a proposal that goes even further: that algorithms should be named after their creators, because that way we might not forget that they are made by people. Because in the end, they are still a product of humans, "with their flaws and virtues."

Networks, a way of understanding the world

Nadia Alonso and Ana Albalat-Mascarell focus on social media and its algorithms, "which dominate the world." These experts warn that "most people, especially young people, get their information through social media, which shapes how we see the world, how we understand it, and how we form our idea of what it is like. But these networks are based on algorithms that reproduce stereotypes and sexist inequalities, and therefore it can be concluded that many people end up..."

Content recommendations on social media are also biased: "The content that men receive is very different from what women receive, and even the advertising they see," notes Ana Freire, who adds that algorithms seek to personalize content and segment by population groups.

It is therefore necessary to be aware of all these premises when opening any social network and delving into its content. Nothing we see has appeared by chance, and our interpretation of reality will be entirely conditioned by these algorithms, which are mostly biased and unequal.
