Elisa García-Mingo: "Digital misogyny is a business model"
Sociologist and researcher of the machosphere
Barcelona
For years, sociologist Elisa García-Mingo has immersed herself in the study of the machosphere, the online communities that spread misogynistic content, whether through ridiculing comments and messages or through artificial intelligence tools used to manipulate photographs of women and undress them.
Is the machosphere traditional sexism on social media?
— It's a mix. In the late 90s, sexism began to migrate to the digital space, initially in forums and communities where men shared ideas and grievances, but in recent years it's primarily been found in more closed communities and debates. When we started researching it six or seven years ago, it had a more ideological and political component. Today, it's also a business model. After the pandemic, there was a boom on TikTok, Twitch, and other platforms, and the creation of misogynistic content has become professionalized. What remains is the misogynistic culture, but the format and strategy change.
But does it have its own dynamics?
— Yes, because it responds to the logic of digital culture. The internet allows for rapid organization, anonymous action, and the coordination of mass attacks. The experience of digital violence is different from that of analog violence: as a victim, you often don't know who is attacking you or from where, and you can receive hundreds of messages in a very short time, from multiple sources, even from automated accounts. That's why we speak of organized digital misogyny. The very organizational capacity that social movements or feminism have used has also been used to coordinate violence.
What messages does it send?
— Within the machosphere, a shared culture is constructed through recurring messages. It is said that contemporary feminism is radical and is destroying society, that it is destroying the traditional family, endangering children, and that men are the ones who truly suffer. There is talk of a feminist dictatorship. There are also biological discourses about testosterone, male strength, and the need to reclaim a traditional masculinity. These are ideas that are constantly repeated.
What does it offer these aggrieved men?
— For many men, it's a space perceived as safe. They feel that outside of that environment they might be questioned or canceled, while within the machosphere they can express sexist ideas without criticism. It's a space of validation, even if the discourse is violent.
Do the algorithms used by social media platforms drive these messages?
— Absolutely. To understand these changes, it's necessary to look at algorithmic mediation. Platforms have been refining their recommendation and moderation algorithms. Before, the algorithm recommended content, and mediation was minimal; now it's central. Algorithms decide what you see and what you don't, and they're becoming increasingly complex. Technology is insecure and toxic, and it has the potential to create a more hateful and violent population because everything circulates faster, fostering a culture of humiliation.
You were talking about business. Is misogyny a business niche?
— Algorithms play a fundamental role. The platforms' business model is based on attention. Polarizing content generates more interaction and more screen time, which translates into higher revenue. That's why increasingly extreme content is often recommended. Echo chambers are created in which you end up consuming similar messages repeatedly. With just one view, or after a few repetitions, the system starts recommending similar content. Furthermore, researchers don't have access to the black box of algorithms; it's the secret to their business.
Does that explain why we see these kinds of messages even without following the influencers?
— Algorithms segment the population, moderate, and recommend content to improve the user experience. Amazon popularized the first major recommendation algorithms with "Those who bought this also bought that." This is very appealing because it personalizes the experience. However, it also creates echo chambers: you end up constantly consuming similar messages.
A real loop.
— We've seen that teenagers can quickly become exposed to misogynistic or hateful content, even without seeking it out. We're talking about technologies that can be toxic by design, because they ultimately lead people to this kind of content. This connects to what we call a culture of humiliation, in which not only women are dehumanized, but also other vulnerable groups.
And the tradwives? Do these women influencers who promote the role of women in the home form part of this machosphere?
— Antifeminist women are also part of this ecosystem and this business. Many promote a return to traditional femininity, yet they are businesswomen. The interesting thing is what's at stake. The far right wants women to stop being the driving force of resistance and for feminism to become divided, because it knows that sexism has reached its limit. The area where this reactionary wave can expand is among young women. In fact, this is the group among whom antifeminism has grown the most.
And then there's artificial intelligence for undressing women or creating pornography.
— It's no coincidence that tech companies allied with the far right are fueling hatred. It's a process of dehumanization, which goes hand in hand with fascism. They present women as objects to be mocked. It's shameful!
Is ending this practice a losing battle?
— Today there are many more initiatives than seven years ago. There are European regulations, strategies for protecting children online, educational projects, algorithm audits, free software initiatives, and alternative internet projects. The problem is that the large platforms are very powerful and their business model is enormous. But there is greater awareness and a stronger response than before.