<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:media="http://search.yahoo.com/mrss/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:dcterms="http://purl.org/dc/terms/" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/" version="2.0">
  <channel>
    <title><![CDATA[Ara in English - algorithms]]></title>
    <link><![CDATA[https://en.ara.cat/etiquetes/algorithms/]]></link>
    <description><![CDATA[Ara in English - algorithms]]></description>
    <language><![CDATA[en]]></language>
    <ttl>10</ttl>
    <atom:link href="http://en.ara.cat:443/rss-internal" rel="self" type="application/rss+xml"/>
    <item>
      <title><![CDATA["Digital misogyny is a business model"]]></title>
      <link><![CDATA[https://en.ara.cat/feminisms/online-misogyny-is-business-model_128_5671835.html]]></link>
      <description><![CDATA[<p><img src="https://static1.ara.cat/clip/5d445bf9-a369-4d61-924f-6e075b5ce07f_16-9-aspect-ratio_default_0.jpg" /></p><p>For years, sociologist Elisa García-Mingo has immersed herself in the study of the machosphere: online communities that spread misogynistic content, whether through mocking comments and messages or by asking artificial intelligence to manipulate photographs of women to undress them.</p>]]></description>
      <dc:creator><![CDATA[Marta Rodríguez Carrera]]></dc:creator>
      <guid isPermaLink="true"><![CDATA[https://en.ara.cat/feminisms/online-misogyny-is-business-model_128_5671835.html]]></guid>
      <pubDate><![CDATA[Sun, 08 Mar 2026 09:00:38 +0000]]></pubDate>
      <media:content url="https://static1.ara.cat/clip/5d445bf9-a369-4d61-924f-6e075b5ce07f_16-9-aspect-ratio_default_0.jpg" type="image/jpeg"/>
      <media:title><![CDATA[The sociologist Elisa García Mingo.]]></media:title>
      <media:thumbnail url="https://static1.ara.cat/clip/5d445bf9-a369-4d61-924f-6e075b5ce07f_16-9-aspect-ratio_default_0.jpg"/>
      <subtitle><![CDATA[Sociologist and researcher of the male sphere]]></subtitle>
    </item>
    <item>
      <title><![CDATA[Stop digital sexism]]></title>
      <link><![CDATA[https://en.ara.cat/editorial/stop-digital-sexism_129_5671548.html]]></link>
      <description><![CDATA[<p><img src="https://static1.ara.cat/clip/5b5fa5b8-a4e2-4f6d-8868-2b5b45b94742_16-9-aspect-ratio_default_0.jpg" /></p><p>We commemorate March 8th again this year with almost the same demands as ever, or even worse ones, insisting that we do not want to take steps backward. Feminism resembles Sisyphus, pushing a boulder uphill over and over again. Today, one of the factors making progress on rights, and the consolidation of those already won, more difficult is the digital world: not only does it carry a sexist bias, but by its very nature it amplifies and multiplies that bias. As we explain in today's dossier, the examples are numerous and the reasons obvious. One is that the digital universe is dominated by white heterosexual men, both the owners of the main companies in the sector and those who program and work in them. These men train the algorithms with an inherent gender bias, based on their own interests and concerns, and they neither find it strange that this bias exists nor know how to detect it.</p>]]></description>
      <dc:creator><![CDATA[Editorial]]></dc:creator>
      <guid isPermaLink="true"><![CDATA[https://en.ara.cat/editorial/stop-digital-sexism_129_5671548.html]]></guid>
      <pubDate><![CDATA[Sat, 07 Mar 2026 18:56:31 +0000]]></pubDate>
      <media:content url="https://static1.ara.cat/clip/5b5fa5b8-a4e2-4f6d-8868-2b5b45b94742_16-9-aspect-ratio_default_0.jpg" type="image/jpeg"/>
      <media:title><![CDATA[A woman using a computer in a stock image.]]></media:title>
      <media:thumbnail url="https://static1.ara.cat/clip/5b5fa5b8-a4e2-4f6d-8868-2b5b45b94742_16-9-aspect-ratio_default_0.jpg"/>
      <subtitle><![CDATA[]]></subtitle>
    </item>
    <item>
      <title><![CDATA[Algorithms are also sexist: this is how they amplify discrimination against women]]></title>
      <link><![CDATA[https://en.ara.cat/society/algorithms-are-also-sexist-this-is-how-they-amplify-discrimination-against-women_130_5670708.html]]></link>
      <description><![CDATA[<p><img src="https://static1.ara.cat/clip/e8fbfb64-c303-487d-9f82-b223ef46b9c7_16-9-aspect-ratio_default_0.jpg" /></p><p>Did you know that algorithms decide which ads we see on social media? It might seem insignificant, but it's not. "If you're a young person about to decide on a university degree, it's very likely that if you're a guy, social media will show you degrees in engineering and computer science, and if you're a girl, degrees in education, nursing, and caregiving," says Liliana Arroyo Moliner, PhD in sociology and director of the Chair for Socially Responsible Digital Innovation. Milagros Sainz, a researcher and professor at the Open University of Catalonia (UOC), explains that the popular Google Maps app "uses a man's pace to calculate walking distances, which often makes women or people with mobility issues take longer." There is also discrimination when looking for work, in healthcare, and in the algorithms banks use to decide whether or not to grant a loan, because the mathematical models behind these algorithms are applied across very different fields, and many decisions are made based on them.</p>]]></description>
      <dc:creator><![CDATA[Thais Gutiérrez]]></dc:creator>
      <guid isPermaLink="true"><![CDATA[https://en.ara.cat/society/algorithms-are-also-sexist-this-is-how-they-amplify-discrimination-against-women_130_5670708.html]]></guid>
      <pubDate><![CDATA[Fri, 06 Mar 2026 19:00:27 +0000]]></pubDate>
      <media:content url="https://static1.ara.cat/clip/e8fbfb64-c303-487d-9f82-b223ef46b9c7_16-9-aspect-ratio_default_0.jpg" type="image/jpeg"/>
      <media:title><![CDATA[Algorithms are a reflection of society]]></media:title>
      <media:thumbnail url="https://static1.ara.cat/clip/e8fbfb64-c303-487d-9f82-b223ef46b9c7_16-9-aspect-ratio_default_0.jpg"/>
      <subtitle><![CDATA[The systems that underlie browsers, social networks, and all applications reproduce and amplify society's gender biases, offering a very unequal view of the world.]]></subtitle>
    </item>
    <item>
      <title><![CDATA["There is nostalgia for a pure love"]]></title>
      <link><![CDATA[https://en.ara.cat/lifestyle/there-is-nostalgia-for-pure-love_128_5541881.html]]></link>
      <description><![CDATA[<p><img src="https://static1.ara.cat/clip/18c4c9d8-c6a6-4b4e-be9c-0e36456803bb_16-9-aspect-ratio_default_0.jpg" /></p><p>She is the first person in her family to attend university. She graduated in English Philology and became interested in surveillance studies in modern literature. A Fulbright scholarship brought her to the United States in 2014. She currently teaches at Barnard College, Columbia University, a center for gender studies. Her thesis is titled <em>Algorithmic love, reorganization of romantic love</em>.</p>]]></description>
      <dc:creator><![CDATA[Carla Turró]]></dc:creator>
      <guid isPermaLink="true"><![CDATA[https://en.ara.cat/lifestyle/there-is-nostalgia-for-pure-love_128_5541881.html]]></guid>
      <pubDate><![CDATA[Mon, 27 Oct 2025 06:01:04 +0000]]></pubDate>
      <media:content url="https://static1.ara.cat/clip/18c4c9d8-c6a6-4b4e-be9c-0e36456803bb_16-9-aspect-ratio_default_0.jpg" type="image/jpeg"/>
      <media:title><![CDATA[Sandra Moyano, professor at Barnard College, Columbia University.]]></media:title>
      <media:thumbnail url="https://static1.ara.cat/clip/18c4c9d8-c6a6-4b4e-be9c-0e36456803bb_16-9-aspect-ratio_default_0.jpg"/>
      <subtitle><![CDATA[Professor at Barnard College, Columbia University]]></subtitle>
    </item>
  </channel>
</rss>
