
Like, Share, Divide: How Algorithms Build Walls, Not Bridges

  • Dec 2, 2024
  • 4 min read

Updated: Apr 3

by Florence Kim


Visual concept by Florence Kim and generated by Midjourney

I was 15 when we first spoke about the internet. The "web" evoked grand ideas of limitless knowledge. I imagined a world where every encyclopedia would be at my fingertips, where access to information would be democratized, where learning would be boundless. The real question was how we would digest all this knowledge. Instead, we ended up drowning in a flood of (mis)information, scrolling through cat videos. Which is fine, don’t get me wrong, but we are now so oversaturated, so overfed with information, that critical thinking is no longer about finding knowledge but about filtering what is worth our time.


The internet was supposed to connect us, to break down barriers, to expose us to diverse ideas. Instead, it has built walls, reinforced biases and trapped us in echo chambers where algorithms curate our reality based on what we already believe.


Algorithms Are Not Neutral

There’s a dangerous misconception that algorithms are unbiased, neutral tools. They aren’t. They reflect human biases, the data they are fed and the objectives they are designed to achieve. Algorithms don’t seek truth, they optimize engagement. And engagement thrives on outrage, controversy and affirmation—not nuance, complexity or doubt.


Every like, share and comment feeds the system, shaping what we see next. Over time, our feeds stop being a reflection of the world and become a mirror of ourselves, reinforcing our perspectives, shielding us from discomfort, ensuring we never have to confront ideas that challenge our views.
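To make that loop concrete, here is a deliberately tiny sketch. Everything in it is hypothetical (the topics, the numbers, the update rule); no real platform's ranker looks like this, but the incentive structure is the same: show whatever the model predicts we will engage with, and update that prediction on every like.

```python
import random
from collections import Counter

def run_feed(rounds=50, seed=0):
    """Toy engagement-maximizing feed. Purely illustrative:
    'affinity' stands in for a learned engagement prediction."""
    rng = random.Random(seed)
    topics = ["politics_a", "politics_b", "sports", "science"]
    affinity = {t: 1.0 for t in topics}           # the ranker's estimates
    preference = {"politics_a": 0.9, "politics_b": 0.1,
                  "sports": 0.5, "science": 0.5}  # the user's fixed beliefs
    shown = []
    for _ in range(rounds):
        # Show whatever the model predicts we will engage with most.
        top = max(topics, key=lambda t: affinity[t])
        shown.append(top)
        engaged = rng.random() < preference[top]
        # Every like feeds the system; nothing in the update
        # rewards diversity, accuracy or challenge.
        affinity[top] *= 1.2 if engaged else 0.8
    return shown

history = run_feed()
print(Counter(history))
```

In runs of this toy model, the feed rapidly locks onto the one topic the user already agrees with; the low-affinity topics simply stop being shown, because nothing in the objective ever asks for them.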


This goes beyond social media. AI-driven content curation narrows our exposure to information. Online courts, facial recognition software, hiring algorithms—these technologies are not just predicting our behavior but determining our rights and opportunities. They define who gets hired, who gets flagged as a security threat, whose loan gets approved. And they do so with all the biases that exist in society baked into their logic.


Technology Deepens Segregation

Algorithms don’t just divide us ideologically, they reinforce social, racial and gender disparities.

  • Facial recognition software disproportionately misidentifies people of color, leading to wrongful arrests.

  • Online hiring tools discriminate against women and minorities, filtering out resumes based on historical biases.

  • Even language translation carries hidden biases. Hungarian, for example, is a gender-neutral language, yet when AI translates "professor" into English, it often defaults to "he." When it translates "assistant," it defaults to "she." These are not neutral errors, they are reflections of systemic inequalities.
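The hiring case can be sketched in a few lines. All data and keywords below are invented for illustration; real screening tools are far more complex, but the failure mode is the same: a scorer "trained" on past decisions learns whatever proxies correlated with them, including proxies for gender.

```python
# Hypothetical historical data: past hiring decisions that happened
# to favor one group. The hobby keyword acts as a proxy for gender.
historical_hires = [
    {"keywords": {"football", "java"}, "hired": True},
    {"keywords": {"java", "chess"}, "hired": True},
    {"keywords": {"netball", "java"}, "hired": False},   # biased past decision
    {"keywords": {"java", "volunteering"}, "hired": False},
]

# "Training": a keyword's weight is how often it appeared in hires
# minus how often it appeared in rejections.
weights = {}
for resume in historical_hires:
    for kw in resume["keywords"]:
        weights[kw] = weights.get(kw, 0) + (1 if resume["hired"] else -1)

def score(keywords):
    """Rank a new resume by the learned keyword weights."""
    return sum(weights.get(kw, 0) for kw in keywords)

# Two equally qualified candidates; only the hobby differs.
print(score({"java", "football"}))
print(score({"java", "netball"}))
```

The qualifications are identical, yet the scorer rewards one candidate and penalizes the other, because it has learned the historical pattern rather than the job requirements, and it will now replay that pattern automatically, at scale.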


In the digital world, these biases don’t just persist, they scale. They become embedded in code, invisible yet omnipresent, shaping perceptions, reinforcing old stereotypes and widening the gaps we should be closing.


The Death of Critical Thinking

One of the biggest casualties of algorithm-driven reality is critical thinking.


We consume content that confirms what we already believe. The more we agree, the more we engage. The more we engage, the more we see similar content. The more we see similar content, the more we believe it represents reality.


We don’t debate anymore. We affirm.


When was the last time an algorithm recommended something that truly challenged you? A book that made you question your worldview? An article that forced you to rethink a position? Instead, we get what’s relevant to us, which is just another way of saying what we already think.


Conformism is the byproduct of this digital ecosystem. If everyone is exposed to the same narratives, speaking the same way, using the same expressions, then where is the space for new ideas? The internet was meant to bring diversity of thought, but it increasingly leads to sameness, stripping us of the discomfort necessary for growth.


Isolation in the Age of Connection

And yet, for all this connectivity, we are more disconnected than ever. We were told AI would bring us closer, but what does it say about us that more and more people are paying for AI-generated partners? That human connection is being outsourced to code? That loneliness is being monetized?


Technology is isolating us in ways we never anticipated. We work remotely. We shop online. We interact through screens. Human interactions are replaced by automated responses, AI-generated voices and chatbot customer service. We are together but alone, connected but disconnected.


AI Should Help Us Understand Each Other

This isn’t the future we were promised. AI should bridge divides, not deepen them. It should broaden our perspectives, not narrow them. It should be a tool for understanding, not a mechanism for segregation.


We need AI that challenges us, surprises us and forces us to think outside our own biases. That doesn’t just optimize for engagement but prioritizes diversity of thought, meaningful debate and exposure to different perspectives. We need AI that enhances human connection, not replaces it.


At the moment, we are (dis)connected. It’s time to rethink how we build bridges instead of walls.


This article is original work by Florence Kim. If you wish to quote or reference it, please attribute accordingly. For direct quotes, please cite '[article title]' by Florence Kim, aidvocacy.org, [year]. For online references, kindly include a link to the original article.
