Overview:
This piece explores how social media algorithms shape not only our feeds but also our worldviews. It highlights the dangers of echo chambers and the influence of digital power structures, and it offers a human-centered reflection on how we can reclaim our ability to think freely in a tech-driven world.
It feels like we’re choosing what to believe — but what if we’re not?
Scrolling
Every scroll, like, and click we make online is tracked, sorted, and fed back to us in a carefully curated stream of content. This isn’t just about convenience anymore; it’s about power. The power to influence what we see, what we think, and ultimately… what we believe. Welcome to the politics of the algorithm.
Social media platforms like Instagram, TikTok, X (formerly Twitter), and Facebook are no longer neutral playgrounds where ideas are exchanged freely. Instead, they’ve evolved into battlegrounds where algorithms decide which voices rise and which fall silent. These invisible code-driven forces shape our reality daily, often without us even realizing it.
Complexity
If you’ve ever felt like your social media feed “just gets you,” that’s because it does — but not in a wholesome, intuitive way. Behind the scenes, complex algorithms analyze everything you do: what you linger on, what you scroll past, what you share, and even what you ignore. They build detailed profiles that predict what will keep you glued to your screen. The goal isn’t to inform or educate, but to maximize engagement. Outrage, fear, and sensationalism outperform calm, reasoned discourse every time.
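To make that concrete, here is a deliberately oversimplified sketch in Python of what ranking purely for engagement can look like. It is not any platform's real code; the post fields, weights, and predictions are all invented for illustration. The point is structural: nothing in the score rewards accuracy or nuance, only the likelihood of a reaction.

```python
# Toy engagement-based feed ranking (illustrative only, not any real system).
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_click: float    # hypothetical model outputs between 0 and 1
    predicted_share: float
    predicted_outrage: float  # strong emotional reactions tend to drive engagement

def engagement_score(post: Post) -> float:
    # Invented weights: a share or an angry reaction counts for more than a quiet click.
    return 1.0 * post.predicted_click + 2.0 * post.predicted_share + 3.0 * post.predicted_outrage

def rank_feed(posts: list[Post]) -> list[Post]:
    # The feed is simply the pool sorted by predicted engagement, highest first.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Calm, well-sourced explainer", 0.20, 0.05, 0.01),
    Post("Outrageous hot take", 0.60, 0.40, 0.80),
])
print([p.text for p in feed])  # the hot take rises to the top
```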
The consequences of this are far-reaching. Instead of exposing us to a wide spectrum of ideas, these algorithms trap us inside echo chambers. They create digital bubbles where our existing beliefs are constantly reinforced, and opposing views are filtered out. This creates a dangerous feedback loop of confirmation bias, distorting our understanding of the world. We start to think the reality presented by our feeds is the only reality. This affects not only our personal opinions but the fabric of society itself.
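One way to see how quickly that loop can harden a view is a tiny simulation. The numbers below are made up and no real platform works this literally, but the pattern it encodes (agreeing content is shown more often, and each agreeing item nudges the belief a little further) is the feedback loop in miniature.

```python
# Toy sketch of the confirmation-bias feedback loop (illustrative only).
import random

random.seed(1)
belief = 0.1          # the user's stance on some issue, from -1 to +1
pool = [random.uniform(-1, 1) for _ in range(500)]  # available content

for _ in range(200):
    # Personalization: content that agrees with the current stance is far more likely to be shown.
    weights = [4.0 if item * belief > 0 else 1.0 for item in pool]
    item = random.choices(pool, weights=weights, k=1)[0]
    # Consuming agreeing content nudges the belief further in the same direction.
    if item * belief > 0:
        belief = max(-1.0, min(1.0, belief + 0.02 * (1 if belief > 0 else -1)))

print(f"Belief after 200 items: {belief:+.2f}")  # drifts toward an extreme
```

In this toy setup the drift toward an extreme happens regardless of the random seed, because the bias is built into the selection rule rather than into any particular piece of content.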
Attention
What makes this phenomenon even more troubling is that the algorithms are not unbiased. They are designed, maintained, and tweaked by companies whose primary motivation is profit. The currency of the internet age is attention, which is a scarce, valuable resource. The louder and more emotionally charged a post is, the more likely it is to capture attention and generate clicks — and therefore revenue. This is why political actors, content creators, and even foreign interference campaigns exploit these systems. They pump out divisive, sensational content designed to “go viral.” These aren’t accidents or glitches; rather, they are calculated strategies to influence public opinion and behavior.
Politicians have learned to weaponize the algorithm, using it to bypass traditional media filters. They connect directly with audiences through targeted messaging. The result is a fractured public discourse where misinformation spreads faster than facts, and outrage is amplified over understanding. In this environment, entire communities can be easily manipulated or radicalized without ever realizing it.
Powers that be
Despite the grim picture, we are not powerless. Recognizing the forces at play is the first step to regaining control over what we believe. Being intentional about how we consume information matters now more than ever. This means diversifying your sources and actively seeking out perspectives that challenge your views. It also means questioning content before accepting it as truth. Social media is not inherently bad; rather, it is the way it is currently engineered that creates problems.
On a personal level, mindfulness can be a powerful tool. Notice when a post triggers an immediate emotional reaction, especially anger or fear. Take a moment to fact-check before sharing or engaging. Resist the urge to outsource your thinking to algorithms or trending topics. Algorithms should be tools that support our curiosity and critical thinking, not oracles that decide our beliefs.
Accountability
At the societal level, there is a growing push for transparency and accountability in how platforms design their algorithms. Advocates argue for clearer explanations of why certain content is shown, options to customize feeds, and measures to reduce the spread of harmful misinformation. Some platforms have started experimenting with these ideas, but progress is slow and often met with resistance from stakeholders invested in the status quo.
Understanding the politics of the algorithm means understanding that technology is not neutral. It reflects the values and incentives of those who build and control it. In the hands of a few powerful companies, algorithms shape the information ecosystem on which democracy, culture, and public life depend. This raises urgent ethical questions about who gets to decide what counts as news, and about which voices are amplified or silenced.
Creators
Ultimately, the responsibility doesn’t lie solely with the platforms or the creators of algorithms. It’s a collective challenge that requires digital literacy, critical thinking, and civic engagement from all of us. The next time you scroll through your feed, pause and ask: Is this what I believe? Or is this what I’ve been shown to believe? Because there’s a big difference — and understanding that difference might just be the key to reclaiming our autonomy in a digital age dominated by algorithms.
By: Mari Y.L.

