The Algorithm of Authoritarianism: When Social Media Becomes State Media
The news that broke over the weekend about TikTok being tweaked to become “100% MAGA” has been rattling around in my head like a loose screw in an old MacBook. It’s one of those stories that makes you pause mid-sip of your morning latte and wonder if we’ve finally crossed some invisible line into full dystopian territory.
What strikes me most about this whole situation isn’t just the brazen nature of it – though that’s certainly something. It’s how perfectly it illustrates a pattern we’ve been watching unfold across the digital landscape for years now: the systematic capture of information infrastructure by people who understand that controlling the narrative is far more effective than winning hearts and minds through actual policy.
I’ve been working in tech long enough to understand how algorithms work. They’re not neutral arbiters of content – they’re tools that can be finely tuned to amplify certain voices while quietly suppressing others. The sophistication of modern recommendation systems means you don’t need to hit people over the head with propaganda. You can slowly shift what they see, gradually normalising ideas that would have seemed extreme just months before.
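To make that concrete, here’s a minimal sketch in Python of how a re-ranking step could put a thumb on the scale. Everything in it is invented for illustration – the `Post` fields, the `political_lean` label, the 5% boost – none of it is any platform’s actual ranking code; it’s just a picture of how small the nudge needs to be.

```python
# Hypothetical illustration only: a tiny multiplicative "thumb on the scale"
# applied during re-ranking. Nothing here reflects any real platform's code.

from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_engagement: float  # the platform's usual relevance score, 0..1
    political_lean: float        # hypothetical label: -1.0 (left) .. +1.0 (right)

def rank_feed(posts: list[Post], bias_strength: float = 0.05) -> list[Post]:
    """Order posts by predicted engagement, with a quiet boost for one side.

    A bias_strength of 0.05 means favoured posts get at most a 5% score bump –
    far too small for any one user to notice, but enough to change what reaches
    the top of millions of feeds.
    """
    def score(post: Post) -> float:
        # Posts leaning the "favoured" way get a small multiplier; others don't.
        boost = 1.0 + bias_strength * max(post.political_lean, 0.0)
        return post.predicted_engagement * boost

    return sorted(posts, key=score, reverse=True)

# The boosted post "b" now outranks the genuinely more engaging post "a":
# 0.78 * 1.045 ≈ 0.815 beats 0.80.
feed = rank_feed([
    Post("a", predicted_engagement=0.80, political_lean=-0.5),
    Post("b", predicted_engagement=0.78, political_lean=+0.9),
])
```

The unsettling part is that a feed ranked this way looks indistinguishable from an honest one. Every post shown is still plausibly relevant; only the marginal ordering has shifted, and at the scale of millions of feeds the margins are the whole game.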
The discussion I’ve been following online really drives this home. Someone made an excellent point about how this isn’t just about creating another echo chamber like Truth Social – it’s about gaining access to audiences that wouldn’t normally seek out conservative content. Young people, particularly those leaning left politically, who might never tune into Fox News or visit right-wing websites, could find themselves slowly consuming increasingly partisan content without even realising it.
This reminds me of what happened to AM radio back in the 90s. I remember my dad dismissing it at the time – “who listens to AM anymore?” he’d say. But that dismissal allowed Rush Limbaugh and his ilk to capture an entire communication medium, creating a pipeline that fed misinformation to millions of commuters for decades. Now we’re seeing the same playbook applied to social media platforms, but with far greater reach and far more sophisticated targeting.
The pattern is becoming clear: they don’t build new platforms, they capture existing ones. First Twitter, now TikTok, and if you look closely you can see the pressure building on other platforms too. It’s not about creating spaces for their supporters – they already have those. It’s about denying platforms to everyone else while gradually shifting the entire information ecosystem rightward.
What particularly frustrates me is how this will be sold as “free speech” by the very people who screamed censorship when health authorities asked social media companies to reduce COVID misinformation. The hypocrisy is breathtaking, but it’s also strategic. They understand that most people won’t dig deep enough to see the contradiction.
From a technical perspective, the idea that the US government will control TikTok’s algorithm is genuinely terrifying. Algorithms are incredibly powerful tools for shaping perception and behaviour. In the wrong hands, they become instruments of mass manipulation. The fact that this is being done so openly, with such casual disregard for democratic norms, suggests a level of confidence that should alarm all of us.
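If you want a sense of why even a tiny nudge matters, here’s a rough, hypothetical simulation of the feedback loop: the feed slightly over-serves one kind of content, the platform’s model of the user drifts towards whatever was served, and the loop compounds. Every number in it is made up; only the shape of the dynamic is the point.

```python
# Hypothetical feedback-loop simulation: a small serving bias compounds as the
# user's modelled interest drifts towards whatever the feed over-serves.
# All parameters are invented purely to illustrate the dynamic.

def simulate_drift(days: int = 365,
                   serving_bias: float = 0.02,
                   interest_update_rate: float = 0.05) -> float:
    """Return the fraction of boosted content served on the final day.

    The user starts neutral (interest = 0.5). Each day the feed serves a mix
    of boosted content equal to the current interest plus a small fixed bias,
    and the interest model drifts a little towards whatever was served.
    """
    interest = 0.5
    served = interest
    for _ in range(days):
        served = min(1.0, interest + serving_bias)        # feed nudges the mix
        interest += interest_update_rate * (served - interest)  # model follows the feed
    return served

if __name__ == "__main__":
    for day_count in (30, 180, 365):
        print(f"after {day_count:3d} days: "
              f"{simulate_drift(days=day_count):.0%} of the feed is boosted content")
```

With those invented parameters, the boosted share of the feed climbs from roughly half to nearly ninety per cent within a year, and no single day looks dramatically different from the one before it – which is precisely what makes this kind of manipulation so hard to notice from the inside.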
Here in Australia, we often look at American politics with a mixture of fascination and horror, like watching a slow-motion car crash. But we’d be foolish to think we’re immune to these trends. Our media landscape is already heavily concentrated, and the techniques being refined in the US have a habit of crossing the Pacific.
The most depressing part of all this is how predictable it was. We’ve watched this playbook unfold across traditional media, radio, and now social media. Each time, there are warnings, each time there’s pushback, and each time it happens anyway because the people with the resources to prevent it either don’t understand the threat or don’t care enough to act.
But perhaps there’s still hope in understanding the game being played. When you can see the manipulation tactics clearly, you become harder to manipulate. When you understand how algorithms work, you can make more conscious choices about what you consume and share. When you recognise propaganda for what it is, you can seek out alternative sources of information.
The challenge now is whether we can build and maintain truly independent information systems fast enough to counter this trend. It won’t be easy – it requires not just technical solutions but also the collective will to prioritise information integrity over convenience and entertainment.
One thing’s certain: if we don’t start taking information infrastructure seriously as a democratic institution, we’ll find ourselves living in a world where our thoughts and opinions are shaped not by our own experiences and reasoning, but by algorithms designed to serve the powerful. And that’s a future none of us should accept.