The Puppet Show: When Foreign Bots Masquerade as Your Neighbours
Been having one of those conversations lately that makes you question everything you see online. You know the type – where someone mentions how they’ve been getting friend requests from celebrities on Facebook, and suddenly everyone’s chiming in with their own bizarre stories. Mel Gibson wanting to be mates, Stephen Miller sliding into DMs, even Ryan Gosling’s mum apparently making the rounds. It’s almost comical until you realise what’s actually happening beneath the surface.
The discussion started around how Russian and Iranian troll farms are essentially running the show in MAGA online spaces, but it’s opened up this much bigger conversation about how artificial our entire digital landscape has become. Someone mentioned getting friend requests from half the LA Dodgers roster after commenting on a single article. Another person joked about Sandra Bullock reaching out because their “profile caught her eye.” We’re all laughing, but there’s something deeply unsettling about it all.
What really struck me was when someone pointed out how Facebook transforms hacked accounts into AI personas. They watched their own old account get hijacked and turned into some fake person with generated photos. It’s like watching identity theft happen in slow motion, except the stolen identity becomes this manufactured person designed to manipulate others.
The scale of this manipulation is staggering. People are describing comment sections with tens of thousands of responses that all sound eerily similar, accounts that post identical food photos across multiple profiles, and entire networks of fake families complete with spouses, babies, and dogs. It’s an elaborate theatre production where the audience doesn’t know they’re watching actors.
Living through the recent election cycle here in Australia, I watched similar patterns emerge in our own political discussions. The timing was too perfect, the talking points too coordinated. During major news events, there’d be this strange lull online – like someone hit pause on the conversation while scripts were being rewritten. Then suddenly, a flood of identical perspectives would appear, all pushing the same narrative with slightly different words.
What’s particularly insidious is how this artificial amplification makes fringe views appear mainstream. Someone mentioned how Trump’s military parade looked so poorly attended because the online enthusiasm didn’t translate to real bodies in the street. The bots can’t march, can’t show up to rallies, can’t actually vote. But they can make a small movement look massive, convincing both supporters and opponents that there’s more support than actually exists.
The technology aspect fascinates and terrifies me in equal measure. We’re essentially watching the Turing test being passed in real-time across social media platforms. These aren’t the clunky chatbots of a few years ago – they’re sophisticated enough that people are having extended conversations without realising they’re talking to algorithms. When researchers deployed AI bots in subreddits and nobody noticed until the study was published, it became clear we’ve crossed a threshold we can’t uncross.
What bothers me most is how this preys on our fundamental human need for connection and validation. These platforms were supposed to bring us together, help us share ideas and stay connected with friends and family. Instead, they’ve become psychological warfare zones where our emotions and beliefs are commodities to be harvested and manipulated by whoever can afford the most sophisticated bot army.
The environmental implications keep me up at night too. All this artificial engagement requires massive computing power, contributing to our carbon footprint while simultaneously undermining democratic discourse. We’re literally burning fossil fuels to generate fake conversations designed to make us angrier at each other.
But here’s what gives me hope: people are starting to notice. The comments I’ve been reading show increasing awareness of these manipulation tactics. People are learning to spot the patterns – the generic usernames, the coordinated timing, the shallow responses that fall apart under scrutiny. Someone mentioned how blocking obvious bot accounts actually changes the entire tone of comment sections, revealing the real human conversations underneath.
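The patterns people are learning to spot – generic usernames, coordinated timing, near-identical shallow responses – are concrete enough that you can sketch them as rough heuristics. The following is purely illustrative (the regex, the 60-second burst window, and the function names are my own assumptions, not any platform’s actual detection logic), but it shows how mechanical these tells really are:

```python
import re
from collections import Counter

def looks_generic(username: str) -> bool:
    """Heuristic: flag names like 'jane84731' or 'user_29471' --
    a word followed by a long digit tail, a common auto-generated pattern."""
    return bool(re.search(r"[a-z]+[_\.]?\d{4,}$", username.lower()))

def burst_score(timestamps: list[float], window: float = 60.0) -> float:
    """Fraction of posts landing within `window` seconds of another post.
    Coordinated accounts tend to fire in tight bursts; organic
    conversation spreads out. The 60s window is an arbitrary example."""
    ts = sorted(timestamps)
    in_burst = sum(
        1 for i, t in enumerate(ts)
        if (i > 0 and t - ts[i - 1] <= window)
        or (i < len(ts) - 1 and ts[i + 1] - t <= window)
    )
    return in_burst / len(ts) if ts else 0.0

def duplicate_ratio(comments: list[str]) -> float:
    """Share of comments that are near-verbatim repeats of another comment,
    after collapsing case and whitespace."""
    normalised = [" ".join(c.lower().split()) for c in comments]
    counts = Counter(normalised)
    dupes = sum(n for n in counts.values() if n > 1)
    return dupes / len(comments) if comments else 0.0
```

None of these signals is proof on its own – plenty of real people have digit-heavy usernames – but when an account trips all three at once, that’s usually when blocking it improves the whole thread.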
The solution isn’t to abandon these platforms entirely, though I understand the temptation. We need better digital literacy, stronger verification systems, and perhaps most importantly, regulations that treat social media manipulation as the serious threat to democracy that it is. Every fake account spreading misinformation should be seen as a form of electoral interference, regardless of whether it’s pushing left- or right-wing talking points.
Until then, we need to approach our online interactions with the same scepticism we’d apply to a door-to-door salesman. Question the timing, check the source, and remember that if Stephen Miller really wants to be your Facebook friend, something’s probably not quite right with the world.
The internet was supposed to democratise information. Instead, it’s become the most powerful tool for manufacturing consent that’s ever existed. But maybe, just maybe, if enough of us start paying attention to the puppet strings, we can start having real conversations again.