The Age Verification Surveillance Monster We're Sleepwalking Into
I’ve been following the age verification debate for a while now, and honestly, every time I think it can’t get more dystopian, something new comes along to prove me wrong. This week’s revelation about Persona – an age verification vendor found to be running what amounts to a comprehensive surveillance operation – is both shocking and entirely predictable.
For those who haven’t heard, researchers discovered that Persona’s system doesn’t just verify your age. Oh no, that would be far too reasonable. Instead, it performs 269 distinct verification checks, runs facial recognition against watchlists and politically exposed persons, screens “adverse media” across 14 categories including terrorism and espionage, and assigns risk and similarity scores. They collect and can retain for up to three years your IP addresses, browser fingerprints, device fingerprints, government ID numbers, phone numbers, names, faces, and a whole battery of “selfie” analytics.
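To make the scope concrete, here is a hypothetical sketch of the kind of per-user record such a system could retain, based only on the data categories reported above. Every field and class name here is my own illustration – this is not Persona’s actual schema or code.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

# Hypothetical illustration of the reported data categories.
# Field names are invented for this sketch, not a real vendor schema.
@dataclass
class VerificationRecord:
    ip_address: str
    browser_fingerprint: str
    device_fingerprint: str
    government_id_number: str
    phone_number: str
    full_name: str
    face_template: bytes              # biometric data derived from the selfie
    risk_score: float                 # opaque, ML-assigned
    similarity_score: float           # face match against watchlists
    adverse_media_flags: list = field(default_factory=list)  # up to 14 categories
    collected_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

    def retention_deadline(self) -> datetime:
        # Reported retention period: up to three years
        return self.collected_at + timedelta(days=3 * 365)
```

The point of writing it out is simply that a single “age check” produces a row like this – identity, biometrics, device, and behavioural scores bundled together – and that row can legally sit in a database for years.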
Let that sink in for a moment. All this in the name of protecting kids online.
Look, I’m a parent. I’ve got a teenage daughter, and yes, I care about what she’s exposed to online. But here’s the thing that seems to escape the politicians and moral panic merchants: we already have tools for this. Parental controls exist. Device management exists. Actually talking to your kids about online safety exists. What we don’t need is a surveillance infrastructure that would make the Stasi blush, all wrapped up in the comforting rhetoric of “protecting children.”
The “for the kids” defence has become the tech policy equivalent of a get-out-of-jail-free card. Terrorism, child abuse, or national security – just spin the wheel and pick whichever justification polls best this week. Someone in one of the discussion threads I was reading put it perfectly: the public outcry is minimal because people genuinely don’t understand what they’re signing up for. They hear “age verification” and think it’s just confirming you’re over 18. They don’t realise they’re being enrolled in a comprehensive surveillance database.
What really frustrates me is how predictable the failure modes are. Discord wanted to use Persona. Reddit apparently uses them for UK age verification. And now we discover their security was about as robust as a wet paper bag. The code was just sitting there, exposed, at a US government-authorized endpoint. This isn’t some theoretical privacy concern – this is actual, real-world incompetence with people’s most sensitive personal data.
The UK situation is particularly grim. Reddit’s already behind an ID wall there, and not just for adult content either. Support groups for sexual assault survivors. Alcohol and gambling support communities. Literally any text marked NSFW, even if it’s just someone venting about their terrible day at work. VPN use has predictably skyrocketed, and now the government’s response isn’t to reconsider their terrible policy – it’s to discuss banning VPNs. Because nothing says “free society” like preventing people from protecting their privacy online.
My DevOps background makes me particularly cynical about all this. I’ve worked in IT long enough to know that security is hard, that data breaches are inevitable, and that once data is collected, it will eventually leak or be misused. It’s not a matter of if, but when. Every additional database of personal information is another target, another liability, another eventual scandal waiting to happen.
The AI angle troubles me too. These systems are increasingly using machine learning models to assign “risk scores” and “similarity scores” to people. What do these scores mean? Who decides what’s risky? What happens if the algorithm flags you as suspicious because you happen to look like someone on a watchlist? We’re building systems of automated judgement with zero transparency and zero accountability, and wrapping them in the friendly language of “verification” and “safety.”
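The watchlist problem is easy to illustrate. Below is a toy sketch – not any vendor’s actual algorithm, just the generic shape of threshold-based face matching – showing how someone who merely resembles a listed person gets flagged, with no visibility into why.

```python
import math

def cosine_similarity(a, b):
    # Standard cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def flag_against_watchlist(embedding, watchlist, threshold=0.9):
    # Flags anyone whose face embedding is "close enough" to a
    # watchlist entry. The flagged person has no way to know why,
    # and no obvious route to appeal.
    return any(cosine_similarity(embedding, w) >= threshold for w in watchlist)

watchlist = [[0.9, 0.1, 0.4]]
innocent_lookalike = [0.88, 0.12, 0.41]   # merely resembles a listed face
stranger = [-0.9, 0.5, -0.1]

print(flag_against_watchlist(innocent_lookalike, watchlist))  # True: flagged anyway
print(flag_against_watchlist(stranger, watchlist))            # False
```

However sophisticated the real models are, the structure is the same: an opaque score, an arbitrary threshold, and automated judgement with no transparency about either.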
There’s a broader political dimension here that we can’t ignore. Age verification laws aren’t really about age – they’re about identity. They’re about building a system where anonymous access to information and communication is increasingly impossible. Where every website visit, every forum post, every expression of opinion is tied back to a verified, government-documented identity. That’s not a world I want to live in, and it’s certainly not a world I want my daughter to inherit.
The solution isn’t more sophisticated surveillance technology. It’s not blockchain-based age verification or privacy-preserving cryptographic protocols or whatever Silicon Valley snake oil is being peddled this week. The solution is accepting that the internet is a space where anonymity and privacy should be default rights, and that parents – not governments, not tech companies – are responsible for managing their children’s online experiences.
We already have laws against actual child exploitation. We already have tools for parental control. What we don’t have, and desperately need, is meaningful pushback against this surveillance creep. Contact your privacy regulator. Support organisations fighting these laws. Use encrypted communication. Vote for politicians who actually understand technology and civil liberties.
Because once we’ve normalised comprehensive identity verification for internet access, once we’ve accepted that every online interaction requires government ID and facial recognition, we won’t get that privacy back. The infrastructure will exist, and it will be used for purposes far beyond its original justification. It always is.
The irony is that none of this will actually protect kids. Determined teenagers will find ways around it – they always do. VPNs, borrowed IDs, foreign websites. Meanwhile, adults will have sacrificed their privacy, their anonymity, and their freedom to access information without surveillance, all for a policy that won’t achieve its stated goal.
We need to be louder about this: educate more people, and push back harder. The alternative is waking up one day in a digital panopticon and wondering how we let it happen while we were all busy thinking about the children.