The Privacy Paradox: When 'Secure' Apps Are Anything But
I had one of those moments recently – the kind where you stumble across something that makes your blood boil just a bit. You know the feeling: you discover that a company has been pulling the wool over everyone’s eyes, and suddenly you’re questioning everything you thought you knew about digital privacy.
The trigger this time was learning about Viber’s data collection practices. Here’s an app owned by Rakuten that markets itself as privacy-focused, complete with all the right buzzwords about end-to-end encryption and respecting user privacy. Yet when someone actually bothered to check the App Store privacy labels, the reality was starkly different. We’re talking about location data, browsing history, contacts, sensitive information – basically everything they can get their hands on – all linked directly to your identity and used to track you across other apps and websites.
The sheer audacity of it is what gets me. It’s one thing for Facebook or Google to hoover up your data – at least they’re relatively transparent about their business model being built on surveillance capitalism. But when a company explicitly positions itself as the privacy-conscious alternative while doing exactly the same thing behind the scenes? That’s not just misleading; it’s manipulative.
What really drives home the absurdity is that Viber apparently shares your data with some 300 partner companies by default. Three hundred! And if you want to opt out, you have to deselect each one manually. It’s like they’ve designed the most user-hostile privacy settings possible, banking on the fact that most people won’t have the time or patience to wade through hundreds of checkboxes.
This whole situation reminds me of shopping for a new phone plan a few months back. The salesperson at one of the major telcos here kept emphasizing how they “value customer privacy” while simultaneously trying to upsell me on targeted advertising services. The cognitive dissonance was remarkable – they genuinely seemed to believe that collecting massive amounts of personal data for commercial purposes was somehow compatible with respecting privacy.
The Viber situation is part of a much larger problem in the tech industry. Companies have figured out that privacy has become a selling point, so they’ve weaponized the language of privacy protection while continuing business as usual. It’s the same playbook we’ve seen with “green” marketing – slap some feel-good messaging on your product while changing nothing fundamental about how you operate.
What makes this particularly frustrating from a policy perspective is that we’re still playing catch-up with regulation. The EU’s GDPR was a good start, and our own Privacy Act amendments are moving in the right direction, but companies are still finding ways to game the system. They’ve gotten very good at crafting privacy policies that technically disclose what they’re doing while being practically incomprehensible to regular users.
The comments I’ve been reading about this issue really highlight how fed up people are getting. There’s a growing awareness that we’re being taken for fools, and frankly, it’s about time. When someone suggests that the solution is to ask ChatGPT to whip up a web app just to uncheck hundreds of privacy-violating boxes, you know we’ve reached peak absurdity in terms of user experience.
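For what it’s worth, the scripting itself is the trivial part. Here’s a minimal sketch of the kind of thing people mean – assuming, purely hypothetically, that the opt-out page renders each partner as a standard HTML checkbox (I haven’t inspected Viber’s actual consent page, and its real markup may well differ):

```typescript
// Purely illustrative: bulk-deselect every ticked consent checkbox on a
// hypothetical opt-out page. Run from the browser's developer console.
document
  .querySelectorAll<HTMLInputElement>('input[type="checkbox"]:checked')
  // Use click() rather than setting .checked directly, so any change
  // handlers that actually record the opt-out still fire.
  .forEach((box) => box.click());
```

A few lines of code to undo a dark pattern that would otherwise take half an hour of clicking. The fact that users are reaching for the developer console just to exercise an opt-out tells you everything about how deliberately hostile these interfaces are.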
The most sensible voices in these discussions keep coming back to Signal as the gold standard for private messaging. They’re right, of course – Signal’s approach to privacy is fundamentally different because their business model doesn’t depend on data collection. But here’s the catch: network effects matter enormously for messaging apps. You can’t exactly force your family WhatsApp group to migrate to Signal just because you’ve become privacy-conscious.
This brings us to the heart of the problem. Individual choice is important, but it’s not sufficient when dealing with companies that have built their entire business model around exploiting information asymmetries. We need stronger regulations that make deceptive privacy practices illegal, not just morally questionable. We need interoperability standards that prevent messaging platforms from becoming walled gardens. And we need genuine penalties for companies that engage in privacy theatre while conducting surveillance by default.
The silver lining in all of this is that people are starting to pay attention. Privacy is becoming a kitchen table issue in a way it wasn’t even five years ago. The tide is slowly turning, and companies that continue to treat user privacy as an afterthought are going to find themselves increasingly isolated.
Until then, I suppose we’ll keep playing this exhausting game of digital whack-a-mole, trying to stay one step ahead of companies that see our personal information as just another revenue stream. But at least we’re getting better at calling out the worst offenders when we spot them.