Privacy lessons from the fitness fiasco

Most of us already recognise that technology has the potential to wipe out our privacy, if checks and balances are not in place – or at least I hope we do! What’s scary then in the recent hoo-hah about fitness trackers revealing secret locations is that it shows how bad we are – both as users and as technology developers – at spotting those privacy risks ahead of time.

Soldiers and other security staff have been warned for years against revealing their location via social networks. The risks are obvious: in 2007, Iraqi insurgents used geotagged photos to locate and destroy four US attack helicopters, for instance. More recently, geotagged selfies contradicted official Russian claims by revealing Russian soldiers in Ukraine, fighting alongside the pro-Russian separatists there.

Yet here we are, with people acting all surprised that, when the Strava fitness tracking app openly publishes its users’ location and movement data, it reveals where soldiers, as well as civilians, exercise.

You have to wonder what on earth those military users thought they were doing, leaving a tracker wirelessly connected when they’ve been warned for years about geotagged photos, Facebook Places, Foursquare and all the rest. Did they fail to spot the privacy options on their Strava settings page? (It’s easily done – they are buried a few layers down.) Or did they, as so many of us do, assume that it’s just ephemeral data, of no interest to anyone else?

The tracking scare should remind everyone, not just the world’s militaries, that even a direct order is sometimes not enough to guarantee compliance. And if it’s an indirect order or mere advice, you’re lucky these days if the recipient scans the first paragraph before muttering “Whatever” and clicking Accept. There must be training too, plus active checks on compliance and probably some form of pen-testing or white-hat hacking.

Beyond that, it also shows why – as the GDPR will require – you need to get users to actively opt in to data processing, and why it must be informed consent. Simply providing an opt-out, without a clear explanation of the risks, is nowhere near enough.

To be fair, Strava does recognise that some individuals want anonymity. In a statement it said, “Our global heatmap represents an aggregated and anonymized view of over a billion activities uploaded to our platform. It excludes activities that have been marked as private and user-defined privacy zones.”

Real anonymity is hard

The problem is that this concept of anonymity looks too much like, “Oh, that could be just anyone out there, jogging around Area 51 or that Syrian airbase!” If any more proof were needed that some people in technology have no idea what anonymisation really means, this is it.
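To make that concrete, here is a minimal sketch in Python – with made-up coordinates and a hypothetical likely_base helper, nothing to do with Strava’s actual pipeline – of why stripping out the names is not anonymisation. Even with every user ID removed, a handful of traces that all start from the same spot still pinpoint that spot.

```python
from collections import Counter

# Hypothetical "anonymised" activity data: user names and IDs stripped,
# but raw start coordinates kept, as a naive heatmap might.
anonymised_starts = [
    (37.2431, -115.7930),  # morning run
    (37.2433, -115.7928),  # another morning run
    (37.2430, -115.7931),  # and another
    (51.5074, -0.1278),    # an unrelated city jog
]

def likely_base(points, precision=3):
    """Snap points to a coarse grid and return the busiest cell.

    No names are needed: the densest cluster of start points is enough
    to reveal where a group of users exercises every day.
    """
    cells = Counter(
        (round(lat, precision), round(lon, precision)) for lat, lon in points
    )
    return cells.most_common(1)[0]

print(likely_base(anonymised_starts))
# ((37.243, -115.793), 3) - the "anonymous" data still points straight
# at one location, which is exactly the heatmap problem.
```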

There’s a whole bunch of lessons in here, both for Strava and the rest of us. I’ve already mentioned a couple – that privacy needs to be the default, not an opt-out extra, and that anonymisation doesn’t just mean taking the names out. Another is that there is nothing intrinsically good in big data: it’s all in how it’s used – and in who’s using it.

And perhaps another is to beware of vanity, although that can be a tough challenge for the Instagram generation. Whether it’s soldiers keen to be top of the exercise leaderboard or app developers trumpeting how many million users they have, they’re showing off. Wanting to do your best is one thing, but as the saying goes, pride comes before a fall.
