Nearly five years ago the British House of Lords used the term ‘wild west’ in a damning report on the state of the internet, calling on government and industry to do much more to protect users online and help restore fading confidence in the world wide web.
Today pundits bandy the same term about, but this time they are talking about mobile apps, and in particular a storm that erupted earlier this year around iOS applications which take data from user address books without user consent or even knowledge.
The issue first came to light in early February when a developer discovered that a social app known as Path was exfiltrating pretty much all the data from user address books – full names, email addresses and phone numbers – as a matter of course and sending it back to the Path ‘home’ server without asking users’ permission.
After members of Congress picked up the issue and contacted Apple to ask why this was being allowed, a whole heap of app providers including Foursquare, Hipster and the recently acquired Instagram hurriedly amended their software so that it flashed up a user permission message in such situations.
For the record, Apple told the lawmakers quite correctly that “apps that collect or transmit a user's contact data without their prior permission are in violation of our guidelines”. This obviously didn’t stop some unscrupulous or absent-minded developers from ‘forgetting’ this particular guideline in the past, however, and Apple said it would address this in a future software release.
Android, so often second best when it comes to comparing the security and privacy features of the two most popular smartphone platforms around, already has functionality to force developers to ask a user’s permission if their app requires access to the phone’s address book.
I would argue, though, that a more fundamental security issue is at play here, and it revolves around basic human behaviour: most users neither know nor care enough about their phones, and about what could happen to their data, for a pop-up permission box to be a satisfactory solution.
Even in these privacy-conscious times, users will often click through whatever messages they get on their phones – they just want to get to the good stuff and start enjoying their apps.
Add to this the fact that the small message on a smartphone screen rarely explains exactly what is going to happen to the data once it is sucked into the cloud – how it is going to be used, where it is going to be used, and whether it will be transmitted and stored securely – and you have a situation of what I’m going to call “oblivious data loss”.
There are several security and privacy risks here, of course, not least the implications of BYOD smartphones in the enterprise with data-hungry apps downloaded on them inadvertently exposing the confidential details of colleagues, clients and business partners.
At the moment apps don’t distinguish between groups when asking user permission to access the iOS address book, but even if they did, users may accidentally save a new business contact in their personal instead of corporate contacts group, and then allow an app access to the former, again resulting in oblivious data loss.
In a worst case scenario, then, exactly what are the risks of oblivious data loss?
Well, many apps are keen to stress that any information will be sent up to their servers ‘securely’, while not specifying exactly how this is achieved. But even if it is sent securely, what happens after that is probably more important. Is it being stored in plain text form? Are the developer’s databases secure? How secure? Who is allowed to see it? Under what circumstances?
Even if the reason for copying user data is innocent in the first place, which it usually is with reputable apps, there is no guarantee that the data will remain safe wherever it ends up. It’s even worse if the data is stored in plain text, as Path was found to be doing.
At the end of the day most developers just want to build the most user friendly, compelling application possible and if it’s social, they want to find all the people in your address book and use that info to improve your experience on the app. Oh, and maybe make a billion dollars along the way…
It’s unfortunate that more of them don’t use a technique known as hashing, which would effectively anonymize this data and make it next to useless to any criminals who got their hands on it. Better still, these cloud providers should be using policy-based key management and strong encryption to protect the precious (to someone) data they copy.
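To make the hashing idea concrete, here is a minimal Python sketch (the function and key names are purely illustrative, not any vendor’s actual API). The app uploads one-way digests instead of raw emails and phone numbers; the server can still match mutual contacts by comparing digests, but a stolen database reveals no directly readable contact details. A keyed hash (HMAC) is used rather than a plain hash because low-entropy fields like phone numbers are trivial to brute-force otherwise:

```python
import hashlib
import hmac

def anonymize_contact(value: str, key: bytes = b"server-side-secret") -> str:
    """Return a keyed one-way digest of a contact field.

    Normalising first (trim whitespace, lower-case) means the same
    contact always produces the same digest, so mutual-contact
    matching still works without ever uploading the raw value.
    """
    normalised = value.strip().lower()
    return hmac.new(key, normalised.encode("utf-8"), hashlib.sha256).hexdigest()

# The same contact, however it was typed, yields the same digest…
print(anonymize_contact("Alice@example.com") ==
      anonymize_contact(" alice@example.com "))
# …while the digest itself is a fixed-length hex string that cannot
# be reversed without the server-held key.
print(anonymize_contact("+44 20 7946 0000"))
```

The design point is that the key lives only on the server, so even if an attacker steals the digest database, rebuilding the original phone numbers means brute-forcing through the HMAC rather than a cheap dictionary of plain SHA-256 values.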
If the group or individual behind an app has more malicious intent, of course, then the kind of oblivious data loss we’re seeing at the moment will enable them to mine a rich seam of personal information with which to launch phishing scams or other social engineering-based attacks.
Then there is the question of what happens when or if a company is taken over by a third party. What will happen to the user’s data then? Don’t assume that it will be destroyed or even that your rights will be preserved.
One of the most challenging aspects of this oblivious data loss problem is that there is no easy solution. What makes the whole thing a lot more tricky is that even if an individual is super careful with their own address book, how do they know their own information is not being sucked up into the cloud by countless applications on their friends’ phones?
Apple certainly has to make a start by enforcing a rule that developers wanting to access the iOS address book must ask users’ permission, but developers also need to become more attuned to the privacy requirements of their users and start to store data more securely.
In the end let’s hope we’re not still talking about this wild west of oblivious data loss to the cloud in five years’ time.