If you’ve ever found the Significant Locations section on your iPhone, then a recently published study showing how such data can be used to decipher personal information about users should give you pause.
The way Significant Locations works is that your iPhone keeps a list of places you frequently visit. This list usually shows your favorite places and shops and will, of course, log the location of any service you might visit often, such as a medical center.
Apple gathers this information to provide “useful location-related information” in its apps and services, and promises this data is encrypted and cannot be read by Apple. But I’m a little unclear whether this information is made available to third-party apps.
You can see this information for yourself, but you really have to know where to look: go to Settings>Privacy>Location Services>System Services and then look at the Significant Locations item at the end of a lengthy list. Tap on any one of the items in the list to find a whole set of data points, including all the different places you’ve been in your city, and when.
Apple’s crusade to protect privacy is well-known and I believe it to be sincere in its attempt. But it could go one step further with this feature. Let me explain why.
Why we need private places
A newly published report shows that location data can be used to figure out personal information.
[Also read: Apple wants Safari in iOS to be your private browser]
The researchers ran a small study across 69 volunteers using their own testing app on iOS and Android devices. In just two weeks, the app gathered more than 200,000 locations, from which the researchers were able to identify nearly 2,500 places. From those places they inferred around 5,000 pieces of personal data, including highly personal information about health, wealth, ethnicity, and creed.
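To make the mechanics concrete, here is a toy sketch of how raw location fixes can be turned into "places" and then into sensitive inferences. This is not the researchers' actual pipeline: the grid-snapping approach, visit threshold, and the category lookup table are all invented for illustration; a real system would use proper clustering and a map database.

```python
from collections import Counter

def snap(lat, lon, precision=3):
    """Round coordinates so nearby fixes collapse into one 'place'."""
    return (round(lat, precision), round(lon, precision))

def identify_places(fixes, min_visits=3):
    """Return grid cells visited at least min_visits times."""
    counts = Counter(snap(lat, lon) for lat, lon in fixes)
    return {cell: n for cell, n in counts.items() if n >= min_visits}

# Hypothetical category lookup a real system would derive from map data.
CATEGORY = {
    (51.523, -0.138): "medical center",
    (51.507, -0.128): "office",
}

def infer_attributes(places):
    """Map repeatedly visited places to categories that invite inference."""
    return [CATEGORY[cell] for cell in places if cell in CATEGORY]

# Five fixes near one spot, eight near another, one stray fix elsewhere.
fixes = [(51.5231, -0.1382)] * 5 + [(51.5070, -0.1276)] * 8 + [(51.49, -0.19)]
places = identify_places(fixes)
print(infer_attributes(places))  # the two frequent places resolve to categories
```

The point is how little is needed: a fortnight of coordinates plus a public map of what sits at each coordinate is enough to label a person's routine.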
‘Thanks to machine learning…’
“Users are largely unaware of the privacy implications of some permissions they grant to apps and services, in particular when it comes to location-tracking information,” explained researcher Mirco Musolesi, who noted how machine learning multiplies what can be discovered from the data.
“Thanks to machine-learning techniques, these data provide sensitive information such as the place where users live, their habits, interests, demographics, and information about users’ personalities.”
It doesn’t take a genius to figure out that when these methods are extended across a population of thousands or even tens of thousands of users, untrammelled surveillance via apps can gather, analyze, and exploit vast troves of incredibly private information, even if confined to location data alone.
This should be of concern to enterprises attempting to manage distributed teams in possession of confidential information; in the wrong hands, such information can open employees up to blackmail or potential compromise. All it takes is one rogue app, or one rogue worker with access to such data gathered by an otherwise bona fide app developer.
A new approach
Apple does provide extensive information about how it protects privacy with location data, and it is possible to disable Location Services at any time on a blanket or per-app basis. In light of the report, how can Apple improve this protection?
The researchers say they hope their work will encourage development of systems that can automatically block collection of sensitive data. For example, location tracking can infer when a person visits a medical center or hospital, so perhaps a system to obfuscate such visits could be created?
Another approach that might work is to give users tools with which to deny collection of some location data. I can imagine a system that lets users disguise visits to places they define, or to generic categories of places they wish to protect — hospitals, medical or counseling centers, for example. When the system recognizes a user is in this place, it can decline to share or collate that data with any third-party app.
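The filtering idea above can be sketched in a few lines. This is a hypothetical illustration of the concept, not anything Apple ships: the protected-place list, radii, and function names are all invented, and a real implementation would live at the system level rather than in app code.

```python
import math

def haversine_m(a, b):
    """Great-circle distance in metres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    return 2 * 6371000 * math.asin(math.sqrt(h))

# User-defined protected places: (centre, radius in metres). Illustrative only.
PROTECTED = [
    ((51.5230, -0.1380), 250),  # e.g. a counseling centre the user flagged
]

def shareable(fix):
    """True if this fix may be passed on to third-party apps."""
    return all(haversine_m(fix, centre) > radius for centre, radius in PROTECTED)

def filter_fixes(fixes):
    """Withhold any fix that falls inside a protected place."""
    return [f for f in fixes if shareable(f)]
```

A fix recorded at the flagged centre is silently dropped before third-party code ever sees it, while everything else passes through unchanged; the user's protected list never needs to leave the device.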
Now, I’m certain competitors dependent on purloining such information will complain that this hands Apple some form of advantage, given that system-level app support would remain possible. But that sounds more like an API request than a genuine need for courtroom time.
The report quite clearly shows that when gathered in bulk, even something as simple as location data can be exploited; that’s something everyone should consider when asked to provide an app with location data access, particularly when the service seemingly has little to do with location.