Read Those App Permissions – It Won’t Save Your Privacy
We all know how much information our smartphone apps hoover up. Most of these apps are capable of accessing our phones’ camera, location services, microphone, contact list, and social media accounts – it’s up to us to read the permissions requested and decide if we should grant them these windows into our lives.
Of course we would never grant permission for a weather app to read our contacts list, just like we know something’s fishy if a flashlight app demands access to our location.
A more insidious problem can arise when an app has a legitimate reason for requesting permission to a feature.
This is the crux of the galvanizing New York Times investigation that found numerous mobile apps collecting individual smartphone users’ real-time locations and sending this information to data brokers at a rate of around 8,600 times a day per user – roughly once every ten seconds.
The apps all had reasons to require users’ locations – they were weather apps, transit apps, sports score apps (location apparently used to recommend nearby teams and players), among others. Nearly all of them collected location data because users gave permission – but of the 17 apps that sent this precise location data to brokers, only a few (three on iOS, one on Android) informed users that the information would be used for advertising and to “analyze industry trends”.
For marketers – and indeed, most companies – location data is a goldmine for precisely this: the analysis of trends, insights, consumer behavior. Slightly different phrases for the same outcome – figuring out how you’ll act, in order to sell you the right thing at the right time.
You may be thinking: what’s the big deal? Of course our data is being monetized – how else am I getting this sweet traffic app for nothing except a bunch of data points that are anonymized anyway?
Here’s the individual argument, the threat to your personal privacy: even though such information is scrubbed of identifying metadata, nearly anyone with access to a supposedly anonymized set of location data could identify someone without consent – for example, by noting the addresses where a phone spends time and using public records to infer the rest. Digital records as simple as Facebook likes have been shown to be eerily predictive of intimate personality traits; where we spend our time is evidence not only of where we work and live, but of our health status, annual income and education.
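To make the re-identification risk concrete, here is a minimal sketch (with entirely synthetic pings and hypothetical coordinates) of the inference described above: even with the device ID replaced by a random token, the places a phone sits overnight and during working hours point to a likely home and workplace, which can then be matched against public address records.

```python
from collections import Counter

# Synthetic, anonymized pings: (random_token, rounded_lat, rounded_lon, hour_of_day).
# No name, no device ID -- yet the pattern below still identifies a person.
pings = [
    ("a1f3", 40.713, -74.006, 2),   # overnight pings cluster at one address
    ("a1f3", 40.713, -74.006, 3),
    ("a1f3", 40.713, -74.006, 23),
    ("a1f3", 40.758, -73.985, 10),  # working-hours pings cluster at another
    ("a1f3", 40.758, -73.985, 14),
    ("a1f3", 40.761, -73.977, 12),  # a lunchtime outlier, easily ignored
]

def infer_home_work(pings, token):
    """Guess home and work as the most-visited overnight and daytime spots."""
    night = Counter((lat, lon) for t, lat, lon, h in pings
                    if t == token and (h >= 21 or h <= 6))
    day = Counter((lat, lon) for t, lat, lon, h in pings
                  if t == token and 9 <= h <= 17)
    home = night.most_common(1)[0][0] if night else None
    work = day.most_common(1)[0][0] if day else None
    return home, work

home, work = infer_home_work(pings, "a1f3")
print("probable home:", home)  # → (40.713, -74.006)
print("probable work:", work)  # → (40.758, -73.985)
```

A data broker’s version of this would operate on millions of real traces, but the logic is no more sophisticated: frequency counts over time-of-day buckets, cross-referenced with property records.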
There’s at least one thing you can do – delete those free apps. They have a financial incentive to sell your personal data and location. And even if you think you’ve opted out of location-sharing, data brokers may be able to infer your real-time location anyway.
But there’s a societal issue too. Call it the Cambridge Analytica problem: your friends’ personal data, habits and personality traits can be exposed by the apps you use. Facebook is the most egregious offender here, thanks to the vast numbers of apps and services that plug into its platform.
As the artist Trevor Paglen said in an interview for The End of Trust (an absolutely fantastic compendium of essays on privacy by a list of luminaries), we need – and deserve – spaces that are not leveraged for capital or political value. At least some of the hours we spend on our smartphones each day should be out of reach of the faceless haze of data brokers out there, intent on inferring insights into our loves, hates and habits.
Good digital practices – reading app permissions, avoiding free apps where one might “be the product” – can only accomplish so much. The deck is stacked by lax app store policies that let apps avoid mentioning the intimate data they share with advertisers, and by the largely unregulated data brokerage industry that can get away with packaging our activity across disparate services into neat profiles of the perfect customers.
It’s not only up to app developers to be more transparent, or to individuals to “know better”; nor is greater regulation of the suffocating embrace between tech companies and data brokers enough on its own. We need to go beyond good digital hygiene and work towards an overhaul of this amorphous industry in charge of our privacy, from its rules to our own expectations of what gets to be shared.