Privacy Central

Opinion · 16 Jan 2019 · 4 mins read

Happy? Anxious? There’s An Ad For You

Data science lets advertisers reach people with greater precision than ever. Soon, that will include how we feel. Can we trust algorithms to target ads fairly?

By Natasha Stokes, Features Editor

Would you rather have ads shown to you based on your emotional state, or on your past online behavior? What’s that – you’d rather ads not be targeted to you at all? That is an increasingly distant prospect, as brands use ever smarter technology to decode our every mood.

Spotify recently announced that its Discover Weekly playlist – that much-loved, personalized mixtape made by algorithms – will be up for brand sponsorship. Microsoft has already signed on to promote its AI know-how with a package that will include all ad spots during Discover Weekly playtime (ads are pushed only to free accounts, which make up around 55% of Spotify users).

Detecting and capitalizing on customer intent is a core tenet of advertising. But in this case, Microsoft – like all future Discover Weekly advertisers – gets access to a uniquely valuable audience, created through the feature’s algorithmic personalization based on hours and hours of listening data.

As Spotify notes on its marketing page: “Spotify has won the trust of 191 million fans by offering music customized to give them joy and delight. Our in-app environment is a place of safe discovery and we are willing to extend that premium ecosystem with brands that share our values.”

Over at Vox, Kaitlyn Tiffany dives into the issues with a big brand like Spotify trading on its users’ trust (not to mention “joy and delight”). Advertisers leveraging the emotions of potential customers may be nothing new, but technology – and more accurately, data science – is turning it into an increasingly precise business.

ESPN’s smartphone app targets ads based on whether sports fans are feeling “happy, sad, slightly anxious, or overjoyed”, which it can apparently do because it has “the largest database of fan behaviors, preferences and insights”. The New York Times says its Project Feels ad product matches ads with readers based on their predicted emotional response to Times articles, using artificial intelligence that analyzes associations between content, keywords and emotion.
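To make that concrete, here is a minimal, hypothetical sketch of emotion-based ad matching. It is not the Times’ actual model – the Times describes a machine-learning system, where this is just a toy keyword-lexicon version of the idea – and every emotion label, keyword and ad below is invented for illustration.

```python
# Toy emotion-based ad matching. The lexicon, labels and ad inventory
# are invented; real systems like Project Feels use trained ML models.

EMOTION_KEYWORDS = {
    "joy": {"celebrate", "triumph", "delight", "win"},
    "anxiety": {"crisis", "risk", "uncertain", "fear"},
    "sadness": {"loss", "grief", "decline", "farewell"},
}

AD_INVENTORY = {
    "joy": "new smartphone launch",
    "anxiety": "insurance plan",
    "sadness": "comfort-food delivery",
}

def predict_emotion(article_text: str) -> str:
    """Score each emotion by counting lexicon hits; return the top scorer."""
    words = set(article_text.lower().split())
    scores = {e: len(words & kw) for e, kw in EMOTION_KEYWORDS.items()}
    return max(scores, key=scores.get)

def match_ad(article_text: str) -> str:
    """Pair an article with the ad targeting its predicted emotional response."""
    return AD_INVENTORY[predict_emotion(article_text)]

print(match_ad("markets face uncertain risk as the crisis deepens"))
# -> insurance plan
```

Even this crude version shows why the approach unsettles privacy advocates: the ad decision is driven entirely by the reader’s inferred feelings, not by anything they chose to share.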

Rolling Stone’s Amy X. Wang notes the risk that Spotify is taking by “loaning out the audience of its most beloved product”. The question is what else it might do with its vast stores of users’ listening data. Spotify recently patented tech to “personalize user experience based on personality traits” – and it’s one of many tech companies with patents to serve ads based on mood, like Amazon, Apple, Facebook, Google and Microsoft.

Sure, there’s a basic rebuttal: what’s wrong with a better targeted ad? Not much, possibly, if an ad for a new phone reinforces a positive mood – but what if a negative mood is detected? Facebook has produced advertiser reports on its ability to identify feelings of distress in teenagers, including states of mind like “insecure”, “anxious” and “worthless”. Can we trust algorithms – and advertisers – to respond appropriately to emotional vulnerability?

The data that companies like Facebook and Google collect on their users and what they’re interested in is a critical factor in their hyper-successful ad-based business models. Advertisers like these platforms for the narrow targeting they make possible – but the algorithms underpinning these ad platforms have sometimes crossed the line into discrimination. One study revealed that Google was much more likely to show ads for higher-paid jobs to men than to women. Facebook advertisers have been able to exclude people by race, as well as mothers, the disabled and Spanish-speakers, and a class-action lawsuit is underway against the big F for age discrimination in job ads.

Algorithmic bias usually stems from unconscious bias in training data, and for now, there’s no regulatory standard to root out the bias inherent in a sector dominated by a small demographic. In the brilliant Hello World, which outlines the importance and capabilities of algorithms that influence the world around us, Dr Hannah Fry discusses the need to understand how algorithms work, instead of entrusting private companies with this power to shape our lives and behaviors.
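A toy example makes the mechanism clear. The numbers below are invented, but they show how a naive model fit to skewed click history turns yesterday’s bias into tomorrow’s ad policy:

```python
# Invented data: historical impressions of a high-paying job ad,
# recorded as (gender, clicked). The skew is baked into the history.
history = (
    [("m", True)] * 80 + [("m", False)] * 20 +
    [("f", True)] * 20 + [("f", False)] * 80
)

def click_rate(gender: str) -> float:
    """Fraction of this group's historical impressions that were clicked."""
    outcomes = [clicked for g, clicked in history if g == gender]
    return sum(outcomes) / len(outcomes)

# A naive optimizer keeps serving the ad only where past clicks were high,
# so the historical skew silently becomes the targeting policy.
for gender in ("m", "f"):
    rate = click_rate(gender)
    print(gender, rate, "serve" if rate > 0.5 else "suppress")
# m 0.8 serve
# f 0.2 suppress
```

Nothing in that code mentions discrimination; the bias arrives with the data and flows straight through to the decision.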

After all, people like algorithms. We are happy to hand over data when it results in a better, customized experience, whether on news sites, shopping sites, Netflix, or Spotify.

On Forbes, seven tech commentators weighed in on the future of the nascent business of mood-targeting ad tech, and agreed that emotion-matching is just the beginning. Increasingly sophisticated data collection – say, from smart home devices, activity trackers and other wearables – combined with AI analysis will vastly expand the data available to advertisers, allowing them to anticipate needs – “perhaps even before the person realizes it” – and develop ever more personalized marketing campaigns.

Amid the inexorable spread of algorithms that can read us and show us what we want, that future is already taking shape. As data science advances, it will need stricter regulation to ensure our algorithmic overlords treat us fairly.