
Privacy | 30 Aug 2017 | 3 min read

Walmart's Facial Recognition Tech Raises Privacy Concerns

Creepy new facial recognition tech being developed by Walmart, which the retailer plans to combine with customer purchase history, sets our privacy antennae tingling. It's part of a wider trend of adoption of this technology, one with disturbing privacy implications.

Ben Dickson, Tech Blogger

Your brow is furrowed, lips pressed into a thin line as you wait stuck in a slow-moving line at the store. A video camera overhead captures your frown and alerts a manager to consider opening more registers before you give up, dump your purchases and walk out of the store — all without a single human interaction.

This scenario is set to become reality with new technology being developed by Walmart that will enable the retail giant to detect unhappy customers without the need for human interaction.

The technology uses video cameras at checkout lines and facial recognition algorithms to evaluate customer frustration and alert employees. It will also correlate customer biometric data with transaction data to detect changes in a customer's purchasing habits caused by dissatisfaction.
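For illustration only, here is a minimal sketch, in Python, of the kind of pipeline such a system implies: a frustration score derived from checkout-camera frames, combined with a check for a drop in the customer's purchase history. Every name and threshold here (frustration_score, CustomerProfile, the 0.7 cut-off) is a hypothetical stand-in, not Walmart's actual system.

```python
# Hypothetical sketch: biometric signal + purchase history, as the patent idea suggests.
# None of these names, thresholds or data shapes come from Walmart's system.
from dataclasses import dataclass
from statistics import mean


@dataclass
class CustomerProfile:
    customer_id: str
    weekly_spend: list[float]  # past weekly transaction totals


def frustration_score(frame_emotions: list[float]) -> float:
    """Average per-frame 'frustration' probabilities from some emotion classifier
    run on checkout-camera footage (the classifier itself is assumed)."""
    return mean(frame_emotions) if frame_emotions else 0.0


def spending_dropped(profile: CustomerProfile, recent_weeks: int = 4) -> bool:
    """Flag a customer whose recent spend falls well below their own baseline."""
    history = profile.weekly_spend
    if len(history) <= recent_weeks:
        return False
    baseline = mean(history[:-recent_weeks])
    recent = mean(history[-recent_weeks:])
    return recent < 0.5 * baseline


def should_alert_staff(frame_emotions: list[float], profile: CustomerProfile,
                       threshold: float = 0.7) -> bool:
    """Combine the two signals: visible frustration plus declining purchases."""
    return frustration_score(frame_emotions) > threshold and spending_dropped(profile)


if __name__ == "__main__":
    shopper = CustomerProfile("c-102", [80, 95, 90, 85, 88, 30, 25, 20, 28])
    print(should_alert_staff([0.8, 0.75, 0.9], shopper))  # True -> alert a manager
```

The privacy problem is visible even in this toy: the second signal only works if the store can tie the face at the register back to an identified purchase history.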

Facial recognition isn’t new, but it has become disturbingly effective thanks to leaps in artificial intelligence and computer vision algorithms. Tech giants such as Google and Facebook are using the technology widely to identify faces in images, even when they aren’t sharp or are partly obscured.
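To give a sense of how commoditized the basic building blocks have become, here is a minimal sketch using the open-source OpenCV library and its bundled Haar cascade. This is plain face detection (finding faces, not identifying whose they are) and far cruder than the deep-learning systems the big platforms run; the image path is a placeholder.

```python
# Minimal face-detection sketch with OpenCV's bundled Haar cascade.
# Detection only (no identification); "shoppers.jpg" is a placeholder path.
import cv2

image = cv2.imread("shoppers.jpg")
if image is None:
    raise SystemExit("Put a test photo at shoppers.jpg first")

gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

# Draw a box around each detected face and save the annotated image.
for (x, y, w, h) in faces:
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)

print(f"Detected {len(faces)} face(s)")
cv2.imwrite("shoppers_annotated.jpg", image)
```

If a decade-old technique runs in a dozen lines on a laptop, it is easy to see why far more capable recognition systems have spread so quickly.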

And now retailers are using it to track your emotions.

This shows how the control we have over our personal information is eroding. In the past, collecting user data required an explicit interface, such as a computer or a mobile device, and a website or an app. Users always had a choice to opt out from data collection programs or avoid using a website to protect their privacy.

That choice is starting to fade, thanks to the wealth of information available online, analytics tools and AI algorithms. From social media platforms to ecommerce websites, news sites and content delivery apps, we are leaving a trail of data breadcrumbs online that companies can exploit to learn a great deal about us. Now, IoT sensors, WiFi beacons and facial recognition technology are expanding those data collection powers into the physical world.

And there’s no opting out from a video camera scanning your face at a store checkout, in an airport or as you walk in the street.

Concerns about facial recognition center on a lack of consent. Where’s the opt-out?

Retailers might be interested in the technology to satisfy and better serve their customers without causing friction. But the same techniques can be put to evil use. Last year, an especially effective facial recognition app developed in Russia became the focus of privacy concerns after vigilantes used it to harass victims. The developers hinted at offering their services to Russia’s FSB, which could use the technology to identify dissidents and protesters.

More recently, it emerged that law enforcement in China is using facial recognition technology in combination with predictive analytics to predict crime. The Chinese government controls more than 176 million surveillance cameras and runs one of the most invasive data collection programs in the world, which it uses to rate its citizens. The combination of these technologies gives the government an almost seamless view of everything citizens do.

U.S. law enforcement already possesses a database containing facial images of more than half of the country’s adult population, and the overwhelming majority of the entries belong to people with no criminal background. According to the American Civil Liberties Union, the technology is being used in several states to identify faces in photos of protests or to perform real-time facial recognition and identity checks through live surveillance cameras.

It can be argued that increased surveillance, data collection and algorithmic analysis will help prevent crime. But AI algorithms aren’t perfect; they often go awry and tend to replicate the biases of their creators.

Facial recognition is increasingly being used for law enforcement. With little oversight, databases are filling up with millions of images of innocent citizens.

There’s little or no oversight of the use of data collection and analytics technologies, which could lead to the complete collapse of individual privacy. Politicians and civil rights defenders in the U.S. and other countries are warning that a lack of regulation and standards for data collection and facial recognition could yield disastrous results.

An example is the investigation published by ProPublica last year, which found that an AI engine used in Fort Lauderdale to assess the risk of recidivism in defendants was roughly twice as likely to mistakenly flag black defendants as being at high risk of committing future crimes as it was white defendants.
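To make that statistic concrete: the disparity ProPublica measured was in false positive rates, the share of defendants who did not go on to reoffend but were still labelled high risk. The toy calculation below uses invented counts, not the study's data, purely to show how such a gap translates into "twice as likely".

```python
# Toy illustration of a false-positive-rate disparity. The counts are invented
# for illustration only; they are not ProPublica's data.
def false_positive_rate(wrongly_flagged: int, total_non_reoffenders: int) -> float:
    """Share of people who did not reoffend but were still labelled high risk."""
    return wrongly_flagged / total_non_reoffenders


group_a = false_positive_rate(wrongly_flagged=450, total_non_reoffenders=1000)
group_b = false_positive_rate(wrongly_flagged=230, total_non_reoffenders=1000)

print(f"Group A false positive rate: {group_a:.0%}")  # 45%
print(f"Group B false positive rate: {group_b:.0%}")  # 23%
print(f"Group A is {group_a / group_b:.1f}x as likely to be wrongly flagged")  # 2.0x
```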

This is by no means a rant against computer vision technology or AI algorithms. These are the same technologies that have worked miracles in fighting cancer, improving education and reducing road accidents. But as is the case with every technological breakthrough, AI and computer vision will have some negative impact without proper safeguards, all the more so as our lives shift increasingly online.

There are measures, such as camouflage make-up and glasses designed to confuse detection algorithms, that can prevent facial recognition and image analysis software from identifying you, but how long those techniques will remain effective is uncertain, given the accelerating pace at which the algorithms are advancing.

Whether lawmakers and scientists will take the necessary precautions to make sure that these wonderful technologies aren’t employed in sinister ways remains to be seen. In the meantime, the algorithms will be watching, and privacy as we know it will never be the same.
