Choice architecture. Nudging. Manipulation. Whatever you call it, asking users to consent to tracking cookies is far from straightforward – and many approaches may even be illegal under the General Data Protection Regulation (GDPR).
Last November, members of the European Consumer Organisation, BEUC, lodged formal complaints against Google with their national Data Protection Authorities based on research carried out by the Norwegian Consumer Council (NCC).
The study analysed settings in Facebook, Google and Windows 10, and found that the interfaces were designed in a way that makes turning off privacy-intrusive settings much harder than turning them on.
The NCC said that “default settings, dark patterns, techniques and features of interface design” are meant to “manipulate users,” and drive them towards privacy-intrusive options.
This abusive business practice, which the NCC described as “unethical, deceptive and manipulative,” could violate the GDPR’s principles of “informed consent,” “data protection by design,” and “data protection by default.”
“The whole issue of what amounts to truly free consent as required by European data protection law is in a state of flux. Legally, even a take it or leave it situation can arguably qualify as consent, but it is obviously not very meaningful,” says Eduardo Ustaran, a partner at Hogan Lovells, specialising in privacy. “In my view, this shows the weakness of relying on consent as a justification for the use of data and why from a regulatory perspective, the emphasis should be elsewhere.”
Consent, or else
Yet many businesses are still using consent as a way to gain access to data that users may not even be aware they are handing over.
Some sites require users to give consent or leave the site, while many interfaces “nudge” users into making what may not be a fully informed choice, through a combination of design and wording tactics that may obscure privacy-friendly choices, offer an illusion of control, or require users to expend more time and effort in choosing the pro-privacy option.
Take Facebook and Google: although both allow users to tailor their settings to their preference, users who simply click “Agree” or “Accept” will never even see the settings – and, unsurprisingly, the default settings are the least privacy-friendly options.
Users are encouraged to click on the “Agree” button through clever design. For example, bright blue buttons (Google once famously tested 41 shades of blue) are used for accepting, while dull grey in a less bold font asks users to “manage data settings.”
Scare tactics are also used, as popups are worded to compel users to choose certain options, while information is omitted or downplayed. For example, Facebook’s consent popup for facial recognition warns users that if they keep face recognition turned off, they won’t be protected if “a stranger uses your photo to impersonate you.” It also says that people with visual impairments won’t be able to tell who’s in a photo or video, but neglects to warn that facial recognition might be used for targeted advertising based on emotional states, or to identify users in situations where they would prefer to remain anonymous, notes the NCC.
The influence of advertisers
Johnny Ryan, Chief Policy and Industry Relations Officer at Brave, believes that the situation is even worse. “What’s going on now, is if you cut corners on consent you can try to trick people into giving consent. But the thing that no one discusses, is what are they trying to get you to consent to? In most cases, the portal publisher is a victim of another actor in the industry – ad tech,” he says.
According to Ryan, who has asked the European Commission’s competition department to carry out a sector inquiry, the guilty party driving ever-more demand for access to personal data is Europe’s €12 billion real-time bidding (RTB) ad industry. The advertising solutions offered by ad tech agencies typically request and collect as much data as might conceivably be useful in the future to target vast numbers of user profiles.
Of course, consent is often requested for data necessary to a site’s functioning, such as a commenting widget, or overall site analytics. “The controversy arises when a publisher finds themselves, often unknowingly, asking for permission for many, many third parties to do things that no one should ask permission for,” says Ryan. “This system works for ad tech, but nobody else really. I’m not convinced that even marketers are getting value for money from it.”
Breaking their addiction to this prevalent form of digital advertising is undoubtedly difficult for publishers. On top of that, there is no design standard for requesting and receiving cookie consent.
“Some interfaces might overwhelm the user with information and buttons to click. This lack of clarity may result in users making decisions in an uninformed way,” says Lukasz Olejnik, an independent cybersecurity and privacy advisor.
The problem is often heightened with mobile sites, where the limited size of smartphone screens can further cramp the interface, making it more cumbersome for users to manage their consent options.
“Technology could help here. A simple and automatic consent mechanism would be far superior,” says Olejnik. “This was the spirit of the W3C Tracking Preferences Expression standard, a spirit upheld by the ePrivacy Regulation version approved by the European Parliament in 2017, but today, there is no certainty [that it will be broadly deployed].”
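The automatic mechanism Olejnik alludes to is straightforward at the protocol level: under the W3C Tracking Preferences Expression standard, the browser sends a `DNT` request header and the server interprets it. A minimal sketch in Python of that interpretation logic (the function name and the surrounding framing are illustrative, not part of any standard API):

```python
def interpret_dnt(headers: dict) -> str:
    """Interpret the DNT header per the W3C Tracking Preferences
    Expression standard: "1" means the user opts out of tracking,
    "0" means the user has consented to tracking, and an absent
    (or malformed) header expresses no preference."""
    value = headers.get("DNT")
    if value == "1":
        return "opt-out"
    if value == "0":
        return "consent"
    return "no-preference"
```

A site honouring the signal would refrain from setting tracking cookies when the result is "opt-out", and would only need to fall back to a consent dialog when no preference is expressed — which is precisely why broad deployment, rather than the mechanism itself, is the sticking point.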
ePrivacy: The key to real consent?
New laws may be the only way to break the chain of advertising-requiring-cookies-requiring-consent. Although the GDPR sets out the legal bases under which personal data may be collected and processed, it is the ePrivacy Directive that really sets out when and how cookies can be used.
That law is currently being overhauled to make it a better fit for the online age – the current law dates back to 2002 (although it was revised in 2009) and is responsible for the cookie consent popups on websites. However, as it was gradually implemented across the EU, national differences and inconsistent enforcement have made it difficult for businesses to know exactly what is expected of them.
Tracking people without their consent is already illegal under the ePrivacy Directive, but the GDPR establishes a stronger definition of consent – that it must be freely given, specific and informed. “In the GDPR, websites are no longer allowed to trick their users into giving consent. However, since it is unlikely that member states will soon start reforming their national laws on this point, we need the new ePrivacy Regulation, which would be directly applicable,” says Birgit Sippel, the MEP in charge of steering the ePrivacy Regulation through the European Parliament. “Circumventing European rules on consent would be far more difficult for member states if we had an ePrivacy Regulation instead of the current Directive.”
Unfortunately for those pinning their hopes on a revised ePrivacy Regulation, negotiations have stalled as national governments cannot agree on their position. The European Parliament reached its position back in October 2017, but cannot begin negotiations without the member states. It falls to the new Romanian presidency of the Council to broker a deal before the hiatus of the European Parliament elections in May. But with intense lobbying on all sides, it looks increasingly likely that we will not see the final new ePrivacy Regulation before 2020.
In the meantime, it will be left to the courts to decide when companies have overstepped the mark in gaining consent.