Privacy Central

Communication · 26 Sep 2017

Can You Really Trust Whatsapp with Your Secrets?

With a new business service rumored, we ask whether you should trust WhatsApp with your most sensitive data. It may offer end-to-end encryption, but the Facebook-owned app has come under fire from privacy advocates.

By Claire Broadley, Tech Blogger

WhatsApp has announced that it will soon release a business app for Android devices. For a company hailed for privacy and security, this seems like a logical step. After all, WhatsApp has been praised for resisting political pressure when user privacy is at stake.

But Facebook — the owner of WhatsApp — was one of a clutch of companies strongly criticized by the Electronic Frontier Foundation in June for a lack of clarity on user privacy. And we could be seeing the first concrete evidence of a significant shift towards encryption backdoors as companies like Facebook hold talks with the UK government.

Would it be sensible for any business to trust Facebook with highly sensitive intellectual property? The answer — right now — is no. And the evidence is mounting remarkably quickly.

What’s Up With WhatsApp?

In June 2017, the EFF published a report criticizing WhatsApp’s failure to publish clear privacy policies, as well as its lack of clarity on how it handles so-called “gag orders”. The EFF raised the point that WhatsApp — and its parent company, Facebook — could sell out its users remarkably easily, because neither has a policy against selling the data it collects.

There’s another very worrying development. In July, just a few weeks later, the Global Internet Forum to Counter Terrorism met for the first time. This working group included key representatives from Facebook, Microsoft, YouTube (owned by Google), and Twitter. Also at the meeting was Amber Rudd, the UK’s home secretary, who is anti-encryption and pro-surveillance. Rudd penned an article for the Telegraph newspaper containing one key detail: a call for the introduction of backdoors into encrypted services.

The stars are aligning. As the EFF predicted, it appears that big tech and governments have scant regard for our right to privacy.

What Backdoors Mean for You and Your Business

Amber Rudd’s article claims, ludicrously, that people like WhatsApp because of the user experience — not because it’s encrypted. She presents no evidence or research to back this up. Rudd’s talks with Facebook, Microsoft, Google, and Twitter will remain completely confidential. (Presumably, the notes will be encrypted, too.)

Rudd is hardly the best person to lecture tech companies on how to safeguard data. In August, she used her personal email account to correspond with a hoaxer known as Sinon Reborn, who used primitive impersonation tactics to pose as Robbie Gibb, Theresa May’s chief of communications.

Backdoors are poison to encryption and create a massive security risk

If Amber Rudd compels big tech companies to put backdoors in their services, all of those services may as well have no encryption at all. That’s an issue for personal privacy, but it also raises questions about the security of business data. Backdoors are not just an invitation to government; they create a clear temptation for hackers as well.

An Escalating Privacy Problem

The EFF points out that WhatsApp could sell your data. And data is already big business.

Look at PayPal: it has made a massive investment in Cloud IQ, an advertising company that harvests your browsing data and applies machine learning algorithms to it. As we already know, anonymized browsing data can be used to identify users remarkably easily.

When money starts flowing between these companies, privacy safeguards matter. Remember: none of the companies that met with Amber Rudd achieved a five-star rating from the EFF. WhatsApp only chalked up two. There is a grey area between monitoring users’ activity for marketing and monitoring the content of their communications for surveillance — and data can change hands with next to no oversight at all.

Weak privacy policies are an issue for everyone who uses encrypted services or transacts online. The risk is increased further when the same companies are meeting with our governments to figure out ways to monitor communication. You don’t have to be an investigative journalist to see where this cosy arrangement is heading.

Vote With Your Feet

WhatsApp, and apps like it, may not be the privacy defenders they claim to be. The anti-terrorism argument should not be used to sidestep individual rights to privacy.

This latest move just confirms what Snowden said all along. The UK government has brought in legislation that makes secret backdoors in encryption legal. The law exists. Now, we are seeing it swing into action.

Should we really trust companies like Facebook and Microsoft to defend us against terrorism when they cannot openly show us what that defence will look like? The clear answer is no. WhatsApp users would be wise to switch to open source messengers that are not in bed with the governments that would like to cripple their security. For businesses, the reasons are even more compelling.