WhatsApp employees review millions of private messages, report says

Privacy is one of the flagship values of Facebook Inc. and all its social networks. At least, that is what its top executives say, and it is also stated in the company's policies.


“No one outside of this chat, not even WhatsApp, can read or listen to the messages,” reads the notice in every conversation each user has on the platform. This is due to the end-to-end encryption that became a flagship feature of the app and a demonstration of the commitment to absolute privacy that Mark Zuckerberg and Facebook have sought to project in recent years.


Moreover, in 2018 testimony before the United States Senate, Zuckerberg, as head of Facebook, assured lawmakers that the company does “not see the content that is sent through WhatsApp.”


But how true is this? Are messages 100 percent protected? How effective is end-to-end encryption? A new report from ProPublica revealed that WhatsApp has teams that not only read and review messages, but in some cases also take action on them or even pass them on to law enforcement agencies for prosecution.


Why does WhatsApp have a thousand employees who are dedicated to reviewing messages?

“WhatsApp has over 1,000 contractors in offices in Texas, Dublin and Singapore. Sitting at their posts, these employees use Facebook software to sift through millions of private messages, images and videos,” reads the ProPublica report, titled ‘How Facebook Undermines Privacy Protections for Its 2 Billion WhatsApp Users’.


These thousand-plus employees review the messages that appear on their screens and pass judgment on the content. To be clear, however, they do not check every message that is sent; that would be impossible.

These workers end up weighing in on the millions of messages, photos and videos that users have flagged and reported as “potentially abusive” (child pornography, spam and hate speech).

“Workers have access to only a subset of WhatsApp messages, which users flag and automatically forward to the company as possibly abusive. The review is one element in a broader monitoring operation in which the company also reviews material that is not encrypted, including details about the sender and their account,” the report reads.


This is one of the many strategies the companies behind Facebook Inc.’s social networks use to offer safer environments and prevent abuse. So what, then, is the problem?

The first issue, as WhatsApp communications director Carl Woog told ProPublica, is that the company does not consider this work to be “content moderation,” a practice openly acknowledged on Facebook and Instagram, for example. The big difference is that on those two platforms messages are not encrypted.


In addition, those social networks publish reports on content that has been removed and openly classified as offensive. No such reports exist for WhatsApp.

ProPublica also indicated that WhatsApp shares certain private data with judicial entities, such as the US Department of Justice. Those are just a few of the revelations in ProPublica’s comprehensive report.

On Twitter: @TecnosferaET
