Facebook, owner of the instant messaging application WhatsApp, boasts of its privacy policies and insists that messages exchanged by users cannot be read by the company.
Facebook CEO Mark Zuckerberg assured the United States Senate during his 2018 testimony that the company sees "nothing" of the content on WhatsApp, although this may not be entirely true.
According to a recent report published by the outlet ProPublica, Facebook has outsourced more than 1,000 workers who read and moderate the "millions of private messages, images and videos" sent through the app.
"WhatsApp has more than 1,000 contract workers filling floors of office buildings in Austin, Texas, Dublin and Singapore […] They pass judgment on whatever flashes on their screen, claims of everything from fraud or spam to child pornography and possible terrorist plotting, typically in less than a minute," the outlet reports.
These moderators are not employees of Facebook or WhatsApp, ProPublica notes, but contractors hired through outsourcing firms, earning $16.50 an hour. One condition of the job is silence: workers sign nondisclosure agreements and cannot reveal what they do.
More than 1,000 outsourced workers read and moderate the "millions of private messages, images and videos" that are sent
The content these workers review is whatever has been reported by users or flagged by artificial intelligence. When a WhatsApp user reports a message, that message and the four preceding it are decrypted and sent to a moderator for review.
Messages flagged by artificial intelligence, meanwhile, are examined alongside unencrypted data that WhatsApp collects, such as "the names and profile images of a user's WhatsApp groups as well as their phone number, profile photo, status message, phone battery level, language and time zone, unique mobile phone ID and IP address, wireless signal strength and phone operating system, as well as a list of their electronic devices, any related Facebook and Instagram accounts, the last time they used the app and any previous history of violations."
In addition, some messages end up in front of moderators by mistake. WhatsApp operates in hundreds of countries, and its users exchange messages in hundreds of languages.
Moderators rely on Facebook's translation tools to analyze flagged messages, but these tools are not always reliable at decoding local slang or political content. One employee has described them as "horrible."
Finally, according to ProPublica's reporting, WhatsApp shares private data with law enforcement agencies, such as the United States Department of Justice.
The outlet reports, for example, that user data from the messaging app helped prosecutors build a case against a former Treasury Department employee, Natalie Edwards, who allegedly leaked confidential bank reports about suspicious transactions to BuzzFeed News.
According to ProPublica
WhatsApp shares private data with law enforcement agencies
WhatsApp's communications director, Carl Woog, has acknowledged the existence of these outsourced workers, who identify and remove "the worst" abusers. However, Woog and the company do not consider what they do to be content moderation.
"Actually, we do not typically use this term for WhatsApp," he told ProPublica, the outlet that published the report.
"WhatsApp is a lifeline for millions of people around the world. The decisions we make about how we build our app are focused on our users' privacy, maintaining a high degree of reliability, and preventing abuse," company sources added.