“We don’t see any of the content on WhatsApp,” Facebook CEO Mark Zuckerberg said during his testimony before the US Senate.
What so many people feared appears to be real: Facebook has more than 1,000 contract employees who continually read and moderate “millions of private messages, images and videos” sent through its affiliated service WhatsApp, according to a report published on Tuesday by ProPublica.
WhatsApp, meanwhile, continues to promote its privacy protections and emphasizes that messages between users cannot be decrypted by the company. Facebook CEO Mark Zuckerberg assured the US Senate during his 2018 testimony: “We don’t see any of the content on WhatsApp.” However, according to the new report, based on internal documents and interviews with moderators, that turns out not to be the case.
“WhatsApp has more than 1,000 contract workers filling floors of office buildings in Austin, Texas, Dublin and Singapore. They make judgment calls on whatever appears on their screens, claims of everything from fraud or spam to child pornography and possible terrorist plots, generally in less than a minute,” ProPublica wrote. These moderators are not direct employees of WhatsApp or Facebook; they are contractors who earn US$16.50 an hour and are required to stay silent about their work under nondisclosure agreements.
What do those who spy on us see?
According to the outlet, the content these employees review consists of messages reported by users or flagged by artificial intelligence. When a user taps ‘report’, the message in question, along with the four preceding messages in that chat, is decrypted and sent to one of those moderators for review, as the Actualidad RT site noted.
Messages flagged by artificial intelligence are examined alongside unencrypted data collected by WhatsApp, such as “the names and profile pictures of users’ WhatsApp groups, their phone numbers, profile pictures, status messages, phone battery level, language and time zone, unique mobile phone ID and IP address, wireless signal strength and phone operating system, as well as a list of their electronic devices, any related Facebook and Instagram accounts, the last time they used the app and any previous history of violations.”
Some of the messages reviewed by the moderators, however, were flagged by mistake. WhatsApp has 2 billion users who speak hundreds of languages, and staff sometimes have to rely on Facebook’s translation tool to analyze flagged messages. According to one employee, that tool is “horrible” at decoding local slang and political content.
The report also indicates that WhatsApp shares certain private data with law enforcement agencies such as the US Department of Justice. As an example, ProPublica reports that WhatsApp user data helped prosecutors build a case against a former Treasury Department employee, Natalie Edwards, who allegedly leaked confidential bank reports on suspicious transactions to BuzzFeed News.
“WhatsApp is a lifesaver for millions of people around the world,” said Carl Woog.
WhatsApp’s director of communications, Carl Woog, acknowledged that the contractors review messages to identify and remove “the worst” abusers, but noted that the company does not consider this work to be content moderation. “Actually, we don’t typically use this term for WhatsApp,” Woog told ProPublica.
“WhatsApp is a lifesaver for millions of people around the world. The decisions we make about how we build our app are focused on our users’ privacy, maintaining a high degree of reliability and preventing abuse,” the company said.