WhatsApp moderators can read your messages

Photo: Francisco Seco (AP)

techno.rentetan.com – WhatsApp is not the impenetrable private messaging service Facebook makes it out to be, ProPublica finds.

Facebook has planted its privacy flag on WhatsApp, the end-to-end encrypted messaging service whose contents Facebook supposedly cannot see. Mark Zuckerberg said categorically at a 2018 Senate hearing: "It's fully encrypted, we don't see any of the content in WhatsApp." Today, a notice in the app announcing a privacy policy and terms of service update reads: "Your personal messages can't be read or listened to because they are end-to-end encrypted. [Their emphasis] This will never change."

That isn't quite true, a new ProPublica report on WhatsApp's content moderation system finds. It was already known that WhatsApp employs moderators, that it passes information to law enforcement, and that the company has long shared user data within Facebook's data-hungry app ecosystem. The new investigation gives a fuller picture of the tactics Facebook has deliberately downplayed while pitching WhatsApp as a privacy-first platform: when a recipient reports a conversation, WhatsApp can read parts of your messages.

That muddies what the company means by "end-to-end encryption," which by definition means that only the sender and the recipient hold the cryptographic keys that make a message readable.
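As a rough illustration of that definition (not WhatsApp's actual Signal-protocol implementation), the sketch below uses the PyNaCl library to show the basic idea: a server relaying the ciphertext never holds the private keys needed to read it, because those keys stay on the sender's and recipient's devices.

```python
# Minimal sketch of end-to-end encryption using PyNaCl. This is a toy
# example, not WhatsApp's protocol: it only illustrates that decryption
# requires private keys that never leave the two endpoints.
from nacl.public import PrivateKey, Box

# Each party generates a key pair; the private half stays on the device.
sender_private = PrivateKey.generate()
recipient_private = PrivateKey.generate()

# The sender encrypts with their private key and the recipient's public key.
sending_box = Box(sender_private, recipient_private.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# A relay server only ever sees `ciphertext` and cannot decrypt it.
# The recipient decrypts with their private key and the sender's public key.
receiving_box = Box(recipient_private, sender_private.public_key)
plaintext = receiving_box.decrypt(ciphertext)
assert plaintext == b"meet at noon"
```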

ProPublica reports that Facebook contracts at least 1,000 moderators through Accenture to review user-reported content flagged by WhatsApp's machine-learning systems.

They sift through spam, disinformation, hate speech, potential terrorist threats, child sexual abuse material (CSAM), blackmail, and "sexually oriented businesses," among other things. Based on the content, moderators can ban the account, put the user "on watch," or leave it alone. (This differs from Facebook and Instagram, where moderators can also remove individual posts.) WhatsApp head Will Cathcart wrote in an op-ed for Wired earlier this year that the service sent 400,000 reports to child safety authorities last year, and that people were subsequently convicted.

Most can agree that CSAM should be monitored and reported; Facebook and Pornhub have both generated media crises by moderating too little. But WhatsApp moderators told ProPublica that the artificial-intelligence system also flags plenty of innocuous content, such as children in baths. Once flagged content reaches them, ProPublica says, moderators can view the last five messages in a thread.

WhatsApp does disclose, in its terms of service, that when an account is reported it "receives the most recent messages" from the reporting group or user, as well as "information on your recent interactions with the reported user." It does not clarify that this information can include the reported user's phone number, profile image, linked Facebook and Instagram accounts, IP address, and mobile phone ID, all of which moderators can view. Nor does WhatsApp disclose that it collects this metadata on all users, regardless of their privacy settings.

That collection sits awkwardly alongside WhatsApp's very public protest against the Indian government earlier this year. In a statement shared with Reuters about its fight against a new law that would likely have allowed Indian law enforcement to trace questionable messages, the company said:

Requiring messaging apps to "trace" chats is the equivalent of asking us to keep a fingerprint of every single message sent on WhatsApp, which would break end-to-end encryption and fundamentally undermine people's right to privacy.

But like Facebook, WhatsApp appears happy to hand metadata to U.S. law enforcement, including in cases that undermine government accountability. In the case against a Treasury Department whistleblower who shared confidential documents with BuzzFeed, prosecutors noted that Natalie Edwards had exchanged dozens of messages with a reporter around the time of publication. Edwards is now serving six months in prison.

Law enforcement can compel disclosure with a court-ordered subpoena, but WhatsApp could also choose not to store that information in the first place; its competitor Signal says the only metadata it collects is your phone number. If WhatsApp adopted a Signal-style approach of encrypting or simply not retaining metadata, it would have nothing to share.

WhatsApp offered little clarity on how it is able to obtain decrypted messages, saying only that tapping the report button automatically generates a new message between the reporting user and WhatsApp. In effect, WhatsApp appears to have deployed a kind of copy-and-forward feature, though the details remain unclear.
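Conceptually, the reporting flow described above could look something like the sketch below, in which a hypothetical client-side report_conversation function re-encrypts the last five already-decrypted messages to a WhatsApp-controlled key and sends them as a fresh message. The function name, key names, and message format are invented for illustration; this is not WhatsApp's actual code, only a plausible reading of what "generates a new message to WhatsApp" implies.

```python
# Hypothetical sketch of the reporting flow: the reporter's own client
# re-encrypts messages it has already decrypted and sends them to WhatsApp
# as a brand-new message. The original conversation's encryption is never
# broken; the plaintext leaves the device because the client forwards it.
from nacl.public import PrivateKey, Box

N_FORWARDED = 5  # ProPublica: moderators see the last five messages in a thread


def report_conversation(recent_plaintexts, reporter_private_key,
                        whatsapp_public_key):
    """Bundle the reporter's already-decrypted recent messages and encrypt
    the bundle to WhatsApp, exactly as if sending WhatsApp a normal message."""
    bundle = b"\n---\n".join(recent_plaintexts[-N_FORWARDED:])
    box = Box(reporter_private_key, whatsapp_public_key)
    return box.encrypt(bundle)  # ciphertext only WhatsApp can open


# Example usage with throwaway keys (WhatsApp's real key is a placeholder here).
reporter_key = PrivateKey.generate()
whatsapp_key = PrivateKey.generate()
report = report_conversation(
    [b"msg 1", b"msg 2", b"msg 3", b"msg 4", b"msg 5", b"msg 6"],
    reporter_key,
    whatsapp_key.public_key,
)
```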

Facebook told Gizmodo that WhatsApp can view those conversations because the company treats a report as a form of direct messaging between the reporter and WhatsApp. By that logic, Facebook's collection of this material doesn't conflict with end-to-end encryption, since users who report content are choosing to share it with Facebook.

So, yes: WhatsApp can read your messages without your consent.