Investigation raises fresh concerns over WhatsApp privacy and encryption
- A new report highlights the inner workings of WhatsApp’s content review system.
- The report suggests that, despite WhatsApp’s claims that employees cannot read messages, the company employs contractors to review content.
- WhatsApp maintains that its employees can only read messages reported to the firm.
According to a lengthy new report, WhatsApp’s privacy-focused claims may not be as watertight as users might expect. ProPublica revealed the inner workings of the company’s moderation system, which suggests that WhatsApp contractors, under specific circumstances, are able to read messages sent between users.
Per the report, WhatsApp employs at least 1,000 contractors using “special Facebook software” to scan content flagged by the company’s machine learning system or reported by users. This content ranges from child abuse material to spam, terrorist activity, and more.
WhatsApp regularly notes that only senders and recipients can see their chats due to end-to-end encryption, which debuted on the platform in 2016. Since then, it has been a key marketing tool for the Facebook-owned service. However, the existence of a content review system arguably goes against the company’s privacy push.
WhatsApp’s content review system
WhatsApp has a good reason for implementing a message reporting and review system, though. It told ProPublica that this process allows the company to ban abusive and harmful users from the platform. It also notes that users have to initiate the reporting process. When a user is reported, only the offending content, along with the four previous messages in the thread, is sent to WhatsApp “unscrambled.” While moderators can see these messages, they don’t have access to a user’s entire chat library, nor can the machine learning system access it. Reviewers can dismiss the reported message, ban the reported user’s account, or place them on a “watch” list.
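To make that flow concrete, here’s a rough sketch of what such a report could look like in code. Every name and type below is hypothetical and purely illustrative; this isn’t WhatsApp’s actual code or API, just a minimal model of the behavior ProPublica describes, in which the reporting user’s device forwards the flagged message, plus up to four preceding messages, in readable form.

```kotlin
// Hypothetical sketch only: all names and types are illustrative,
// not WhatsApp's real code. It models the reported flow, where the
// reporter's device (which already holds the decrypted chat) sends
// the flagged message and up to four prior messages to WhatsApp.

data class Message(val sender: String, val plaintext: String)

data class AbuseReport(
    val reportedUser: String,
    val flaggedMessage: Message,
    val precedingContext: List<Message> // at most the four prior messages
)

fun buildReport(thread: List<Message>, flaggedIndex: Int, reportedUser: String): AbuseReport {
    // Only this slice of the conversation leaves the device "unscrambled";
    // the rest of the chat history is never included in the report.
    val context = thread.subList(maxOf(0, flaggedIndex - 4), flaggedIndex)
    return AbuseReport(reportedUser, thread[flaggedIndex], context)
}
```

The key point the sketch is meant to capture is that end-to-end encryption isn’t broken here: the plaintext comes from a device that could already read the messages.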
However, some unencrypted info can also be scanned. According to the report, unencrypted data from accounts placed on a “proactive” list can be compared against patterns of suspicious activity. This info includes the details of a user’s groups, their phone number, status message, and unique mobile ID, and even their battery level and signal strength.
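As a purely hypothetical illustration of what such a metadata record might look like, and how it could be matched against known-bad signals, consider the sketch below. The field names and the toy check are assumptions made for illustration, not anything WhatsApp or ProPublica has published.

```kotlin
// Hypothetical shape of the unencrypted metadata the report describes.
// Field names are illustrative assumptions, not WhatsApp's schema.
data class AccountMetadata(
    val phoneNumber: String,
    val statusMessage: String,
    val uniqueMobileId: String,
    val groupNames: List<String>,
    val batteryLevel: Int,    // percent
    val signalStrength: Int   // e.g. in dBm
)

// Toy example of "comparing against suspicious practices": flag an
// account whose groups overlap with groups already tied to abuse.
fun flagsForReview(meta: AccountMetadata, knownBadGroups: Set<String>): Boolean =
    meta.groupNames.any { it in knownBadGroups }
```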
See also: Here’s everything you need to know about encryption
It’s understandable that a chat platform would want to implement a review and reporting system that allows users to report abuse. However, it’s perhaps WhatsApp’s lack of clarity regarding the system that’s the biggest issue at hand. In a statement to ProPublica, Facebook said it doesn’t believe its content review system is a problem for users. “Based on the feedback we’ve received from users, we’re confident people understand when they make reports to WhatsApp, we receive the content they send us,” it noted.
Nevertheless, the report is likely still a blow to WhatsApp’s privacy optics, especially on the back of its divisive privacy policy changes. The company announced the changes, which would allow some data to be shared with Facebook, in January, and has since altered its rollout plans. WhatsApp was also fined $267 million for breaking privacy laws in the EU.