Apple announced last week that it would start scanning iPhones and iCloud for photos of child sexual abuse. The new feature has sparked plenty of controversy and concern among users and other companies alike. Asked whether it would apply the same technology, WhatsApp said it would not, calling Apple's move "surveillance."
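For readers wondering what "scanning" actually means here: by Apple's own description, the system doesn't "look at" your photos in any human sense. It computes a fingerprint (a perceptual "NeuralHash") of each image and compares it against a database of fingerprints of known abuse imagery supplied by NCMEC, flagging only matches. The snippet below is a deliberately simplified, hypothetical sketch of that idea, not Apple's implementation: it uses an exact SHA-256 digest where Apple uses a perceptual hash, skips the matching-threshold and "safety voucher" cryptography Apple describes, and every name in it (the hash set, the functions, the folder) is made up for illustration.

```python
import hashlib
from pathlib import Path

# Hypothetical set of "known bad" fingerprints. In Apple's real system these are
# perceptual NeuralHash values provided by NCMEC, not SHA-256 digests.
KNOWN_BAD_HASHES = {
    "3b8f0d6f9a...",  # placeholder digests, not real values
    "a17c42e0bb...",
}

def photo_digest(path: Path) -> str:
    """Fingerprint a photo's raw bytes (a crude stand-in for a perceptual hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_matches(photo_dir: Path) -> list[Path]:
    """Flag only photos whose fingerprint appears in the known-bad set;
    everything else is never inspected beyond computing its digest."""
    return [p for p in photo_dir.glob("*.jpg") if photo_digest(p) in KNOWN_BAD_HASHES]

if __name__ == "__main__":
    for match in flag_matches(Path("camera_roll")):
        print(f"match found: {match}")
```

The key point the sketch illustrates, and the one critics dispute, is that matching happens against the whole photo library on the device, not just against content a user chooses to share or upload.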
Head of WhatsApp Will Cathcart tweeted that Apple’s move is “the wrong approach and a setback for people’s privacy all over the world.” When asked whether WhatsApp would adopt the same system, he replied with a short “no.”
In a rather long Twitter thread, Cathcart writes that everyone wants to see abusers caught; he just points out that there are different ways to go about it, and argues that Apple's approach "introduces something very concerning into the world."

"Instead of focusing on making it easy for people to report content that’s shared with them, Apple has built software that can scan all the private photos on your phone — even photos you haven’t shared with anyone," Cathcart writes. "That’s not privacy."

There’s also a whiff of Sinophobia in the WhatsApp head’s tweets, as he wonders whether Apple’s technology will be used in China and what kind of content would be considered illegal there. But in our overly politically correct Western society, I believe this technology could be misused in a variety of ways as well.

Matthew Green of the Johns Hopkins Information Security Institute also expressed concerns about Apple’s new system, calling it "a really bad idea." He warns that it could eventually become "a key ingredient in adding surveillance to encrypted messaging systems."
WhatsApp has been part of Facebook since 2014, which I find pretty ironic, considering that Facebook hasn’t exactly been known for taking care of its users’ privacy. Instead of scanning their photos for child abuse, Facebook has simply exposed the passwords of millions of Instagram and Facebook users, more than once. Whoopsie.

On the one hand, we do need a variety of methods for tracking down child abusers and those who distribute photos and videos of such content. On the other hand, this technology does sound concerning, and there are plenty of ways it could be misused. Besides, so much happens "under the surface" on the dark web that whatever is found through iCloud and iPhone scanning is just the tip of the iceberg.

[via Gizmodo]