Right after announcing that it would scan your devices and iCloud for child sexual abuse images, Apple faced a strong backlash. So strong, in fact, that the company has decided to postpone the controversial feature. At least for a while.
Apple’s announcement arrived last month, explaining that a new feature would scan users’ photos for child sexual abuse material (CSAM). As you might expect, people were alarmed almost immediately. The company tried to explain that it would only check photos against a database of already-flagged images (as if that made it any better), but individuals, current Apple users, and various advocacy groups all expressed their concerns over the upcoming feature. WhatsApp even called Apple out publicly, describing the announced scanning system as “surveillance.”
Because of all this, Apple decided to postpone the launch of the new feature. My first thought was that they would just wait until the dust settled and launch it anyway. And I was mostly right. In a statement to The Verge, an Apple spokesperson confirmed the delay.

So, Apple won’t be scanning your iPhone and iCloud photos, at least not yet. But the feature is most likely still coming, only later than previously planned.

“The critically important child safety features,” as Apple calls them, are quite a slippery slope, as many folks would apparently agree. I know that children should be protected from predators, and I support that. But I can’t help thinking that it’s just a cover for privacy abuse, not to mention the potential for data leaks down the line. [via The Verge]