WhatsApp Opposes Apple's Image Scanning 08-07-2021, 02:16 AM
WhatsApp has come out against Apple's latest measures, which include scanning your photos for possible child sexual abuse material. Apple will scan images on your phone for matches against a database of known abuse imagery, a process that is prone to errors and false positives. If you're still thinking about buying an iPhone, you might want to give Android a try.
Read More: https://www.pcmag.com/news/whatsapp-scan...ivacy-risk
Quote:An Apple effort to use iPhones to detect child sexual abuse imagery risks paving the way for widescale surveillance, according to rival WhatsApp.
“I think this is the wrong approach and a setback for people's privacy all over the world,” Will Cathcart, the head of the Facebook-owned WhatsApp, tweeted on Friday.
For years now, companies including Facebook have been using algorithms to scan, detect, and remove child porn from social media, video sites, and cloud storage platforms. However, Apple said this week that it would use “on-device” processing on the iPhone itself to detect and flag child sexual abuse material (CSAM) as the files are uploaded to an iCloud account.
The on-device processing prompted Cathcart to speak out against Apple’s upcoming system, which arrives in iOS 15.
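To make the matching idea concrete: Apple's real system uses a proprietary perceptual hash (NeuralHash) with cryptographic safeguards, which is not public, so the sketch below is only a toy illustration of the general technique. It uses a simple "average hash" and a Hamming-distance threshold to show how on-device matching against known hashes works, and why near-duplicate images can collide, which is where false positives come from.

```python
# Toy illustration only: Apple's actual system (NeuralHash) is proprietary.
# This simple "average hash" just demonstrates the general principle of
# matching images against a database of known hashes.

def average_hash(pixels):
    """Hash an 8x8 grayscale image (list of 64 ints, 0-255) into a 64-bit int.
    Each bit is 1 if that pixel is brighter than the image's mean."""
    mean = sum(pixels) / len(pixels)
    h = 0
    for p in pixels:
        h = (h << 1) | (1 if p > mean else 0)
    return h

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical database of hashes of known flagged images.
known_hashes = {average_hash([10] * 32 + [200] * 32)}

def is_flagged(pixels, threshold=4):
    """Flag an image if its hash is within `threshold` bits of a known hash.
    A nonzero threshold catches re-encoded or slightly edited copies, but it
    also creates the false-positive risk the post is talking about."""
    h = average_hash(pixels)
    return any(hamming(h, k) <= threshold for k in known_hashes)

print(is_flagged([10] * 32 + [200] * 32))   # True  (exact copy)
print(is_flagged([12] * 32 + [198] * 32))   # True  (near-duplicate still matches)
print(is_flagged([50, 200] * 32))           # False (unrelated image)
```

The key trade-off is visible in `threshold`: set it to 0 and trivial re-compression evades detection; raise it and unrelated images start landing within a few bits of a known hash.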
![Screenshot](https://i.imgur.com/fSEZXPs.png)