Sinisterly
iOS 15 will begin scanning every photo using 3rd Party - neuralMatch - Printable Version

+- Sinisterly (https://sinister.ly)
+-- Forum: General (https://sinister.ly/Forum-General)
+--- Forum: World News (https://sinister.ly/Forum-World-News)
+--- Thread: iOS 15 will begin scanning every photo using 3rd Party - neuralMatch (/Thread-iOS-15-will-begin-scanning-every-photo-using-3rd-Party-neuralMatch)



iOS 15 will begin scanning every photo using 3rd Party - neuralMatch - Spooky - 08-06-2021

Quote:Apple confirms it will begin scanning iCloud Photos for child abuse images.
This tech might help in cracking down on child pornography, but it can also be misused.

[Image: 4422416e012ed02f5c33f559d7435846.jpg]

Apple is purportedly poised to announce a new tool that will help identify child abuse in photos on a user’s iPhone. The tool would supposedly use a “neural matching function” called NeuralHash to detect if images on a user’s device match known child sexual abuse material (CSAM) fingerprints. While it appears that Apple has taken user privacy into consideration, there are also concerns that the tech may open the door to unintended misuse—particularly when it comes to surveillance.

The news comes via well-known security expert Matthew Green, an associate professor at Johns Hopkins Information Security Institute. Green is a credible source who’s written extensively about Apple’s privacy methods over the years. Apple has also confirmed to TechCrunch that the tech will be rolling out later this year with iOS 15 and macOS Monterey. It’s also published technical details of how the tech works and says it was reviewed by cryptography experts.

No one wants to go to bat for child pornography, but Green points out this tech, while nobly intended, has far-reaching consequences and can potentially be misused. For instance, CSAM fingerprints are purposefully a little vague. That’s because if they were too exacting, you could just crop, resize or otherwise edit an image to evade detection. However, it also means bad actors could make harmless images “match” problematic ones. One example is political campaign posters that could be tagged by authoritarian governments to suppress activists, and so forth.

The other concern is that Apple is setting a precedent, and once that door is open, it’s that much harder to close it.
-
https://gizmodo.com/apple-reportedly-working-on-problematic-ios-tool-to-sca-1847427745
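
To make the article's "fuzzy fingerprint" point concrete, here is a toy sketch in Python of a simple difference hash (dHash) compared by Hamming distance. It is not NeuralHash and none of the numbers come from Apple; the grid size and the match threshold are made-up values, just to show why small edits to an image can still match while unrelated images can occasionally collide.

Code:
# Toy perceptual fingerprint: a difference hash (dHash) over a grid of
# grayscale pixel values, standard library only. NOT Apple's NeuralHash.

def dhash(pixels, hash_size=8):
    """pixels: 2D list of grayscale values, at least hash_size x (hash_size + 1).
    Each bit records whether a pixel is brighter than its right-hand neighbour."""
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            bits = (bits << 1) | (1 if pixels[row][col] > pixels[row][col + 1] else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def is_match(hash_a, hash_b, threshold=10):
    # The threshold is illustrative, not a published Apple parameter: small
    # edits flip only a few bits, so near-duplicates still "match" -- which
    # is also why harmless images can occasionally collide with flagged ones.
    return hamming(hash_a, hash_b) <= threshold

if __name__ == "__main__":
    import random
    random.seed(1)
    original = [[random.randint(0, 255) for _ in range(9)] for _ in range(8)]
    # A lightly edited copy: nudge each pixel's brightness a little.
    edited = [[max(0, min(255, p + random.randint(-5, 5))) for p in row] for row in original]
    print(hamming(dhash(original), dhash(edited)))   # small distance
    print(is_match(dhash(original), dhash(edited)))  # almost certainly True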

Does anyone remember the San Bernardino shooter in 2016, when the U.S. government was attempting to force Apple to unlock the phone? It raised huge privacy concerns about the government basically having a backdoor to unlock the phone.

https://www.nbcnews.com/storyline/san-bernardino-shooting/judge-forces-apple-help-unlock-san-bernardino-shooter-iphone-n519701

There are huge problems with a 3rd party detecting photos among the many in people's galleries. Will they have non-stop access to my camera? What about my underage sister? What if the government wanted to find all opposing political parties and listen in on conversations? This is fucking nuts man. I'm gonna switch to Android, and if that option faces the same issue I will just give up smartphones. SMH.


RE: iOS 15 will begin scanning every photo using 3rd Party - neuralMatch - mothered - 08-07-2021

Quote:Will they have non-stop access to my camera?
And perhaps all data stored on the device.

Thankfully I own an Android.


RE: iOS 15 will begin scanning every photo using 3rd Party - neuralMatch - Accident Man - 08-07-2021

This is a very slippery slope. I do not have anything to obnubilate, but I will never use iCloud again.


RE: iOS 15 will begin scanning every photo using 3rd Party - neuralMatch - Skullmeat - 08-07-2021

I used iOS for a few years and switched back to Android. Looks like that was a better choice than I realized.


RE: iOS 15 will begin scanning every photo using 3rd Party - neuralMatch - Accident Man - 08-07-2021

(08-07-2021, 08:53 PM)Skullmeat Wrote: I used iOS for a few years and switched back to Android. Looks like that was a better choice than I realized.
I have always used Apple over Android, but I will now be switching to Android as soon as possible.


RE: iOS 15 will begin scanning every photo using 3rd Party - neuralMatch - Skullmeat - 08-07-2021

(08-07-2021, 08:56 PM)Accident Man Wrote:
(08-07-2021, 08:53 PM)Skullmeat Wrote: I used iOS for a few years and switched back to Android. Looks like that was a better choice than I realized.
I have always used Apple over Android, but I will now be switching to Android as soon as possible.

I couldn't stand iOS and how rigid it was. Apple's one-size-fits-all design philosophy irritates me. I tweak the hell out of Android.


RE: iOS 15 will begin scanning every photo using 3rd Party - neuralMatch - d4ggm4sk - 08-07-2021

Pretty creepy actually. If the definition of what counts as child porn is broad enough, lots of hentai imagery with young chibi girls could be labeled as 'child porn', and the surveillance could then be used for oppression.


RE: iOS 15 will begin scanning every photo using 3rd Party - neuralMatch - Drako - 08-07-2021

Glad I don't own any Apple products. They claim that they care about user privacy, but then they do things like this. And it's always been like that with Apple. Sometimes they actually do good things, like their recent App Tracking Transparency feature that rolled out in iOS 14.5. But this latest feature is in no way good for anybody's privacy.


RE: iOS 15 will begin scanning every photo using 3rd Party - neuralMatch - laininthewired - 08-08-2021

they gonna see all the pictures of my cock lmao. sucks for them


RE: iOS 15 will begin scanning every photo using 3rd Party - neuralMatch - Dismas - 08-08-2021

Apple has clarified the specific measures, which rely on matching hashes rather than checking your actual images:
Quote:"Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC (National Center for Missing and Exploited Children) and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users' devices."

This is likely less invasive than what Discord uses to identify NSFW images. Made a post about WhatsApp opposing the measure. Still terrible for privacy.
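
For what it's worth, the flow Apple's quote describes boils down to an on-device lookup of each photo's hash against a stored set of known hashes. Here's a minimal sketch of that flow, assuming a plain Python set and a made-up reporting threshold; Apple's actual "unreadable set of hashes" involves cryptographic blinding that this toy only gestures at with a one-way digest.

Code:
# Minimal sketch of the on-device matching flow from Apple's quote above:
# each photo's perceptual hash is checked against a database of known CSAM
# hashes stored on the device. Apple describes transforming that database
# into an "unreadable set of hashes"; the one-way digest below is only a
# stand-in for that step, and the reporting threshold is an assumption,
# not a published Apple parameter.

from hashlib import sha256

def blind(perceptual_hash: int) -> bytes:
    # Stand-in for the real transform: keep only a one-way digest of each
    # known hash so the raw list is not directly readable on the device.
    return sha256(perceptual_hash.to_bytes(32, "big")).digest()

KNOWN_HASHES = {blind(h) for h in (0x1234, 0xBEEF)}  # placeholder values

def scan_library(photo_hashes, report_threshold=3):
    """Count exact matches against the known-hash set and only flag once the
    count crosses the (hypothetical) threshold."""
    matches = sum(1 for h in photo_hashes if blind(h) in KNOWN_HASHES)
    return matches >= report_threshold

print(scan_library([0x1234, 0xCAFE]))  # False: one match, below the threshold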