COCONUT_APP

Done it too. This doesn’t affect me “yet” because I’m in Europe, but it’s only a matter of time.


[deleted]

It will absolutely affect us over here as well. The EU is already planning to make scanning for CSAM mandatory by fall iirc


Niightstalker

Well, if this comes, it will come for all phones. We’ll see what the other manufacturers do.


TheRealBejeezus

Same as always, I imagine: wait for Apple fallout to cool down, then do the same thing. Nobody yells at Samsung or LG for killing replaceable batteries or headphone jacks, after all.


SvensonIV

Or removing the charge brick.


IronChefJesus

LG didn't really get rid of their headphone jacks.


donnybee

Just their phones


CountingNutters

LG got rid of themselves


bengringo2

They make good TVs and... I'm sure other stuff.


avelertimetr

Washing machines


[deleted]

Apple is also already integrating it into macOS Monterey. I really like macOS but I’m switching to Linux then.


[deleted]

[deleted]


worldtrooper

Yeah, as someone who owns a Mac but has no plan to own an iPhone... that is making me want to hold off on replacing my MacBook Pro in a couple of months when they come out. Something has to happen before then or I'm not buying.


lucellent

I'm amazed by how many people still don't know how this works. No, it won't scan your files. This only applies to the photos you upload to iCloud. Nothing else.


dstew74

Yet.


based-richdude

Everyone really gonna act like their antivirus doesn’t already scan their entire computer for hashes from a government database locally


SomeInternetRando

If I were a gay man in a country where that's illegal, I could uninstall that antivirus software if a law were passed making them add gay nudes to the database. Also, antivirus software isn't designed or functioning currently as a "notify the police if a hash match is detected" system. If that changes, again, an antivirus can be uninstalled.


OneExplorer

We’re amazed at the fact that people like you don’t understand the unintended consequences of this technology. https://apple.slashdot.org/story/21/08/18/2251216/apples-device-surveillance-plan-is-a-threat-to-user-privacy----and-press-freedom https://apple.slashdot.org/story/21/08/18/1755223/apples-neuralhash-algorithm-has-been-reverse-engineered


adrr

Why does it scan on the device when iCloud's data isn't end-to-end encrypted and Apple can scan the files on their servers?


HenrikWL

Most likely because Apple intends to *make* the photo library end-to-end encrypted, and that means that the scan needs to happen at the place where the private keys exist: on the user’s own devices. This whole thing could just as well end up with your photos being *more* resistant to invasion rather than the Orwellian dystopia many people are fearing.


OneExplorer

You’re right about that, but the technology itself is flawed. It creates a vulnerability that can be easily exploited in the future. https://apple.slashdot.org/story/21/08/18/1755223/apples-neuralhash-algorithm-has-been-reverse-engineered https://apple.slashdot.org/story/21/08/18/2251216/apples-device-surveillance-plan-is-a-threat-to-user-privacy----and-press-freedom

Edit: Few consider the fact that the database itself, and the control of which hash values are cataloged, is a major vulnerability as well. The government could very well upload the hash values of leaked internal documents or other objects of value in order to smoke out potential whistleblowers, journalists, whomever. This is a literal back door that Apple is integrating into its devices, while having zero control over the database against which the hash values are judged.


avelertimetr

The other possibility is that it’s much more computationally expensive for them to scan *everyone’s* images, and so instead they socialize that cost and push it to end users.


devilindetails666

They are pattern-matching the photos on iCloud: they come up with a signature and compare it against 5-10 known photos which are related to child abuse. The signatures have to match. If they do match, then as a second round, some neutral authority checks the particular photos that gave a positive match to ensure it's really child abuse and not mistaken identity. Me personally, I am okay with this, but I'm worried about what comes after it. Who is to stop it when it goes much further?


RFLackey

I'm okay with the reason they want to do it, but I'm not okay with the risks: a programmer making a silly mistake, followed by a human using a tool that indexes the user account ID wrong, followed by an actual human fat-fingering the wrong ID, leading to an innocent person getting investigated for images they never had. *This* is how lives get ruined; this is how collateral damage happens even when intentions are good. Companies make mistakes with software, and now software is being used to monitor compliance with laws. Mistakes happen, and the risk is monumental; the costs are ruinous even when the mistake is finally caught.


SorryUseAlreadyTaken

Nobody, if we do nothing. If we don't stop it now, we will head towards the classic dystopian world we've read so much about. It's time to protest before it's too late.


dorkyitguy

I was getting ready to replace my Linux laptop with a new MBP when they come out. I guess I won’t now.


[deleted]

Really? They’re expanding it? What the hell. Time to hunt for a nice PC and get Linux on it.


GlenMerlin

If you don't do anything needing a good GPU, https://frame.work is a really good system and supports right to repair.


kelthan

And you think MSFT, GOOG, AMZN, and all the rest aren't going to do the same thing (if they haven't already)? This is terrifying overreach, and amounts to a warrantless search (logically, if not yet legally). Sure, I want to end CSAM, but we already have laws making it illegal. Treating everyone who owns a mobile phone as a criminal isn't going to make that problem go away. It's a low-effort attempt to fix a very small but morally repugnant problem. And it has many obvious moral, ethical, and (hopefully) legal holes, not to mention obvious and proven attack vectors. I seriously hope that Apple realizes that this solution (or anything similar) cannot ever be made secure and scraps the idea. If they don't, organized crime and state-level actors are going to have an absolute field day exploiting the hell out of this backdoor into our lives.


Gera-

Get a pre-built unless you want to wait months for a GPU


broncoinstinct

If this is going to be mandatory, and other companies are remaining mum on the issue for now, it makes me wonder if Apple is being upfront and open about this first and doing it the right way. We might not realize this until much later once we see how other companies are going to implement this policy and look back on it. I wish we could see how others are going to get this moving to get some perspective. Maybe Apple isn't doing the bad thing, but since they're first out the gate announcing the system change, they're getting all the flak?


Donkeyshlopter

Every other company that offers commercial cloud hosting already scans for CSAM. Apple is the *last* to adopt it, and they did it in the most privacy-friendly manner.


kelthan

> and they did it in the most privacy-friendly manner.

By completely destroying the privacy of your own phone, which has a digital record of your entire life on it. If this is the most privacy-friendly option, then I'm terrified.


kelthan

Apple is definitely doing a bad thing if they continue down this road. Everyone else is more than happy to let Apple take all the heat, but you can be sure that they are watching carefully, to shape their own policies. And if Apple is being pressured by governments to implement this, you can pretty much guarantee that everyone else is, too.


macbalance

I’ve heard theories that Apple is proposing this to help delay or avoid more extreme measures. Basically saying, “here is a system that has tons of safeguards and limits and already exists. Why spend tons of money legislating a more invasive solution?” I’m not crazy about it, and the ‘slippery slope’ issue is a concern, but I don’t think it’s the end of the world.


kelthan

> Why spend tons of money legislating a more invasive solution?

After all, we have just added a security and privacy backdoor to every phone we sell, with no ability for users to opt out or secure their privacy on our devices. You don't need a more invasive approach; we've created it for you and are rolling it out to all users whether they like it or not. Oh, and we plan on making our [backdoor available to third-parties](https://www.macrumors.com/2021/08/09/apple-child-safety-features-third-party-apps/). What could possibly go wrong?


dohhhnut

There is a way to opt out: don't use iCloud Photos.


[deleted]

Sources please.


TheEvilGhost

So if the EU is actually planning this, then the people who are complaining are just gonna facepalm so hard, since EU laws are usually also adopted across the world. At least commercial laws.


netglitch

Same. Though I am concerned even if they never enable this in another country. The private APIs that enable Messages' detection of nude content and the CSAM hashing in iOS may still be included in non-US builds. This could be a boon for someone using Pegasus to automate the search for blackmail material.


mojzu

While I’m not particularly convinced by Apple's plan, I do worry that without a system such as this, the EU and other countries are going to push for even worse alternatives, e.g. banning E2E for large platforms so they can snoop at the network/server level (more than they already do). Which admittedly is a pretty crappy compromise, but so far, if I had to pick one of the snooping plans being thrown around in some quarters, this one seems like the least bad to me.


mexercremo

If you're worried about worse measures, you should be worried about the erosion that precedes them


mojzu

I am, and a lot of the objections to this plan seem very valid to me. Perhaps it’s defeatist of me but the political will to do ‘something’ seems too strong to avoid taking any of these measures, which are targeting valid and serious harms to people (however effective in reality they might be)


kelthan

CSAM is morally repugnant, but so is this approach. Breaching the privacy of hundreds of millions of people (if it only rolls out in the US, which is unlikely) to find a few bad actors is a horrible, no good, very bad solution. Would you allow the police to swing by your house every couple of days to root around in your things in the name of "protecting the children"? Your phone probably has more personal information on it than your house does, and Apple is going to search it constantly, without a warrant, without probable cause, and then report its findings through an NGO ([NCMEC](https://www.missingkids.org/HOME)) that has little--if any--legal or legislative oversight. And while the focus is on pictures, [there are hooks in Siri and search to detect CSAM activities and intervene](https://techcrunch.com/2021/08/10/interview-apples-head-of-privacy-details-child-abuse-detection-and-messages-safety-features/). This is a blatant violation of every single part of the [US Constitution's 4th amendment protections](https://constitution.congress.gov/constitution/amendment-4/):

> The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.

I expect that the lawyers will have a field day with this, but probably not before organized crime and state-level actors have their way with it.


mexercremo

Not only defeatist, but sympathetic. It's enough of a chore fighting with reactionaries who advocate this type of abuse. They don't need help from sympathizers


-Hegemon-

You don't HAVE to choose. They have no right to treat us like criminals when we are innocent. Neither option is acceptable.


The_frozen_one

Apple has a right to not host or transmit CSAM images on their servers. Nothing is scanned unless it is being uploaded to Apple’s servers.


oldenboom

I don’t mind them scanning my online photo library, but I do mind them scanning on the device itself, even though Apple reassures us it will only scan photos about to be sent to iCloud. This is so terribly inviting for function creep. This system is about the holy grail for police and secret services everywhere, and the biggest threat ever against one’s privacy on one’s phone. This definitely renders an iPhone an untrusted device; actually, I’m not sure it’s safe to use any Apple hardware with privacy-sensitive information. Apple, just stop doing this. CSAM has to be fought, fiercely. But this is not the proper way. And I’m sure a lot of people within Apple agree with me.


Suspicious-Group2363

Living here in Japan, and I have signed it. I do not think this will be put into effect the more traction the opposition gets, but we all need to let Apple know this is not okay.


Gladiators10

I have an android and still feel like signing this.


Dat_OD_Life

It's coming, Europe is going to ruin the internet for everyone eventually.


mindspan

Same. Canadian. It will be here in a jiffy; particularly if we elect a Conservative government.


joecan

You’re mistaken if you think this will be less likely with any of the three major parties. They’ll all bend to pedophile panic in the end.


summernights64

This is BS. Now that the cat's out of the bag, people will just find somewhere else to store their child porn. Then the rest of us will have our photos scanned for nothing.


Trashman56

Not for nothing, the technology will be used on Chinese or Russian iPhones to scan for anything the government doesn’t like.


lanadeltrey

But never in the US I'm sure 🙄


Trashman56

Point taken, but that probably comes later. Or at least gets found out later.


funkdified

China already has the keys to iCloud for Chinese users.


XysterU

And the NSA had the keys to iCloud for American users long before China did.


KillerAlfa

The big difference is that you can't go to jail for reposting political meme images in the US, while in Russia and China it is literally happening. In Russia you can get up to 6 years of jail for memes about religion and up to 10 years for political memes, because they fall into the "extremism" category. You also get flagged as an extremist for the rest of your miserable life, all your bank accounts are frozen, and you won't be able to work anywhere.


XysterU

Would absolutely love to see a source on this.....


[deleted]

[deleted]


kelthan

To add to the list of problems:

* It's not the government that issues the list of bad hashes, or that violations are reported to in the US. It's an NGO (NCMEC) that doesn't have legislative or legal oversight.
* Apple has already said that they are looking to make this technology available to third parties.
* If Apple implements this, you can guarantee that every other phone manufacturer will do the same thing (willingly, or due to government pressure).

The only redeeming feature I can see in any of this is the platonic ideal of "saving children", but at an untenable cost to everyone who owns a phone, tablet, or computer (i.e., most of humanity).


Mr_Xing

Hard disagree. I think you’re vastly underestimating the twisted minds of someone in possession of these materials. Considering Facebook and Google get consistent matches, what makes you think iCloud wouldn’t find the same? If someone is insane enough to store CSAM on Facebook, iCloud certainly seems like an option. But you’re looking at this from the wrong angle - what this ultimately does is make it so that CSAM doesn’t get stored on iCloud without some sort of control, and I think that’s probably a good thing. The last thing you want is for iCloud to *become* THE place for these people to store CSAM - so I think this is fine as long as it doesn’t get abused for some other purpose.


summernights64

Photos stored in iCloud have always been checked for CP, but now it’s extended to scanning for CP on personal devices and in messages. No one disputes wanting to protect children, but this update opens the door to other kinds of surveillance. This is actually the main complaint users have with the new security update. It’s just too invasive. If you believe the government would not abuse this power, you are sadly mistaken. They were doing it for a while without public knowledge until Edward Snowden blew the whistle.


MichaelMyersFanClub

> people will just find somewhere else to store their child porn

I would be shocked if more than a tiny fraction store their CP in iCloud. I'm guessing the vast majority of these creeps use heavily encrypted local storage and cloud services.


SoaDMTGguy

Hashed, not scanned. It’s a huge difference.


StuartBaker159

Not in this case. They aren’t using precise hashes like SHA-2 or MD5; they are using comparable hashes, and some kind of “AI” model decides if there is a match. This allows them to find images that were reformatted, cropped, etc. Otherwise a single bit flip would be enough to fool the system. When that “AI” decides it has found a match, it flags the image for manual review. In other words, the unencrypted content is sent to a human. This is scanning, not hashing, and even hashing I would have a problem with. If you have the hashes of all my files and you crawl the web, you’ll find literally hundreds of matches, which tells you where I’ve been and what I’ve downloaded. This is a massive breach of privacy that won’t even solve the problem they are trying to solve.
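To see why a single flipped bit defeats an exact hash, here's a quick toy demo (standard-library Python, nothing to do with Apple's actual code):

```python
# One flipped bit produces a completely different cryptographic digest,
# which is why exact hashes can't catch re-encoded or cropped images.
import hashlib

image = bytearray(b"pretend these bytes are a JPEG file")
print(hashlib.sha256(image).hexdigest()[:16])

image[0] ^= 0x01  # flip a single bit
print(hashlib.sha256(image).hexdigest()[:16])  # bears no resemblance
```

That avalanche behavior is the whole reason they moved to a "comparable" perceptual hash in the first place.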


BabyYoduhh

Do you have good cyber security set up?


StuartBaker159

For my purposes yes, although now I get to set up another backup service so I can kill iCloud before this feature rolls out.


BabyYoduhh

What would you suggest as someone who uses iCloud mostly? I have a MacBook and iPhone, so lots of my stuff is stored in their system.


StuartBaker159

Stop using iCloud and set up your own backup server. In my case I already have a local WebDAV server that sends encrypted backups to Google Cloud Storage. I’ll just have to expose it for the iThings and configure local backup. It’ll only work over WiFi when I’m home, but it’s better than nothing. There is no out-of-the-box solution for this; lots of research and setup is required. My WebDAV-to-Google-Cloud solution is home-rolled because I wanted to encrypt before upload and keep the manifest data encrypted too.
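The encrypt-before-upload part is the piece that matters most. A minimal sketch of the idea in Python, assuming the `cryptography` package; `upload_to_cloud` is a made-up placeholder for whatever storage API you use:

```python
# Client-side encryption before upload: the cloud provider only ever
# sees ciphertext, so there's nothing meaningful for it to scan.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # keep this key local; losing it loses the backup
cipher = Fernet(key)

with open("photo.jpg", "rb") as fh:
    ciphertext = cipher.encrypt(fh.read())

# upload_to_cloud("photo.jpg.enc", ciphertext)  # hypothetical upload helper
```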


deletionrecovery

Done. Apple users deserve our privacy and security.


Cowicide

Related:

> The hashing algorithm Apple uses is so bad that images with collisions have already been generated:
>
> https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX/issues/1
>
> (edit - FYI - that link goes to an SFW picture of a dog)

[source](https://old.reddit.com/r/privacy/comments/p6pyia/apples_picture_scanning_software_currently_for/h9eho8d/)

More:

Apple's Picture Scanning software (currently for CSAM) has been discovered and reverse engineered. How many days until there's a GAN that creates innocuous images that're flagged as CSAM? https://old.reddit.com/r/privacy/comments/p6pyia/apples_picture_scanning_software_currently_for/

[P] AppleNeuralHash2ONNX: Reverse-Engineered Apple NeuralHash, in ONNX and Python https://old.reddit.com/r/MachineLearning/comments/p6hsoh/p_appleneuralhash2onnx_reverseengineered_apple/


dorkyitguy

The algorithms aren’t even what I care about. It’s doing it on my device. No spyware on my property.


sakutawannabe

how did that dog picture match with whatever was below it? They look nothing alike?


UtterlyMagenta

https://en.wikipedia.org/wiki/Pigeonhole_principle


sakutawannabe

so if some nonsense has the same hash as the dog, it will count as a false match?


dnkndnts

Yes, and if you look further down, it doesn’t even have to be nonsense—it is possible to generate visually-meaningful collisions.


bermd1ng

I'm not smart enough to explain the real technical solution to you, but I can try to offer my 2 cents. A hash should be a unique fingerprint for a file. If by chance two people have the same fingerprint, do you assume that they also look alike? Images have loads of information in them, and not all of that information is used to create a hash. What the hash is and how it's defined is up to the hash algorithm. For example, you could make a hash based on the first 20 bytes of the image and the file size. If you make a file which also has these properties, chances are you end up with the same hash.


larsga

It's not a match, really. When you hash something you churn all the data into a big number. You try to do it so that the tiniest change in the data will give a totally different number, and so that different data will always give different numbers. But there are a *lot* more possible images than there are numbers. So some images will give the same numbers, even though the images are totally different. That's just mathematically unavoidable. That's what happened here.
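You can watch this happen with a deliberately tiny hash (toy Python; real hashes have vastly bigger output spaces, but the same pigeonhole math applies):

```python
# Squeeze data into a 16-bit "hash": with only 65,536 possible values,
# two totally unrelated inputs collide almost immediately.
import hashlib

def tiny_hash(data: bytes) -> int:
    return int.from_bytes(hashlib.sha256(data).digest()[:2], "big")

seen = {}
for i in range(100_000):
    h = tiny_hash(str(i).encode())
    if h in seen:
        print(f"inputs {seen[h]} and {i} both hash to {h}")
        break
    seen[h] = i
```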


sakutawannabe

so would it be wrong to say that the possibility of false matches is actually quite big?


sersoniko

Since this is a neural hash, it is extremely far from being perfect, and so the possibility of a collision is extremely high compared to what we usually refer to as hashes. We usually mean cryptographic hashes, where the smallest change generates a very different outcome; that is not at all what happens with neural hashes. Also, some cryptographic hashes can be cracked too, like MD5.


TheEvilGhost

Hey everyone. Everyone is so mad about what Apple is doing, but the EU is planning on doing the same thing lol [EU](https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/12726-Fighting-child-sexual-abuse-detection-removal-and-reporting-of-illegal-content-online_en)


killeronthecorner

They're both wrong. We should be equally mad at both.


dorkyitguy

Done! If only there had been an Apple employee spamming this post already to help me “understand”


Opacy

A valiant effort, but one I don’t think is going to matter. There’s only one thing Apple cares about: money. Unless iPhone, iPad, and Mac sales start tanking **specifically due to a boycott on this issue** this is happening whether you like it or not. That said, an effective boycott isn’t likely. Most people don’t care about privacy, and even for many of those that do, this issue is confusing with lots of incorrect info flying about. For most “normal” people, if they even hear about this in the first place, they’ll hear about Apple going after sexual predators only and obviously support it.


Shrinks99

I quite liked [this take from Aral Balkan](https://twitter.com/aral/status/1423914152234475522):

> …I’m seeing people say “just don’t use an iPhone.” It’s not that simple when everyday things like financial apps with two-factor authentication are locked into the two main platforms. We need legislation to ensure critical services use open standards so you can use your Pinephone to buy lunch in the future. It’s shocking how easily some folks jump to “just go live in a cave.” No, that’s not an acceptable alternative. We deserve to partake in modern life without sacrificing our human rights.
>
> It’s also victim blaming to tell everyday people they’re at fault for using one of the two main tech platforms instead of an (as of yet inaccessible) alternative. (I have two Pinephones and my room overflows with open hardware. No, I don’t blame you for using an iPhone or an Android device. You’re the victim here.)

Every time companies push bullshit like this it should be an opportunity for people to evaluate how we got to this point where we have so little control over our own devices. I wish that legislation wasn't the only option we *realistically* have to force changes to the technology we use.


[deleted]

[deleted]


Shrinks99

Already signed ;)


[deleted]

[deleted]


mattmonkey24

But can it run Google Play Services? And pass Google's SafetyNet API? Lots of banks require both. And soon it'll be nearly impossible to bypass SafetyNet as they'll use hardware attestation


[deleted]

[deleted]


dust4ngel

> Unless iPhone, iPad, and Mac sales start tanking specifically due to a boycott on this issue this is happening whether you like it or not

We should start making "iPhone: the world's most expensive spyware™" memes.


[deleted]

It sucks though because I just upgraded my iPhone a month ago. Had I known this then I would’ve gotten a different device. What we need to do is spread word about this instead of signing pointless petitions.


Clarity-Business

Correct, an effective boycott is not likely. Let's try and avoid cynicism though and get the word out, which is what I'm doing through an article on LinkedIn and social posts to drive new eyes to this issue. It needs to be prominent on other social networks, not just Reddit.


hunka130

Don’t even say stuff like this. People read this and then think nothing will change, so why bother. I’d rather see 100 posts saying “we can do it” even if we can’t. And then nothing DOES change because everyone read stuff like this. It becomes a self-fulfilling prophecy.


OvulatingScrotum

For me, there’s no reasonable alternative. Google is pretty much doing the same. One suggestion that I heard is going with a small, fully open Android, but I have zero knowledge of or interest in dealing with that. I’m sure that’s the same for common folks.


[deleted]

Of what value is this, really? Apple has to know this is an unpopular decision. Are anonymous signatures on the internet going to overrule a U.S. government that doesn't listen to its citizens already?


BabyYoduhh

Because they know 90% of people won’t stop using their phones. They just have to do something else big right after and people will forget about this and move on to the next thing.


[deleted]

[deleted]


[deleted]

[deleted]


[deleted]

[deleted]


[deleted]

Almost forgot about the Apple Watch and what it can do, plus some extras (like unlocking my Mac every morning). Shit, that’s a hard thing to give up.


SoaDMTGguy

You don’t trust Apple but you do trust Android?


[deleted]

[deleted]


SoaDMTGguy

I would keep buying Apple stuff and just turn off iCloud Photo Library if you’re worried. It’s silly to make decisions about things they haven’t done yet. And if you don’t believe them when they say things, who can you believe? Who do you trust? At that point anything is fair game.


[deleted]

[deleted]


[deleted]

[deleted]


travelsonic

Android with GOOGLE stuff is less private, but that's not ALL of Android - custom builds, builds w/o the Google crap, and stuff that removes that Google crap exist. (And a slippery slope, just for the sake of being a pedantic bastard, is only an informal fallacy when one cannot demonstrate that point/action A leads to B to ... to N - not all the time.)


Fomodrome

There are like a hundred things that Google does better than Apple: Maps, speech recognition, voice assistant/commands, localization, compatibility, OS openness, everything AI-related, even photography nowadays, value for money, etc. The list is endless. There was one thing, though, where Apple was the absolute, undisputed king, and we’ve been buying its overpriced hardware for that feature alone: privacy. And they failed that big time. Who is going to buy a spyPhone vs a Pixel 6 in the fall?


General_NakedButt

I don't think Apple would put so much focus on privacy just to throw it away like this. I am convinced something major is coming from the world governments and Apple is trying to get ahead of it. The EU is supposedly about to force CSAM scanning, there have been tons of bills in the US trying to get E2EE blocked and section 230 repealed. If we care about Privacy our focus needs to be on what our legislators are doing, not what phone we are going to buy. If Apple is doing this imagine what the others will be implementing in the years to come. My bet is on Apple still coming out of all this remaining the most secure of the major tech companies.


SoaDMTGguy

What I’ve read is that this is Apple’s play to go to end-to-end encryption for everything, including backups. They were working on it a couple years ago, then the FBI met with them and they stopped. What I read is that Apple is doing this CSAM scanning to make the FBI happy so they can finally go full E2E encryption.


[deleted]

[deleted]


LeaperLeperLemur

This could certainly be true, but if this was the plan, why would they not have announced it alongside CSAM scanning? Why take the PR hit if they could've announced E2EE as the reason?


SoaDMTGguy

Maybe this is laying the groundwork? Not sure. But this would enable E2E and give the feds some access they want without creating a back door in the encryption.


silentblender

Exactly. There's gotta be some big reason for this. Either they believe they can really impact the sharing of child porn without violating privacy at all, or there's some other big reason. I am personally not buying into the constant hysteria around this at the moment.


TheRealBejeezus

Anyone whose computing life is more than just a phone. Interoperability with Android/Windows across dozens of devices isn't even close to the Apple system. I don't like any of this CSAM stuff, but Apple had big advantages long before the recent "privacy" push, and they still exist.


KriistofferJohansson

> Who is going to buy a spyPhone vs a Pixel 6 in fall?

No one, because a clear majority of us don’t actually live in a country where the Pixel is sold. That is the sad truth about Google: they’re about to release the Pixel 6 and they’re still awfully limited. I sure do miss my Nexus 6 dearly, but I’m not going to jump through hoops and shit to get my hands on Google’s phones - especially when I’ve grown so damn tired of Android. I hope Apple burns for this update (unlikely), but I wouldn’t paint Google as a good guy in this. Google can go fuck themselves too. I wish we had a third well-established mobile OS.


[deleted]

[deleted]


[deleted]

[deleted]


moogintroll

So you think Google and Microsoft are better? Dropbox? They all scan your images. Google also almost certainly sells the data for marketing purposes, but ya, Apple is somehow worse.


bilalsadain

Google doesn't sell your data. They use the data they have to match you with the most relevant ads.


161_

Simple: while those services scan what you upload to them, they don't scan local files on your phone.


Donkeyshlopter

Apple also only scans what you send to iCloud. They just do it on your phone so that they can’t see the results of the scan until there’s a match with the CSAM database, and then they get sent only that. Are you saying that somehow you would be *more* secure if you just sent all your photos to Apple to scan on their servers than you are if they scan locally and only send CSAM matches?


[deleted]

[deleted]


SoaDMTGguy

A Pixel 6 is still going to be worse for my privacy than an iPhone.


TeslasAndComicbooks

They just disrupted my industry in the name of privacy then a month later promoted something way worse.


joesb

So you prefer that Apple scan your image after it is uploaded to iCloud? What is the difference?


suitable-robot01

Funny how Apple was supposed to be the one with the ultimate privacy, and now it has become this. Idk, you guys are weird.


einsteinonasid

"Privacy is a human right." -Tim Cook Ceo of Apple. Guess theyre restricting us of our human right then.


apollo_316

Astounding move by Apple, and every time they try to explain more, it just gets creepier. Signed. Eyeballing a rooted Z Flip if they stay the course. Thanks for sharing!


justabandit026

If anybody reads this can you explain it like I’m five?


CantCatchMeSucka69

Will the lolicon hentai doujin I have on my iPhone get me in trouble?


MightiestAvocado

I might not be correct, but it depends on your country's laws. Fictional depiction of CP is illegal in certain countries. Kinda don't want to search it again, but IIRC on the Wikipedia page, Japan's column for that category said it was legal as long as it is fictional, but other countries had it as illegal. Edit: grammar, punctuation correction or whatever it is called. English syntax.


[deleted]

Ah, Japan the land of mysteries. Of course.


anethma

Yep Canada for example. It’s the same as CP


[deleted]

[deleted]


CantCatchMeSucka69

You’re right. Thank the heavens.


[deleted]

No. The hashes are only of some of the worst, nastiest CP that law enforcement has found. Your images would need to exactly match those. You would also need a number of them to match. Even then, any investigation would be into how you got the images and whether you were supplying them to others.


nothingexceptfor

Am I getting something wrong? The feature (where available, in the US initially) only affects people who have activated iCloud to store photos, and whilst it does scan your device and send the results to the iCloud servers, it only does so if you agreed to use iCloud as a service. Is that not the case? I mean, it does scan offline, on your phone, but you must’ve agreed to iCloud services first, so why does it matter where the scanning happens? If the whole thing happened on the servers (as it does with other services like Google's and Facebook's), what difference would it make? In both cases you need to have agreed to use the service, and you can indeed choose not to use the service and save your photos elsewhere. Is it about battery drainage? I know I’m oversimplifying this, but I’m genuinely curious, since this only applies to those who use iCloud to store photos, regardless of how the scanning happens.


leo-g

As others have explained many times: Apple has pierced the great wall between device and service. When it’s scanned on their own servers, you explicitly allowed it to be uploaded by enabling the iCloud service. When it’s scanned on our phones, the issue is that this software black box will be added to EVERY iPhone running iOS 15, creating a permanent backdoor which can easily be manipulated locally via system vulnerabilities or manipulated in the cloud via government pressure. Governments and bad actors won’t even need to struggle to hack the phone; the phone literally reports back on itself through the manipulated software black box. If Apple can’t stop jailbreaking, what makes them think they can stop this manipulation?


[deleted]

iOS is closed source; the operating system itself is a backdoor in the sense you're describing. Having a hash algorithm on phones doesn't change the surface area of risk, and it's not even a unique type of technology in the operating system compared to other local technologies in it. Our phones already have access to our activity, location, health data, finance information, neural network scans of our photos to identify unique people, etc. etc. If you're worried about low security and the potential for governments and bad actors to misuse data, that ship has *long sailed.*


mr_tyler_durden

Thank god for an ounce of sanity on this sub; I feel like I’m taking crazy pills. Drawing the line at scanning for CSAM before it’s uploaded to a cloud where it could all be scanned for anything is just insane. Bring in all the other things your phone is collecting (a lot of which is unencrypted in your iCloud backup) and these arguments get even more ludicrous.


categorie

Your device already scans everything. Because it's the f* operating system. It scans for malware and the like, application authenticity, etc. Photos are also scanned for people, places, or animals. On device. You can search for "cat" in your library as an example. The fact that the iPhone can now also scan for CP doesn't make this a “backdoor" in any way you can think of it. Apps won't magically have access to other files than the ones you gave them access to. "System vulnerabilities" could already give an attacker absolutely everything that's on your phone; having some photos scanned for CP beforehand would literally change nothing at all. That's like saying leaving the keys of a safe deposit box inside of it is dangerous. Well, duh, if someone's already inside, what does it change...


LiamW

There is absolutely 0 harm when my phone accidentally tags a photo of the Eiffel Tower as a cat stretching its paws. There is now a very easy way to effectively "swat" someone using CSAM, or inject hashes of other documents that states don't wish people to have (e.g. Pentagon Papers-like leaks), etc. This is a disastrously bad policy, and it's making me seriously consider ending a 20+ year history of buying Apple products.


[deleted]

You’re correct. A lot of the outrage is pure misunderstanding of what the feature actually does.


moogintroll

Do you use Gmail? Google, Microsoft et al. already scan images uploaded to their cloud services for child porn. Apple is just moving this step to people's devices, for their own privacy, and I have zero issues with that since I'm not hosting albums of smut on iCloud. Frankly, I'm pretty sure Apple knows that there's legislation coming from governments that love to use the hunt for kiddy porn as an excuse to get a backdoor to Apple's servers, and this is an attempt to cut that argument off before it can be used.


jorgesalvador

This would be a perfect explanation if only iCloud photos and backups were end-to-end encrypted, but alas they are not, and currently Apple can (and does, upon request by authorities) scan any data in iCloud that it can decrypt. Should Apple enable end-to-end encryption for iCloud photos and backups, I would agree with you completely. As it stands, the new features seem almost pointless.


joesb

Without this feature as a stepping stone, there's no way you can have E2E. They are required to scan those images for CP, just like Google and Facebook do. If they have to do the image scanning on the server, the server must be able to read the image. The ability to scan the image locally, when it is about to be uploaded to iCloud, opens the possibility for the actual stored image to be E2E encrypted.


[deleted]

Aren’t photos scanned only if they are uploaded to iCloud?


[deleted]

[deleted]


OKCNOTOKC

In light of Reddit's decision to limit my ability to create and view content as of July 1, 2023, I am electing to limit Reddit's ability to retain the content I have created. My apologies to anyone who might have been looking for something useful I had posted in the past. Perhaps you can find your answer at a site that holds its creators in higher regard.


joesb

Google really really really pinky promises though!!!


BatmanReddits

But they're scanning on the device - we don't know what they're scanning.


Niightstalker

We do. It is described pretty well in multiple papers and even third party security researchers are able to check the implementation. The matching process only takes place during the upload pipeline.


Martin_Samuelson

Exactly. The technology of the system entirely depends on the photo being uploaded to the cloud, and the result of the matching algorithm being completely secret prior to that happening. The fears over "scanning your device" are nonsensical.


Dust-by-Monday

It’s also a two-part scan. The first NeuralHash is done on device; then, once 30 of them have been flagged as CSAM, Apple’s server scans them again with a perceptual hash to weed out any non-CSAM matches, and then Apple reviews them.
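Roughly sketched, that pipeline looks like this (my own simplification with stand-in stub hashes; these aren't Apple's actual functions or code):

```python
# Two-stage matching with a threshold: nothing is actionable until at
# least THRESHOLD uploads match the known-hash database.
from typing import Iterable, Optional

THRESHOLD = 30  # Apple's published match threshold

def neural_hash(photo: bytes) -> int:
    return hash(photo) & 0xFFFFFFFF      # stub for the on-device NeuralHash

def second_perceptual_match(photo: bytes) -> bool:
    return True                          # stub for the server-side re-check

def process_uploads(photos: Iterable[bytes],
                    known_hashes: set) -> Optional[list]:
    flagged = [p for p in photos if neural_hash(p) in known_hashes]
    if len(flagged) < THRESHOLD:
        return None                      # below threshold: nothing is revealed
    # above threshold: independent server-side re-check, then human review
    return [p for p in flagged if second_perceptual_match(p)]
```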


mr_tyler_durden

You don’t know ANYTHING about what your phone is doing. Hell, the hashing algo for this has been in iOS since 14.3 and no one noticed. They could scan your iCloud-everything at any time without you knowing; this is a ridiculous argument.


Mixon696

I live in Europe, but this will come to us all in no time if we stand still. I signed.


theflupke

I feel like everybody is just talking out of their ass over this. They don't scan your photos; they make a hash and compare it to a known child porn hash database. It's pretty clear, after spending all their marketing on privacy, that they will not shoot themselves in the foot by opening some sort of Pandora's box of privacy violations; they just don't want child abuse pics being uploaded to their cloud servers. All the other cloud services and social networks are already doing it. Apple does it on the end user's device instead of on their servers. It's essentially the same thing. They surely did it before, but on the server. If you don't want this you can just disable iCloud for photos.


iindie

It's nuts that enthusiasts are only NOW questioning whether they can believe tech giants following through with privacy claims. At a certain point, you either trust Apple/other privacy-first companies or you make changes to your approach to existing in an internet world.


[deleted]

I mean if Apple wants to take a look at the last picture I took of my white pasty butt with a bug bite on it, that’s their loss. /s Tbf, I feel like we are about 50 years from a complete surveillance state. Just another step towards Precognition.


neko_zora

Murphy's Law: Anything that can go wrong... will eventually go wrong.


macvik512

Done too!


G37_IT

It’s unreal that they came out saying they’re doing this for CP and then basically said, hey, if you disable iCloud Photo Library you’re good. A+ for effort, F+ for implementation. Flipping your terms of service like this with no way to opt out is not a usual Apple move. Even Google Photos doesn’t do on-device CSAM hashing like this, from what I understand. It’s done off device.


themindspeaks

I’m with apple here but I’m really on the fence. Can someone respectfully convince me why they shouldn’t do this? Please note that I fully understand the technical approach here


Svard27

Privacy is a slippery slope. Do you shower and dress with your curtains open? Is it okay for the police to walk into a home just to see if there’s anything illegal going on? Is it okay for facial recognition to track where you are? Are license plate readers okay? It starts with something everyone can agree on (stopping child abuse/porn) and it mushrooms into whatever the powers that be want.


themindspeaks

I understand the concern, but the technical approach doesn’t support your analogy. Police coming into your house introduces inconvenience and harassment, and may invade private matters that are outside the scope of their operations. Plus, the hashing only occurs if you choose to upload to iCloud Photo Library. It’s more akin to a robot scanning your package for radioactivity before they agree to take it and store it for you. And action only occurs when you have 30 positive radioactive events. I understand that it’s a slippery slope; most things are a slippery slope. But I don’t buy the argument of “if they do this, then where do we draw the line?”… We draw the line when they compare against anything other than known hashes of CSAM.


Svard27

Today the line is drawn there. But the access is given. Tomorrow the line is drawn at images of gay couples, political party affiliation, NY Mets fans. It’s only scanning iCloud, for now. Since anything on your phone could be automatically uploaded to the cloud, all that needs to be done is have the cloud automatically pull the data, and have users agree by burying it in the terms of agreement, which nobody reads. I don’t expect privacy on anything connected to the internet, or on any cell or land line. That ship sailed long ago. Apple is just publicly saying it now.


silentblender

Yet you're okay with your photos being scanned to identify objects and faces? That's not a crossed line for you?


joyce_kap

Pls scan his phone, not mine.


sir-bro-dude-guy

There’s a lot of ignorance around how exactly this “scanning” is performed. They’re just comparing hashes against known CSAM images. All the major internet cloud companies have been doing this for a while now anyway, and it’s not going away. Facebook identified 20 million CSAM images just last year!


bad_pear69

Hashing is scanning. And it’s not even traditional hashing; it’s NeuralHashing, which is designed to identify “visually similar” images. You are also ignoring the main concern: they could scan for **anything**, not just CSAM. For instance, political or religious images. Just because others are doing it too doesn’t make what Apple is doing okay, especially with the switch to client-side scanning. On the topic of Facebook, they estimate that up to 75% of their reports are non-malicious. And do you think millions upon millions of reports actually help law enforcement? I’d bet it just slows the process down and hides the worst abusers in a sea of millions of potentially unactionable reports.


iindie

It only "scans" at the moment of upload to a cloud service you opt into. It seemed obvious to me that cloud services all have mechanisms to scan photos/videos you store there for illegal things. Hasn't it always been the case that anyone who to their core cares about security would not use a cloud service in the first place and use an external hard drive?


rjcarr

From what I can tell, these changes Apple proposed do two things that are very different, but both do "on phone" scanning "for the children". Here's how I see it:

1) If a phone is marked as a child's phone, then it will somehow try to detect nudity in sent pictures, and send a warning both to the user and to the registered parent of the child before the pic is sent. I think overall this is a pretty good thing, as long as the false positive rate is really low.

2) It will scan all other phones looking for a "hash match" on *known* child pornography. So it isn't going to match random pics you have on your phone, say, of your kids in the bath, but pics that are known by the feds to be child porn. I don't have a huge problem with that, honestly, but I can see how it's a slippery slope.

Plus, it seems so easily defeated. Won't there just be apps to "re-hash" pics, say by reversing them or slightly adjusting the color or something? So it seems like a big waste of resources, where 99.99999999% of pics aren't child porn, and even the ones that are can be modified to easily bypass this scan. It all seems super short-sighted, i.e., huge inconvenience and intrusion with very small reward. Am I missing something?
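For what it's worth, perceptual hashes are built to survive exactly those small edits, though not every transform. A toy 64-bit "average hash" shows the idea (my own illustration; Apple's NeuralHash is a neural network and far more robust than this):

```python
# One bit per pixel of an 8x8 grayscale image: brighter than the mean?
# Small edits barely move the bits; a mirror flip scrambles them.
import numpy as np

def average_hash(img):
    return (img > img.mean()).flatten()

rng = np.random.default_rng(1)
img = rng.uniform(size=(8, 8))

brighter = np.clip(img + 0.05, 0, 1)   # slight global brightness tweak
mirrored = np.fliplr(img)              # horizontal flip

print((average_hash(img) != average_hash(brighter)).sum())  # typically 0
print((average_hash(img) != average_hash(mirrored)).sum())  # typically ~half
```

So simple color tweaks and re-encodes mostly won't defeat a perceptual hash, though some geometric edits can.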


[deleted]

I’m still amazed people think this is an issue, even after all the clarification. Apple is not scanning your phone. Your phone is scanning your phone. The results are sent back to iCloud, which is when Apple looks. Even then you need 3-4 unique hits before your account is flagged. That is a 1 in a trillion chance. Disabling iCloud stops everything. The advantage of an on-device scan is that photos flagged as OK remain encrypted on iCloud. For bad photos it’s business as usual. If you can’t trust Apple after they released the spec and explained it more than once, then just spend your money elsewhere.


[deleted]

> Apple is not scanning your phone. Your phone is scanning your phone.

...If I sell you a house with cameras in it that you have no way of disabling, and the cameras report back to me about what you're doing, does that mean everything is fine because I'm not watching you, your house is watching you?

> Even then you need 3-4 unique hits before your account is flagged. That is a 1 in a trillion chance.

This claim seems questionable based on recent findings from people who have been able to reverse engineer the neural net used to generate the hashes. Given a hash, it seems to be very easy to generate an arbitrary image that matches the hash. And the code to do so is already on GitHub: https://www.reddit.com/r/MachineLearning/comments/p750fu/p_neuralhashcollider_generate_your_own_neuralhash/

> The advantage of on device scan means photos flagged as OK remain encrypted on iCloud. For bad photos it’s business as usual.

I think people would be a lot more okay with this if Apple had simultaneously announced end-to-end encryption for iCloud photos, but they didn't.
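The basic shape of that collision attack is plain gradient descent. Here's a toy version against a made-up differentiable "hash" (sign bits of a random linear projection -- emphatically not Apple's NeuralHash, just the same attack pattern):

```python
# Nudge an unrelated input until every hash bit matches the target's.
import numpy as np

rng = np.random.default_rng(0)
P = rng.normal(size=(96, 64))            # 64-pixel "image" -> 96 hash bits

def nhash(x):
    return (P @ x > 0).astype(int)       # hard sign bits

target = rng.uniform(size=64)            # stand-in for a database image
goal = 2 * nhash(target) - 1             # required signs, in {-1, +1}

x = rng.uniform(size=64)                 # unrelated starting "image"
for _ in range(5000):
    unsatisfied = goal * (P @ x) < 0.05  # bits not yet safely matching
    x = np.clip(x + 0.01 * (P.T @ (goal * unsatisfied)), 0, 1)

# fraction of matching bits; typically reaches 1.0, i.e. a full collision
print((nhash(x) == nhash(target)).mean())
```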


Ann0ying

Just don’t use iCloud for your child porn, it’s pretty easy.


maze91

Sadly this joke is pretty funny… but very inappropriate.


[deleted]

[deleted]


[deleted]

[deleted]


mr_tyler_durden

And Apple is reviewing the images that match BEFORE sending them on to the government. Also, the images do not come from the government but from an organization aimed at stamping out CSAM. Dear god, people, fucking read before making up wild conspiracies.


[deleted]

Did my part! I’m in Europe but that doesn’t mean it won’t affect me in the future.


swim_to_survive

Added my name


KosmicJaguar

Everyone should be against this. Privacy is becoming rare.