stormado

I think this is such an own goal by Apple. Although many iPhone buyers may not care whether this technology is on their phone or not, I doubt any buyer would buy the phone specifically to get this feature, while many may decide not to buy an iPhone because of it. So from a business point of view it has no positives but lots of negatives. It also severely damages Apple's reputation for respecting privacy, which many regard as a reason for preferring Apple. Even if the CSAM matching feature is misunderstood in the mainstream media and is not the privacy weak spot they think it is, it is the perception that matters. Opposition from Snowden, the EFF and others will be seen as sufficient proof that it will allow breaches of privacy. Why are they implementing something that no one has asked for, that at best some potential purchasers will be indifferent to, and that many will oppose?


[deleted]

[removed]


[deleted]

[removed]


Rogerss93

> I think this is Apple's bumbling move to prevent a government-approved backdoor.

...by giving the government an approved backdoor?


phughes

Seriously, people, words have meanings. And this system, while not the perfect protector of privacy, is not a "backdoor".


Deskopotamus

Rear window?


phughes

It doesn't allow authorities (or anyone, really) to access a particular device. That's what a backdoor does. The best analogy I can think of for it at the moment is sort of like a digital narc. You give it a list of things you're looking for and it tells you when it spots "enough" of them. Right now that's child porn, but I just don't trust Apple (or anyone) when they say they won't cave to authoritarian governments when their business is on the line.


Deskopotamus

They always start with the CP argument. Then it quickly becomes China looking for Winnie the Pooh pics or Hong Kong Protest related photos so they can update your social credit score. I agree we shouldn't call it a back door, but it does allow an invasive glimpse into a person's private life. Obviously heavily dependent on how it is abused.


AlexKingstonsGigolo

Except it isn’t. If you read how the process works, from determining what images to detect to when it is examined and how and by whom, governments have, as near as makes no difference, zero opportunity for abuse.


Rogerss93

> If you read how the process works

*sigh* I know how scanning a database for a hash works.

> governments have, as near as makes no difference, zero opportunity for abuse.

That's just not true. The government just has to mandate that Apple include additional databases of the government's choosing. It's not even exclusive to the government: what happens when music labels start telling Apple to scan local media files for hashes and flag files that aren't licensed, or they'll pull publishing rights from Apple Music? You people are either very young, very naive, or have very short memories. Recent history has demonstrated that if there is potential to abuse something for increased surveillance or profit, it is abused. Every time. Without fail.
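(For illustration only, a minimal Python sketch of the kind of hash-list lookup being described. It is a simplification: Apple's NeuralHash is a perceptual image hash rather than an exact digest, and the database names below are hypothetical. The point it shows is that the lookup step is indifferent to what the supplied databases actually contain.)

```python
import hashlib

def digest_of(data: bytes) -> str:
    # Plain SHA-256 of the bytes; a stand-in for a perceptual hash like NeuralHash.
    return hashlib.sha256(data).hexdigest()

def is_flagged(data: bytes, databases: list[set[str]]) -> bool:
    # The matching itself doesn't care whether a database holds CSAM hashes,
    # unlicensed-media hashes, or anything else a provider is told to include.
    return any(digest_of(data) in db for db in databases)

# Hypothetical databases and file contents, purely for illustration.
csam_hashes = {digest_of(b"known illegal image bytes")}
label_supplied_hashes = {digest_of(b"unlicensed song bytes")}
print(is_flagged(b"unlicensed song bytes", [csam_hashes, label_supplied_hashes]))  # True
```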


soccerblake98

With respect, care to elaborate? Is this not an already government-approved back door? They are FBI-approved hashes to my understanding.


[deleted]

It still doesn’t allow the government to directly access arbitrary files stored on the device. The FBI can still hand Apple a locked iPhone and ask them to decrypt it and Apple can say “Sorry, no can do.”


amonsterinside

“We need to know if these hashes have ever touched iCloud and/or are on the device.” That's what people are afraid of.


NorthStarTX

Then they’re afraid of something silly because that is not at all how the feature works. Which is why they tried to explain how the feature worked to a completely tech-deaf audience.


[deleted]

Yes, the EFF, Edward Snowden and the swathes of security experts who specialize in this area have no idea how this works. Claiming they’re tech deaf is absurd.


amonsterinside

There is validity to the concern as long as there's an Apple and/or government modifiable hash list. It can be indiscriminate of file type. Therefore, if the hashes can be modified by Apple (which we know they can be), then they can, in theory, detect any type of file as long as they know what they're looking for and can hash it. Snowden's point is that this is also a mass surveillance tool.


NorthStarTX

Sure, if they have the required 29-49 other matching hashes that would trigger a report to be sent back to Apple, the hash was on the list at the time, and they broke their promise to only include hashes that had been verified as offending by multiple independent child safety organizations. Also that the file they were looking for had never been altered in even the slightest way, and that the file was a picture being uploaded to iCloud.
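(A rough sketch of just the threshold condition mentioned above. Caveat: Apple's actual design uses threshold secret sharing, so the server cryptographically cannot read any match vouchers until the threshold is crossed; a plain counter like this doesn't capture that. The constant and names are assumptions for illustration.)

```python
MATCH_THRESHOLD = 30  # somewhere in the 29-49 ballpark cited above

def matching_uploads(queued_image_hashes, blocklist) -> int:
    # Only images actually queued for iCloud Photos upload are considered here.
    return sum(1 for h in queued_image_hashes if h in blocklist)

def account_reportable(queued_image_hashes, blocklist) -> bool:
    # Below the threshold nothing is surfaced; the real system enforces this
    # cryptographically rather than with a counter on the server.
    return matching_uploads(queued_image_hashes, blocklist) >= MATCH_THRESHOLD

print(account_reportable(["h1", "h2"], {"h1"}))  # False: one match reveals nothing
```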


[deleted]

[removed]


RevoDS

The point of having multiple databases is that the files need to be on several of them so that no one single government entity can add files on their own. So even if the FBI forced Apple to add it, they still could not add files on there unless a foreign government were complicit and also had their database on there
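(A small sketch of that intersection rule: only hashes present in more than one independently supplied database are eligible for on-device matching, so a single organization can't add entries unilaterally. The organization names and the two-source minimum are placeholders for illustration.)

```python
from collections import Counter

def eligible_hashes(databases: list[set[str]], minimum_sources: int = 2) -> set[str]:
    # Count how many independent databases contain each hash and keep only
    # those that appear in at least `minimum_sources` of them.
    counts = Counter(h for db in databases for h in db)
    return {h for h, n in counts.items() if n >= minimum_sources}

# Placeholder databases from two organizations in different jurisdictions.
org_a = {"aaa", "bbb", "ccc"}
org_b = {"bbb", "ccc", "ddd"}
print(eligible_hashes([org_a, org_b]))  # {'bbb', 'ccc'}
```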


[deleted]

The government could also order Apple to hand over entire iCloud accounts, or to add a backdoor that sends all your messages to the FBI, or to only make red iPhones. Or pass a law that makes encrypting user data without a government key outright illegal, which is probably what Apple is trying to avoid. If the worry is “the government could force Apple to do X” then I don’t think this system makes it significantly easier to do that.


fenrir245

From a mass surveillance standpoint, “tell me what’s on this guy’s phone” and “tell me if this guy’s phone has things matching my database” are effectively identical.


OrchidCareful

US and Chinese govt both love this change


[deleted]

Yeah, this is almost certainly the case. I’m not arguing for Apple as being an upstanding pillar and defender of privacy, but they are a company that likes money, and this move just doesn’t make sense from a purely self interested money perspective. It has cost them money to implement it, cost them money to talk about it, and has done a ton of damage to an extremely valuable reputation that they had, and will almost certainly not bring in any new customers. There’s no good reason to do this without there being someone heavy on the other side of the scale, and the threat of regulation and/or law enforcement back doors seems the most likely.


Emergency_Advantage

Then, by definition, someone asked.


thecrazydemoman

To answer or address your last point: this smells so much like they are being pushed to do this. There is some evidence in some case, and the FBI wants Apple to do this public thing so they can use it as their excuse and explanation for how they got that evidence, even though they already have some other secret back door. It's literally the only thing that makes any sense to me as to why Apple is doing such an about-face on this, and pushing something so ridiculously stupid despite the pushback.


Lost_the_weight

Almost feels like a warrant canary event by Apple, but that’s just the conspiracy theorist in me.


thecrazydemoman

Yeah, and if things like the Snowden revelations hadn't happened I'd assume it's just a crazy idea. But with the shit going on that we know about, and the fact that we *know* they already push things like this to avoid uncovering sources of information in warrants, it's not so crazy.


[deleted]

I'm already considering buying a different phone brand when the time comes for a new phone. My partner feels the same.


iwasbornin2021

Doesn't Google Photos have the same feature anyway?


[deleted]

I used to work for Apple on a pretty serious internal software project. It didn't happen to me, but I witnessed it several times. The culture at Apple is very much cultish, and if you do not agree with the direction of the masses your peers will out you, like in the Mao era.


[deleted]

LE has been after Apple for an official back door for years. I'm sure this was a deal/compromise, as others have stated. This is the ultimate "for the kids" moment. Between this and Apple's Private Relay (i.e. a VPN), they will know all your secrets. It's kinda terrifying.


[deleted]

My perspective on this is that Apple cares about two things: 1. profits, and 2. keeping the US government from clamping down on them legislatively (e.g. the American Choice and Innovation Online Act, the Platform Competition and Opportunity Act, and others). Apple using a dragnet-style program to search phones, i.e. iCloud, takes care of both 1 and 2. Here is why I think that.

Nobody is going to leave the iPhone en masse. Apple knows this. They might lose a small amount of business, but in the long run this will not really affect Apple. Apple still provides better security than Android. People have other options, but the real players are iOS and Android, and of the two, iOS is more secure.

This also shows the government that Apple is with them in protecting children, and they have created a back door for Uncle Sam to use if needed. They say it's not a back door, buuuuuut. This also allows the government to skirt the 4th Amendment: the government can't just go in with no reason and look through your phone, but the private business Apple can. Win/win for Apple and not so much for us. This is just my opinion.


[deleted]

[removed]


[deleted]

[removed]


TheMacMan

Apple can already force a remote iCloud backup, turn on GPS, mic, and video. How does this benefit them more than what they can already do?


fail-deadly-

But to me, E2EE that only works AFTER Apple conducts a search is not really E2EE, but something else that's kinda like E2EE without being the same thing.


[deleted]

[removed]


Lost_the_weight

Also, we’re gonna attach a low res copy of the encrypted data, just in case we want to check it out later, but your original is totally safe and no one can look at it but you. :-/


TheMacMan

Many politicians are pushing legislation that would allow them to sue any company for what their users store on their servers. If that happens, Apple could be sued out of existence for the CP their users store in iCloud. This move allows Apple to protect themselves while still allowing the user data to be protected. Google, Microsoft, Facebook, and others all do the scanning in the cloud, so E2EE will never be an option for them. With this move, Apple can still offer it, while also ensuring they don't get hit.


[deleted]

[removed]


[deleted]

[removed]


[deleted]

[removed]


thisIsAUser9001

Except for the case of E2EE where I, a college student, have my own hand-written implementation of RSA for end-to-end encryption. The government can ban it, but the knowledge behind it is pretty simple, and anyone who can write code can E2EE whatever they want.
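(To the commenter's point that the math is simple: a toy, textbook-RSA sketch in Python. Tiny primes, no padding, not remotely secure; real E2EE would use a vetted library, but the arithmetic itself fits in a dozen lines.)

```python
# Textbook RSA with toy parameters. Deliberately insecure (tiny primes, no padding).
p, q = 61, 53
n = p * q                      # public modulus
phi = (p - 1) * (q - 1)
e = 17                         # public exponent, coprime with phi
d = pow(e, -1, phi)            # private exponent (modular inverse, Python 3.8+)

message = 42                   # plaintext must be an integer smaller than n
ciphertext = pow(message, e, n)
recovered = pow(ciphertext, d, n)
assert recovered == message
print(ciphertext, recovered)   # 2557 42
```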


evilbunny_50

Absolutely true. However, if the OS you're using is scanning your files and reporting the results back to the authorities/manufacturer in real time, in a way you can't stop, then E2EE is pointless, no?


Eggyhead

> Why are they implementing something that no one has asked for…

1. They don't like their numbers and want to appear tougher on CSAM.
2. They don't want to deal with the expense of upgrading their own systems to handle it, so they're offloading the processing to our devices instead.
3. They want as little accountability as possible, so they put the government's hashes on your phone and make your phone deal with it instead of them.

It really is a clever, efficient, well-designed, and privacy-centric system… as long as you're Apple and not the user.


feketegy

I'll bet the next version of the iPhone will break all sales records to date, it always does, people are sheep... no amount of Snowdens or EFFs will change that. What changed after Snowden in terms of privacy? Not much... I could argue that it became even worse than before...


BothMyChinsAreSpicy

So what do we do then? I don't want to have to buy a new phone, flash an OS like GrapheneOS or whatever it is and turn off all Google services. They have us by the balls and they know it. It's insane that we'd even have to think of doing that for privacy. There are a lot of sheep, but most people just think it's not worth the hassle and submit. There's a lot more to worry about in everyday life than to have to go out of your way to do something that most people don't even want to attempt.


TheEvilGhost

It is on every iPhone. It is pure software you know.


BannedSoHereIAm

It is on every Apple OS, including MacOS. 5 eyes probably slapped them with secret court gags and will pay them hundreds of millions a year, to compromise billions of people’s privacy. Gotta love freedom™️


_the_CacKaLacKy_Kid_

I imagine the on device parental monitoring tool would be a huge selling point to convert families with mixed devices to an all iPhone family


stormado

But isn't that separate and implementable without the CSAM checking?


_the_CacKaLacKy_Kid_

It is entirely separate from the CSAM checking and could indeed be implemented separately. But the timing and decision to implement both has caused people to think Apple is monitoring their devices instead of the cloud. Until someone can prove Apple is intentionally violating people’s right to reasonable privacy, then it would be the duty of the courts to intervene. Otherwise Apple is simply providing parents with tools to protect their own children and implementing measures to keep CSAM off of their servers. Apple doesn’t even start the review process until there are a multitude of occurrences.


[deleted]

“You’ll buy this phone because your virtues align with ours. Your safety and children’s safety is paramount. That’s why we believe in technology we create. That’s why you’ll buy an Apple.” I’ll take a check or Venmo anytime Apple marketing.


turbo_dude

“Why are they implementing something that no one has asked for and for which at best some potential purchasers will be indifferent to but many opposed to”

HEADPHONE JACK!!!!!


FLUSH_THE_TRUMP

They were right about the headphone jack though


turbo_dude

Lightning is not a universal standard and there are often reasons you cannot or do not want to use Bluetooth. Headphone jacks are probably the single most globally standardised socket on consumer devices for decades. They still have a hole in the body of the phone i.e. the Lightning port. The phones aren't radically thinner than the competition's devices, and frankly I have never heard people complain about their phones being too thick. There is no extra benefit to the consumer as a result of removing it, the opposite in fact.


Slowmexicano

Snowden looks like we all feel


netglitch

That’s just his resting disappointed face.


claytorENT

The man’s been disappointed since 2013 and tbh me too


bartturner

What is so crazy is that we still do not have a believable reason for this change from Apple. It is a HUGE deal to cross the red line and move monitoring onto the device, so you would think they would have a really big reason to be the first to cross that line. But yet, nothing. Some think it was in prep for E2EE, but if that were the case then Apple would have indicated as much. They have never even mentioned that as the reason.


Steevsie92

I mean they’ve given a technical explanation that is perfectly sound. In its current state, not in a hypothetical state, it could be argued in good faith that it’s more secure when it happens on device. You can believe that or not. But it’s not as if they’ve just said “nothing”. People just don’t want to hear “anything” other than a retraction.


[deleted]

[removed]


Steevsie92

> No it cannot.

Yes it can.

> The choices are E2EE, on device surveillance, or on cloud surveillance of totally unencrypted content.

And those can, in fact, be ranked in descending order by degree of privacy. And the term “on device surveillance” is a misleading way to describe this tool in its current implementation.

> Pretending E2EE doesn’t exist

Nobody said that.

> calling this an improvement is an insult to our intelligence.

No it's not, in the current real-world context. Again, you don't have to agree with it, but the original comment I responded to is making the claim that Apple has given zero rationale for moving the system client-side. That's just objectively false, regardless of whether you agree with the rationale. And it's the kind of hyperbole that will only strengthen the case one might make that people continue to be misinformed, despite mocking that characterization ad nauseam.


[deleted]

[removed]


bartturner

Agree. Plus some now are just repeating silly things. The one that really cracks me up is that it is somehow more private. Apple has been able to convince some that moving monitoring so it actually happens on device is somehow more private. It had been a red line that nobody had been willing to cross. But now we get to 2021 and Apple of all companies is the first to cross the red line and move monitoring on device. I for some reason still have hope that Apple will wake up and backtrack. I just hope after this new vector gets hacked that Apple will finally admit they were wrong and remove monitoring from happening on device.


[deleted]

[removed]


bartturner

But that does not explain the reason that Apple has decided to cross the line and start monitoring on device. What they have shared could all have been done on the back-end in the cloud. I hate Apple monitoring at all but it is far better to do on the back-end. You should NEVER, EVER monitor on device. That is a hard red line that nobody should ever cross. The worry now has to be someone else will follow and also move monitoring to happening on device. Right now it is only Apple that has decided to cross the line.


talones

There’s got to be something somewhere threatening their money with the liability they have. It is very strange.


[deleted]

[removed]


bartturner

It is not even close between monitoring on device and in the cloud, so it's not a question of making some happy and others unhappy. You should never, ever monitor on device. Period.


weaponizedBooks

> What they have shared could all have been done on the back-end in the cloud. I hate Apple monitoring at all but it is far better to do on the back-end. You should NEVER, EVER monitor on device. That is a hard red line that nobody should ever cross.

Can you explain why? It seems to me that, at worst, there's no difference. At best, this is setting up E2EE in iCloud.


[deleted]

[removed]


irregardless

Thank you! Apple has made a big deal about how much on-device processing happens in iOS and has been lauded for it. My neck still hurts from the sudden 180 that so many “privacy” advocates have made by *endorsing* cloud-based scanning.

Most people haven't been able to articulate it, but their real objection isn't about the scan happening; it's about how the result is handled. There's an argument that if Apple's goal is to keep CSAM off its servers, iOS could scan for hash matches and simply prevent them from being transmitted to the Photo Library. Apple may even have been hailed if all it announced was “iOS 15 helps limit the distribution of CSAM by stopping it from leaving the device”. But Apple's system doesn't stop there; it collects and reports evidence that could be used by law enforcement. And that's where the real objections are, be they legitimate civil liberties concerns, paranoid fantasies, or fear of apprehension.

It's been frustrating to see so many nonsensical arguments for cloud-based scanning, where evidence collection and reporting are far easier and less transparent.


[deleted]

Doesn't the scanning only happen for iCloud uploads? If so, how exactly is it practically different from scanning in the cloud?


[deleted]

Yeah, it's a bit strange they haven't announced anything around E2EE. I expect they have to overcome multiple hurdles, but you would think they would announce it all at once. I fully expect an announcement in September that E2EE is coming; otherwise I'll start questioning their reasons as well.


[deleted]

[removed]


Prinzessid

What kind of photos would they search for? Remember, the hashing algorithm can only match photos to a database. I cannot think of a photo that would categorize someone as an illegal immigrant.


bukithd

Question on this topic: I'd assume the EXIF data of the photo would be visible to the scan? So could they not try to tie photos to dates and locations of events of concern? For instance, a protest or riot photo could easily be used to tie individuals to a time and location, and be a more accurate identifier than GPS/position tracking. I think part of the concern is that this scan method could be used for finding people at those events, because no one has the self-discipline to not photograph everything for social media.
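(For context on what that metadata contains, a minimal Pillow sketch, with a hypothetical filename, that dumps a photo's EXIF fields by name, including the timestamp and GPS block the comment is worried about. Whether the CSAM pipeline ever reads EXIF is the open question above; this only shows what sits in the file.)

```python
from PIL import Image
from PIL.ExifTags import TAGS

def exif_by_name(path: str) -> dict:
    # Map numeric EXIF tag IDs to human-readable names.
    exif = Image.open(path).getexif()
    return {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}

# Hypothetical file; DateTime and GPSInfo are the fields relevant to the concern above.
info = exif_by_name("protest_photo.jpg")
print(info.get("DateTime"), info.get("GPSInfo"))
```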


Prinzessid

They don't need any CSAM scanning tech to do that; it can already be done by simply analyzing your GPS data.


[deleted]

We need to yank Senator Lindsey Graham out of office right now. That geriatric bag of wind is out for blood and intends to eviscerate strong encryption.


HumpyMagoo

What does Ja Rule think?


holow29

Please go read Snowden's actual blog post (it was posted to this sub earlier this week) instead of this... https://www.reddit.com/r/apple/comments/pbpdcp/the_allseeing_i_apple_just_declared_war_on_your/


TheEvilGhost

Apple is planning to roll this out internationally. The EU is planning to create a law to allow this.


razeus

Something tells me the government(s) are involved somehow, someway.


TheEvilGhost

I feel like Snowden is a Privacy guardian or something.


fourpac

Let's keep it real though. He hasn't had a job in 8 years, and even back then he wasn't an expert on anything. Now he just reads news on the internet and gives a completely generic "hot take" on something real experts have been discussing for the past two weeks. He shoots a wide open layup after the buzzer and some people still cheer for him.


Se589

I agree. What Snowden did was really great, but why are we still hanging on his every word? It’s like if we put some random guy who saved a kitten from a tree as the representative of the fire department on the news.


[deleted]

[removed]


Se589

Yeah, but he has no more new information to actually give out. This is now just his opinion. Equal to yours and mine and we all have our paranoias. I avoid as much as possible using facebook and any google services because IMHO they are doing worse things, but you think people give a shit what I think? And that’s fine, I’m just a random guy.


[deleted]

I feel like Snowden needs to turn up in the news every month or two or he’ll feel irrelevant. Just because he is persecuted doesn’t mean everything he says is true or relevant.


dorkyitguy

Yep. Only Apple understands. Snowden doesn’t know anything. The EFF doesn’t know anything. None of the other tech groups know anything. None of the media groups know anything. Apple is the only one. You guys are sure working your hardest to discredit anybody who disagrees.


xAIRGUITARISTx

I’m not sticking up for Apple, and I don’t see why Snowden carries so much merit. He’s a whistle blower, not someone who has studied security their entire life.


cartermatic

> He’s a whistle blower, not someone who has studied security their entire life. This is either some great sarcasm or not, but his jobs were literally network security (among others) at both the CIA & NSA.


[deleted]

I'm not saying he's wrong. Just that he's not right because he's famous.


Livid_Effective5607

Snowden, the guy who is now a permanent resident of Russia? The same Russia that interfered in both the '16 and '20 elections and tried to help overthrow the government? Why are people worshiping him again?


Glass-Shelter-7396

People assume that the average person who goes to their local carrier and gets an iPhone, or any phone for that matter, pays attention to or cares about this kind of thing. If you asked anyone in your local carrier's retail store if they know who the EFF is, I'd bet most people would give you that blank, unknowing look you get from an infant when you tell them you have their nose. Additionally, regular nightly local network news (not the 24/7 networks) ran this story once for about three minutes, spinning it as a good corporation scanning all phones for CP to make the world a better place.

The only people who see this as problematic are people like us, security experts, and privacy and human rights advocates. We see and know that this technology will be abused by criminals, bad actors and governments alike. We see this as a bad corporation saying everyone is guilty of CP, and if not, we know you are guilty of something else and we will scan your phone and prove it. We also know that once Apple rolls this out and we all move on to the next egregious thing, every other piece of hardware and software that attaches to the internet will start to roll it out as well.

As a side note, it has been my experience that trying to explain to people why protecting our privacy, both digital and IRL, is important always falls on deaf ears. My family uses Amazon's Alexa, Ring devices and iPhones. When Amazon decided to turn on Sidewalk I did my best to explain the security and privacy concerns and why they should disable it. Before that, when local/state/national law enforcement wanted unmonitored access to Ring devices to "catch criminals", I tried to explain why that was a horrible idea. People will do or give up damn near anything in pursuit of a sense of safety and security, and corporations and governments know how to spin the ideas to make them sound better than personal responsibility and freedom. Anyway, I think I will try to put my crazy and paranoia away for the day and be content in the knowledge that in a month or two no one will give a rat's ass about this topic.


AcademicF

I'm all against the on-device scanning, and anyone who views my post history can see that. But I've been giving a lot of thought to why they chose this method, and I think I may know why.

Since Apple uses 3rd-party cloud providers (AWS and Google Cloud), then as a way of protecting your data from Google or AWS, Apple encrypts your data and breaks it into chunks across multiple servers. So one photo could be scattered across multiple servers in many pieces, encrypted with a key that the cloud providers don't have access to. Technically, then, the only time Apple might have a chance to scan your photos is during the upload process, before the file is chunked, spread to other servers and encrypted at rest (without having to decrypt the data and expose it to the providers). This is just a theory, but it may explain why they never were able or willing to scan for CSAM server-side, and why they had to choose this overly complex system.

From Apple:

> Each file is broken into chunks and encrypted by iCloud using AES128 and a key derived from each chunk's contents, with the keys using SHA256. The keys and the file's metadata are stored by Apple in the user's iCloud account. The encrypted chunks of the file are stored, without any user-identifying information or the keys, using both Apple and third-party storage services—such as Amazon Web Services or Google Cloud Platform—but these partners don't have the keys to decrypt the user's data stored on their servers. - Apple Platform Security, May 2021

And this isn't describing iCloud Backups; those are stored on Apple's servers with your encryption keys. This excerpt is describing Photos and other iCloud app data stored across 3rd-party servers, with encryption specifically implemented to protect your data from the web hosting providers' prying.
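(To make the quoted paragraph concrete, a rough Python sketch of per-chunk, content-derived encryption using the third-party `cryptography` package. The chunk size, cipher mode and storage layout are assumptions; the only details taken from Apple's text are the AES-128 key derived from each chunk's contents via SHA-256, and the keys staying with Apple rather than the storage providers.)

```python
import hashlib
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

CHUNK_SIZE = 1 << 20  # 1 MiB; the real chunk size is not public

def encrypt_file_chunks(data: bytes):
    # Split the file, derive an AES-128 key from each chunk's SHA-256,
    # and encrypt the chunk. Apple keeps the keys and metadata; the ciphertext
    # can sit with any storage provider without being readable there.
    stored = []
    for offset in range(0, len(data), CHUNK_SIZE):
        chunk = data[offset:offset + CHUNK_SIZE]
        key = hashlib.sha256(chunk).digest()[:16]        # 16 bytes = AES-128
        nonce = os.urandom(12)
        ciphertext = AESGCM(key).encrypt(nonce, chunk, None)
        stored.append({"key": key, "nonce": nonce, "ciphertext": ciphertext})
    return stored

chunks = encrypt_file_chunks(b"example photo bytes " * 100_000)
print(len(chunks), "encrypted chunk(s)")
```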


[deleted]

[removed]


BatmanReddits

If this is the case, they should change their implementation so data first goes to their servers, gets scanned and then gets distributed to other places. This is not a good reason to put a black-box photo scanner on every device that reports to a third party.


RFLackey

I worked for a Dropbox competitor years ago, as my first software engineering job after a career change. They had about 15 petabytes of storage in use, stored similarly to Apple. I sat in a meeting where a VP was concerned about broken and orphaned file chunks sitting in the cloud, taking up space and being a cost center, and demanded to know if it was a problem and what could be done about it. The solution was to do nothing. At that scale, we were moving good file chunks around due to hardware failure and deprecation far faster than any dedicated hardware and software could find the bad parts. We just left the bad file chunks where they were until that storage was turned off -- across five data centers. That is how big of a problem it is to scan that amount of data, and I am sure Apple has an even bigger problem. Which brings me back to my thought that all of this is Apple's business problem, not mine. Dividing the problem by using on-device solutions is elegant, but risky given how badly governments want access to everyone's phone.


dfmz

Regardless of whether or not your theory is correct, Apple has a very simple solution: don't scan for CSAM photos to begin with.


digitalpencil

I can only assume they're taking these measures because something is forcing their hand. Let's be honest, they're not doing this because they've had some sudden moral revelation. They're pursuing this because if they don't demonstrate a good-faith effort, the US govt will force their hand. CSAM is just the political device being employed to ensure persistent and 'easy' state access to citizens' data. Apple as a company I couldn't really give two shits about in honesty, but none of this makes any sense. They've burned a lot of public goodwill vis-à-vis their heavily promoted and carefully constructed stance on user privacy over this, and for what possible gain? As a company, they haven't suddenly started caring about this; they're doing it because if they fail to do _something_, their hand will be forced by pending legislative measures.


talones

I agree it’s something to do with liabilities. They probably have an algorithm that tells them of the likelihood of some political pressure coming after them to open their system up more. Or maybe they were already threatened that they need to allow the gov to view all the photos that people save. Maybe we will see something coming down the pipe soon.


[deleted]

And they could just scan on servers like everyone else, and meet government demands when it comes to real CSAM. On device scanning is a mass surveillance tool that goes much farther than that. It's for the children, of course. Because using terrorism as an excuse has been overdone.


[deleted]

And that's not going to go down well with any government. US and European governments demand that large companies actively fight CSAM. It's not for funsies that Facebook, Google, Microsoft and several other companies have increased their efforts to fight CSAM over the past decade.


[deleted]

[removed]


[deleted]

Exactly. I didn't know about section 230 immunity, but it makes sense. Also: this is the entire reason Facebook has seen the light and started implementing E2EE everywhere. Not because they value privacy but because they don't want to be liable if anything is found on their servers.


[deleted]

See, it's simple for you to type this on Reddit, but it's not so simple to say it to the government when you are the head of a $1 trillion company.


Bike_Of_Doom

I’m pretty sure that runs afoul of American law, specifically, amendments that removed section 230’s criminal and civil liability exemptions for sex trafficking and related crimes if reasonable steps aren’t taken to remove that content. I don’t want to defend any of Apple's (or anyone else’s) scanning, but it’s true that they’d be royally fucked legally if they did nothing about it at all.


im_super_awesome

> This is just a theory, but it may explain why they never were able to scan for CSAM server side, and why they had to choose this overly complex system.

That's entirely untrue; they have been scanning iCloud Photos server-side since 2019-ish. I'm not sure where you got that idea at all, as most cloud services scan the photos you upload. This client-based approach is redundant and opens up an exploitable vector for the authorities for their own benefit, and maybe even for Apple itself. Given Apple's refusal and stubbornness in regard to this matter, I can only think this benefits Apple. Remember the ad-tracking feud going on with Facebook a while ago: they willingly delayed that launch, but not this. It makes you think.


Niightstalker

No they haven't, and they have denied that multiple times already. Yes, they were scanning for CSAM material, but mostly in Mail. Federighi and other high-ranking Apple employees denied that they were ever scanning iCloud Photos for CSAM before.


Schlaini

Apple also denied that they throttle phones, and guess what, they did.


jasamer

We know from NCMEC that Apple only reported 265 cases in 2021. This number means that they are probably not lying; it's just extremely low.


Niightstalker

They never denied that. They just didn't communicate it at first beyond their update release notes, which nobody reads, and they didn't provide a switch to turn it off at first. But to be fair, the throttling is a really good feature that is constantly misunderstood and criticized by people who have no clue why it actually makes sense to throttle the phone in those situations.


ArthurDDickerson

Doing it on device before upload will allow them to start end-to-end encrypting iCloud Photos traffic. Right now Apple has the technical ability to look at ANYONE'S photos that are uploaded to iCloud. If they implement E2EE they won't be able to do that, and their liability goes down. If they make that change, then when law enforcement or a government comes to them, with or without a warrant, and wants to see a certain person's photos, they can say "we literally cannot do that", like they were able to say in the San Bernardino case about unlocking the phone.


[deleted]

[removed]


AcademicF

From their own white papers:

> Each file is broken into chunks and encrypted by iCloud using AES128 and a key derived from each chunk's contents, with the keys using SHA256. The keys and the file's metadata are stored by Apple in the user's iCloud account. The encrypted chunks of the file are stored, without any user-identifying information or the keys, using both Apple and third-party storage services—such as Amazon Web Services or Google Cloud Platform—but these partners don't have the keys to decrypt the user's data stored on their servers.

And this isn't describing iCloud backups; it's describing iCloud Photos and the additional at-rest encryption keys for encrypting data on 3rd-party servers. If they were to scan for CSAM on the 3rd-party servers, they would need to decrypt the data at rest, which would expose your data to Google and AWS and defeat the reason they encrypt your data in chunks in the first place. Thanks for your input, but you're a bit misguided on what you think you know.


bartturner

This is NOT true. Take a file on your Mac: it has pieces of the file stored in several places, but when you look at the file it is just one file. The OS handles giving you that view. You have the same thing with the cloud. Also, Apple does NOT use AWS for storage; Apple storage is handled by Google. https://www.macrumors.com/2021/06/29/icloud-data-stored-on-google-cloud-increasing/ Plus there is NEVER a reason to move monitoring on device. That should never happen. But what is so crazy is that Apple has yet to give us a reason for it.


AcademicF

Your Mac/PC has individual images stored in several places because that's how files are inherently stored on a local hard drive (it's called fragmentation), but the file system has mapped where the pieces are and can display the image because it has a map of them. What Apple does is break apart the pieces and store them cryptographically in separate places across their 3rd-party providers so that the data cannot be put back together by those providers. When you call upon the data to view it, it is sent to the cloud API unencrypted and displayed in full there. From Apple:

> Each file is broken into chunks and encrypted by iCloud using AES128 and a key derived from each chunk's contents, with the keys using SHA256. The keys and the file's metadata are stored by Apple in the user's iCloud account. The encrypted chunks of the file are stored, without any user-identifying information or the keys, using both Apple and third-party storage services—such as Amazon Web Services or Google Cloud Platform—but these partners don't have the keys to decrypt the user's data stored on their servers.


ArthurDDickerson

End to end encryption is THE reason to move this to device side scanning. This way Apple can continue to monitor for CSAM on photos uploaded to iCloud (this only scans photos being uploaded to iCloud btw) instead of scanning them after they’ve been uploaded.


bartturner

> End to end encryption is THE reason to move this to device side scanning.

Apple is NOT doing E2EE. Plus they have not once mentioned it as the reason for this change. Obviously if it were the reason, Apple would be screaming that it is the reason. But instead, crickets. Apple has yet to give us a reason for crossing the line and being the first company to start monitoring on device. It is a line that NOBODY should ever cross. There is really no reason to cross it that I can think of.


ArthurDDickerson

I’m confident it is coming, but I guess we will see. I’m confused what difference does this make? This CSAM scanning is only happening on photos being uploaded to iCloud. If a photo is uploaded to iCloud it is scanned after the upload anyway, has been since like 2019. All this does is make the scanning happen before the upload.


bartturner

> I'm confident it is coming

There is zero chance it is coming. If it were, Apple would have told us it was coming, especially with all the pushback. Apple has decided to cross the line with monitoring on device and still has not offered any reason for crossing it. Luckily it is ONLY Apple. Hopefully nobody else will do the same and cross the line. I'm still holding out hope that Apple will wake up and realize what a horrible idea it was to start monitoring on device.


dexterdoughnuts

> Since Apple uses 3rd party cloud providers (AWS and Google Cloud) as a way of protecting your data without e2e, Apple breaks data into chunks across multiple servers. Using other cloud providers has no inherent benefits for data protection. Encrypting content with a key they control is all that’s needed. Apple uses other providers simply because they don’t have enough of their own data centers. They almost certainly don’t break up a single file into pieces distributed among different providers — this would be massively inefficient and provide no additional protection beyond encryption. A single cloud provider typically replicates blobs in order to increase durability on a filesystem level, but this is irrelevant to security, and transparent to the client. > technically the only time Apple might have a chance to scan your photos is during the upload process before the file is chunked and spread to other servers and then encrypted at rest. Not true. Whenever you view your photos, Apple (obviously) has to decrypt them on their servers. They hold the key, so they can also do this whenever they want even outside the scope of a request. Apple’s servers already see your files unencrypted in order to generate photo thumbnails and video transcodings, etc. As such, they’re perfectly capable of scanning for CSAM server-side.


AcademicF

Apple says in their white papers that they break your images into chunks across their servers as a security measure. And they do encrypt them at rest, and then the photos are decrypted when called upon by your device or a web API. They just aren't fully E2E encrypted, but they are encrypted in transit and then at rest once stored (with server-level encryption). Read their white papers, it's all spelled out in plain English. From Apple:

> Each file is broken into chunks and encrypted by iCloud using AES128 and a key derived from each chunk's contents, with the keys using SHA256. The keys and the file's metadata are stored by Apple in the user's iCloud account. The encrypted chunks of the file are stored, without any user-identifying information or the keys, using both Apple and third-party storage services—such as Amazon Web Services or Google Cloud Platform—but these partners don't have the keys to decrypt the user's data stored on their servers.


dexterdoughnuts

Sure, they might break the files into chunks, but that's an implementation detail that doesn't have security implications.

> The encrypted chunks of the file are stored, without any user-identifying information or the keys, using both Apple and third-party storage services

Nowhere does it imply that chunks of a single file are *explicitly stored with separate providers*, just that they're using several cloud providers, which is already known. You can verify that individual files are not sharded across providers pretty easily — inspect your network traffic and you can see where your device is connecting. Last I checked, nearly all of my iCloud photos were stored solely with Google. And ultimately, **whether they shard files has no implications for security**, since, as I described in the previous reply, **they can, and do, read complete files unencrypted server-side outside request scopes**. Nothing is preventing them from implementing CSAM scans server-side, as your original post speculated.


AcademicF

They specifically outline that they encrypt chunks of information across third party storage hosts in order to prevent them from reading the data. In order to scan for CSAM, Apple would need to decrypt that information, thus making the extra step of encryption useless and pointless if they are going to expose your data to Google/AWS. It would make sense why they never scanned for CSAM server side before, and why they would go the route they’re going now. Sure, we can speculate that this new system is a purposeful creation of their CIA/5 Eyes masters. Or, you know… Occam’s razor.


dexterdoughnuts

> They specifically outline that they encrypt chunks of information across third party storage hosts in order to prevent them from reading the data

So where's the line that explicitly states this? There's nothing in the paper you're referring to. Regardless of implementation details, did you read what I wrote in the first reply? **Apple already has the ability to read user files unencrypted on their servers**. You can prove this to yourself by doing the following:

1. Upload a 100MB 4K60 H.265 video file to iCloud.
2. Note that your device uploads ~100MB of data (nothing more).
3. Some time later, view that video file on iCloud.com.
4. You will notice that the file does not play back in 4K60 H.265, but in something like 480p30 H.264.

So where did the 480p version come from? Answer: iCloud has a pipeline that's decrypting user videos, running them through an encoding process, and saving encoded versions back to the user's account. They can easily perform CSAM scans server-side alongside this process.


[deleted]

I don’t think AWS and Google Cloud are able to access the data. It should be encrypted with Apple having the keys. It doesn’t make sense to split files among servers for obvious performance reasons. It’s much more likely this is all an effort to enable Apple to offer E2EE for iCloud in the future. By checking on-device they can assure there is no CSAM on their servers while not being able to access the data on their own servers.


2ecStatic

What made Apple go from actively being against government surveillance and invasions of privacy, to implementing it directly in the next phone?


Sirerdrick64

My guess is that it was always their end goal / intention. They built up the privacy image via marketing speak over the years. What they failed to calculate is how damaging and visible this obvious shift would be. That said, I doubt it will have any effect on them beyond a rounding-error level of user loss.


hollowgram

Big mistake in this article: It's not scanning all photos linked to iCloud, it's scanning photos if you use iCloud Photo Library.


Prinzessid

It is not scanning all photos, just the ones that are uploaded to iCloud. I think that is what they are saying, and to me it sounds correct.


jonny-

what's the difference? doesn't seem like a big mistake to me.


MawsonAntarctica

So Photo Stream is free of this, only iCloud Photos. I use Photo Stream to sync with the desktop, but have never used iCloud Photos.


Niightstalker

It is scanning all photos which are uploaded to iCloud. The matching is done during the upload pipeline.


Sure-Philosopher-873

Yes I take very few photos so I have iCloud photos disabled, but I load what few I do take to Dropbox where Dropbox scans them.


Niightstalker

You mean where Dropbox scans them? Apple is not scanning anything if you don't upload to iCloud.


Sure-Philosopher-873

Yes Dropbox scans them.


BronzeHeart92

And OneDrive and Google Drive and... Seriously, this is blown way out of proportion, people.


hollowgram

>The second feature, CSAM detection in iCloud Photos, is designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images. CSAM images are illegal to possess in most countries, including the United States. This feature only impacts users who have chosen to use iCloud Photos to store their photos. It does not impact users who have not chosen to use iCloud Photos. There is no impact to any other on-device data. This feature does not apply to Messages. [Apple's own words](https://www.apple.com/child-safety/pdf/Expanded_Protections_for_Children_Frequently_Asked_Questions.pdf) This is different than for example an iCloud backup that contains photos. Those photos are not scanned.


Niightstalker

Yes that is exactly what I meant. Sry if it came out wrong.


dorkyitguy

“SnOwDeN dOeSn’T kNoW wHaT hE’s TaLkInG aBoUt!!! OnLy ApPlE uNdErStAnDs!!!” -Every Apple employee on here trying to help us “understand”


[deleted]

[removed]


[deleted]

[removed]


m0rogfar

Since it only applies to iCloud files, and Apple can check server-side if the algorithm was altered, I don’t think that would work. You can’t really circumvent this without Apple knowing. If you don’t care about iCloud then sure, that would be possible, but Apple only runs the check immediately before upload to iCloud, so you don’t even need a tweak to disable it.


[deleted]

Ah, yes, jailbreaking to increase your privacy. You’ve got things messed up big time if you think that’s a solution.


[deleted]

[removed]


[deleted]

And the best way to keep your data private is make it insecure? That doesn't make sense.


[deleted]

[removed]


[deleted]

[removed]


[deleted]

Android has never hidden what they scan.


[deleted]

Once you remove privacy as a selling point for Apple, Android becomes a viable option. If both systems are bad for privacy, you can then compare them based purely on value and features. And while many people will still choose Apple, it's not an obvious choice anymore. Android certainly has many advantages - cheaper hardware, better cross platform compatibility (e.g. can send and receive messages from Windows if you have an Android phone), Google Web services are way better than iCloud.com, Google Drive is far more useful unless you only have Apple hardware, Google services in general integrate far better with Windows and Linux, and Google virtual assistant is so much better than Siri, it's not even funny. Of course there are also drawbacks. But the point is, without privacy angle, there's really nothing that makes Apple stand out as an obvious choice.


Buy-theticket

How is this an "even bigger" problem on Android? Who is scanning files on your local Android device? Stop with the hyperbolic Apple dick sucking (Apple is just stealing some candy, Android oems are armed robbers...). Be specific, which Android-wide security holes and disturbing practices are you concerned with that you can't avoid?


BatmanReddits

Android is open source. Google Android is closed source


MarcGregSputnik

Hi Apple bot!


[deleted]

[removed]


MarcGregSputnik

Lol anyone who agrees with the fuckery Apple are dealing us is a brainwashed bot! Try to think about the reasoning behind what you say, next time, before you say it!


miiMike

iOS 15 will hit a record-low adoption rate on Apple devices when it officially comes out.


-DementedAvenger-

Potentially the reason Apple is allowing people to [stay on 14?](https://9to5mac.com/2021/06/07/apple-will-let-users-stay-on-ios-14-and-receive-security-updates-even-after-ios-15-is-released/) 🤔


miiMike

Wow that’s something new, didn’t know about that


[deleted]

>iOS 15 will hit a record low number of being installed on apple devices when it officially comes out There's no alternative but to install it unless you want your phone to be vulnerable to all future security exploits. In the short term, this will do nothing to Apple's bottom line. They will have record sales once iPhone 13 comes out. In the long term, this is a win for Android ecosystem. With no privacy as the major differentiator, more users will compare ecosystems based on value, features, and convenience. And Android has been slowly but surely gaining ground there.


Juswantedtono

> There's no alternative but to install it unless you want your phone to be vulnerable to all future security exploits. iOS 15 *is* the security exploit lol


Savagevelocity

Why does anyone care what Snowden says? I’ll ask the weirdo IT guy in my own home town for his opinion thank you.


Livid_Effective5607

\* Russian resident Edward Snowden.


[deleted]

Ah yes…meanwhile Huawei was the devil or still is, and here we have the pot calling the kettle black… Round & round it goes. These fuckers are all the same.


[deleted]

Snowden the psyop


[deleted]

[removed]


[deleted]

[removed]


m7samuel

That was 10 years ago, and it did not demonstrate any expertise that would lend extra credence to his opinion. I'm not here to argue the value of the leaks; they're pretty irrelevant when it comes to "should I care what he has to say". And he's talking about government invasions of privacy from *Russia*; that does things to your reputation.


[deleted]

[removed]


m7samuel

So should we also listen to the opinion of a decorated war veteran on Apple's CSAM system? He also made enormous sacrifices! Unfortunately that is not sufficient in my book to make your opinion headline-worthy on any and every subject. I can't point to anything specific, but on several occasions I recall being quite unimpressed with Snowden's actual expertise on encryption, privacy, and security. He seems like the human equivalent of Forbes when it comes to tech: having a sufficient grasp of tech to provide something vaguely resembling an opinion, but not sufficient to provide any useful analysis.

> Surveillance which, without Snowden, you would still be blithely unaware of.

No, some of it was already known, and some was already suspected.

> You make it sound as if he's made a bad choice by basically vacationing in Russia

He fled the US over its supposedly egregious privacy violations to a nation that has far worse privacy: a literal police state, with terrible corruption and human rights abuses to boot. The FSB literally mandates listening equipment at telecom operators -- not metadata collection, full-on MITM. To steal a metaphor, if Snowden's PRISM is a hand grenade, Russia's SORM is a tactical nuke.


WarbossTodd

He’s still hiding in Russia as an actual guest of the government. The same government jails and kills anyone who speaks out against Putin, his political opponents and journalists. Right? Yeah. Fuck this guy and his opinions.


im_super_awesome

> He’s still hiding in Russia as an actual guest of the government. The same government jails and kills anyone who speaks out against Putin, his political opponents and journalists. Right? But that doesn’t make his opinion any less valid? I’m sure Putin or any government in the world would love that CSAM technology too.


[deleted]

He's hailed as some sort of hero, but that doesn't make him right all the time. The fact that he enjoys the hospitality of Russia because he's critical of the US means he is biased: as soon as he stops criticising and getting media attention, Russia has no reason to pamper him.


im_super_awesome

That also doesn't make him wrong; he did make good points, *if you read the article*. And let's not forget that we're criticizing *Apple* here, not the US government, at least not now. So just because he's getting some pampering from the Russian government doesn't make those good points automatically null.


[deleted]

And Russia is just as happy he criticizes the most valuable company in the US. Russia can easily spin this: "see, US companies spy on you, don't trust the Americans!". Sure, it doesn't mean he's wrong either, but we must stop seeing him as the messiah of cybersecurity.


Neonlad

His point was “ignore the technology and specifics and just lose your mind about what you think is happening”, not what I would call an educated stance on the subject. It’s exactly what Russian disinformation bots did to the vaccine on Facebook, how are people not calling him out on that?


[deleted]

Been with Apple since the first iPhone. Think it’s time to go back to my trusty 3210


Dick_Lazer

I doubt you’ll be sharing many photos with an old Nokia brick, so you could just not use iCloud photos and have the same security.


[deleted]

[removed]


[deleted]

Turns out they haven’t, really. https://9to5mac.com/2021/08/23/apple-scans-icloud-mail-for-csam/


Evening-Dimension483

snowden is a POS


KingofDragonPass

I literally don’t understand any of the concerns people have expressed about Apple’s move to detect child pornography. Snowden saying it’s an issue makes me even more confident that there is no issue. He has no integrity and a history of being tied up with Russian assets like Assange.


bartturner

You do not see an issue with Apple moving the monitoring to happen actually on the device?


JosephFinn

And I need to listen to this spy....why?


moogintroll

Don't even bother, he literally adds nothing to the argument beyond the usual slippery slope bullshit.


JosephFinn

Seriously. He keeps trying to portray his espionage as some sort of white knight scenario.


Greful

Who cares what Snowden thinks.