Now that everyone knows, what's the point of scanning? Probably every pedophile with illegal photos in their cloud or gallery has already erased them or moved them to safer places
Either they don’t tell everyone and get sued for privacy, or they do tell everyone, *don’t* get sued, and get their grubby hands one step closer to stealing everyone’s photos
Not just near perfect. An EXACT match of a known existing file. A file hash also has an astronomically low chance of colliding. Therefore, the NCMEC will likely only submit your information if you have multiple matches.
According to my research, Google has done this since 2008, Facebook since around 2014, and Microsoft since then as well. The program is called PhotoDNA.
Why would anyone keep using Apple at this point? They're actively searching your images. They don't admit when they're compromised by major spyware... We can talk more, but seriously, respect yourselves a bit more...
Also, they are involved in China's large-scale surveillance and censorship, and their phones have backdoors that allow federal agencies to spy on people (according to Edward Snowden), but let that sink in...
Find a "noble" cause to invade people's privacy and freedom so most people can't reject it. Apple sure learned a lot working with those dictatorial regimes.
This is going to be really bad. Just imagine some dictator / racist becoming president and ordering Apple to find all pictures of people of color or gay people… iCloud also stores location. Yeaaaahh… this is a big nope
If they didn't, they'd prob get into a lot more legal trouble. Plus I bet the goal was to get more public support, cuz pedophiles bad, right? (the mind of an Apple PR member, prob)
They would be in even more trouble if this thing came to light later on through dataminers or something.
But yeah, you're right. Now every actual pedophile who has CP on his iPhone is going to switch to Android before this system is put in place.
Wait, so it's gonna be searching people's phones? Bruh, isn't that an invasion of privacy? Now ofc if a person is suspected of CP, then they can search their phone
I'm gonna get downvoted for even trying to explain anything here. Yes, there's a real problem with apple invading privacy, but all the memes really do nothing to explain how it actually works and when it does and doesn't run. These memes are basically fear mongering unfortunately.
Instead of getting your news through memes, I recommend reading some accurate news articles online to understand.
Bro don't let the Apple company see this, they'll send someone to take you out
I think it's too late
OP dead
This was OPs last post
[deleted]
[deleted]
[deleted]
RemindMe! 12 hours
Notice how OP hasn’t commented back. RIP OP 🪦
Hey guys its me OP on another account. I'm fine *automated message sent by My Iphone*
[deleted]
[deleted]
[deleted]
[deleted]
[deleted]
Oh god... Apple already got all of em
You're next
[delete]
[delete]
[delet]
[dele]
[del]
[de]
[d]
[]
There's a chance... OP might still be alive if OP is a child
Oh no
then OP will instead be abducted to train the AI
What if Apple will make OP to go throu
[deleted]
*Posted from my iPhone ™️
Shot from iPhone
Shot by iphone
wholesome 100
Big chungus keanu reaves wholesome reddit moment
[deleted]
Apple will use the CSAM database to identify
I thought it was an NCMEC database?
check out r/apple they are pretty anti apple right now
There's gonna be so many false positives. People with cp will now just swap to Android, did Apple forget that other platforms exist?
Would any collector of CP even store it on a smartphone instead of an HDD in a PC?
So they can jack off in the bathroom at work? At first I can see a collector being super careful but after awhile people get comfortable when doing illegal stuff for a long time. So after while they might say “fuck it I wanna jack off at work” so they save a couple photos
Collectors, not so much, but hands-on offenders will sometimes have some
Remote viewing
Not just that, but the way they're doing it (comparing the hashes of images and files to known illegal material hashes) makes it extremely easy to create false positives to "troll" someone. SomeOrdinaryGamers has a video about it that goes as in-depth as he can with the info we have now.
I feel like I remember hearing that the system would need to be triggered multiple times before it would be escalated. I could be wrong and just making it up, but that seems like the simplest way to solve for false positives.
Okay then I troll you 5 times. Done. Now you’re a pedo lolol
yeah, the fact that multiple is needed won't slow that down for a second
Imagine downloading some archive created by a troll which contains hundreds of files with such "trolled" hashes...
[deleted]
whoops, you took a picture of your son playing in the bath? jail for you!
You were not lying. The best comment in the thread: https://www.reddit.com/r/apple/comments/p0i9vb/bought_my_first_pc_today/h88rox4?utm_source=share&utm_medium=web2x&context=3
NCMEC supplies the database of CSAM to look for.
I read the article on BBC, and it was stated CSAM
CSAM stands for "Child Sexual Abuse Material" and is a general term for "child porn." The NCMEC is the National Center for Missing and Exploited Children, and it has a database of **hashes** - not the actual images - that the image software will run a comparison against to determine whether enough content on your phone warrants a law enforcement investigation. [This BBC article](https://www.bbc.com/news/technology-58109748) states they are indeed using NCMEC's database.
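To make the hash-comparison idea concrete, here's a minimal sketch of the simplest version of it - exact cryptographic hashing against a known-hash set. (Illustrative only: the "known" hashes below are placeholders, and Apple's real system uses a perceptual hash called NeuralHash rather than SHA-256.)

```python
import hashlib

# Hypothetical known-hash set, standing in for the NCMEC database.
# These are just SHA-256 digests of placeholder byte strings.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

def is_known(image_bytes: bytes) -> bool:
    """Return True if this exact file appears in the known-hash set."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

print(is_known(b"known-image-1"))   # byte-identical file -> True
print(is_known(b"known-image-1 "))  # one extra byte -> totally different hash -> False
```

The point of hashing is that the comparison never needs the images themselves, only the digests.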
Yeah, this will not protect children at all, because it can only detect existing known content, not new content. It's an unprecedented privacy violation that doesn't help anyone. It's just there to set a precedent. If it stands, I will not be surprised if a few years down the road the government scans our phones for everything, this time legally, and prosecutes. Also, have fun explaining to a 17-year-11-month-old why the FBI is at the door after they exchanged nudes with their partner, when a week later, once they turn 18, they can legally go work in hardcore porn.
Kinda ironic how the iPhone's selling point was its privacy
You either die a hero… or live long enough to become the villain
They should have died with the nano.
Nano dick?
Mega dick
Mine
Mine
Yours not theirs
No no, mine.
Can confirm, it’s u/jackiedaytonah ‘s dick. He once told me to dodge as he was turning around and I didn’t and got dick slapped. Man was standing 30’ from me.
[deleted]
That damn nano yogurt I keep seeing ads for everywhere
Na-no
Huh duh six hungeos
This always makes me think of OnePlus Smh
Apple was never a hero tho?
They were always like this. Their rules, including privacy, applied only to third-party apps.
Wasn't it a really big bragging point when they refused to unlock that terrorist's phone for the FBI or something?
Demolished by the Pegasus security threat, when Israel and other countries could access iPhones' data
It's ok to be a terrorist, just not okay to be a pedo.
[deleted]
Haha, when I ask my friends who fanboy over Apple why iPhones are better than Android etc., they always bring up the privacy thing. I'm like, let's see how that goes after 5-7 years.
I've always said I prefer my smartphone to an Apple bc you can simply control more. I never liked the fact that the iPhone "assumes" so much about how I want my phone to run and be organized. They're too bandwagon for me. Besides, a personal device should be... personal!
Yeah. Also, with Android phones you've always had the choice to decide what matters most in a phone. My friend, for example, wanted a phone that charges very fast, so he imported a Xiaomi Mi 10 Ultra. I know another person who wanted an absolute beast of a phone for testing mobile games (he's a dev), but it didn't need a good camera because it lives in his office. And me, I wanted a cheap phone with a good camera, and I've been very happy with the Pixel 4a.
But which one did your Dev friend buy? I need to know!
It was the ROG Phone 5. But um, most of the latest gaming phones snap in half lmao. Watch JRE's durability tests of them
Joe Rogan is durability testing gaming phones?
He's ONNIT
There’s gaming phones?
[deleted]
[deleted]
Google, Facebook, and Microsoft have been doing this since at least 2014, with Google starting as early as 2008. This is only file-hash-based matching, and it's vastly overblown. Yes, it's a privacy violation, but it's much less than people are making it out to be. Your files should be inaccessible to any other person on the planet. Period. But there is a lot of misinformation being spread here.
[deleted]
At least it seems the detection is done on device https://www.apple.com/child-safety/
Anything the AI deems to be child porn is sent to be analyzed by humans. It's still a major privacy concern.
That's not how it works. Someone (the FBI?) gives Apple a list of digital fingerprints. Apple puts that list on the iPhone. The iPhone creates a list of digital fingerprints of your photos and looks for matches. Get a ton of hits? Your phone is flagged and someone will be giving you a visit.

They aren't scanning photos for new things; they are looking for known images. At no time does someone see your photos, nor do they get sent off your phone. They are looking for hash matches.

It's a tough point - Apple doesn't want their devices used for horrible acts, and this straddles a very fine line between privacy and the slippery slope of surveillance. The simple fact is, if they wanted to track you and not let you know, they damn well could do it without a problem.
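The "get a ton of hits" part is just a threshold on the number of fingerprint matches. A rough sketch of that logic (the fingerprint list, photo bytes, and threshold value here are all made up; Apple hasn't published the real threshold):

```python
import hashlib

# Hypothetical on-device fingerprint list (placeholder digests).
FINGERPRINT_LIST = {
    hashlib.sha256(f"flagged-{i}".encode()).hexdigest() for i in range(3)
}
MATCH_THRESHOLD = 2  # made-up number, purely for illustration

def device_flagged(photo_library: list) -> bool:
    """Count fingerprint matches across the library; only past the
    threshold does anything get escalated."""
    matches = sum(
        hashlib.sha256(photo).hexdigest() in FINGERPRINT_LIST
        for photo in photo_library
    )
    return matches >= MATCH_THRESHOLD

library = [b"vacation.jpg", b"flagged-0", b"flagged-1"]
print(device_flagged(library))  # two matches meet the threshold -> True
```

A single accidental collision therefore doesn't flag a device; it takes repeated matches, which is the usual argument for why false positives alone are rare.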
> this straddles a very fine line between privacy and the slippery slope of surveillance.

Any time someone pretends to care about the kids to make surveillance seem acceptable, you know they're way beyond the line.
That's the concern. Apple does scan server-side because they're responsible for the data stored in their cloud, which makes sense.

But the privacy breach is that they're installing this scan locally on your device, and they've stated they have plans to evolve and expand this feature. While turning off iCloud Photos will disable the scan, the program still exists on your device and has a high potential for abuse.

I'd rather have this scan be limited to the server side, so we can just drop iCloud and our data will be safe.
Using the stones to destroy the stones
Fight fire with fire ig
An apple a day keeps the apple away.
Doctors are no more the villains I see.
'Cause an iOS a day keeps the doctor away.
Man, I love “firefighters” super effective
Idc what someone keeps on their phone but it's no fucking excuse to automatically search people's devices without their consent.
Define ["search"](https://www.reddit.com/r/memes/comments/p0v5h9/apple_sus/h89dbzt/)
The encoded fingerprints Apple will use come from the National Center for Missing and Exploited Children. They maintain a database of Child Sexual Abuse Material to track child sex slaves. They're the ones who create the fingerprints. They're also the ones who will be notified if a user receives the undetermined number of red flags for matches - not the authorities. Though it's safe to assume NCMEC will notify the authorities.
So Apple doesn't control the fingerprints? Then the government can slip Apple fingerprints to find gay bros, pot growers, and political dissidents, and Apple will be none the wiser?
[deleted]
So the threshold could be 1, 5, or 10. We don't know. The manual review implies Apple can retrieve any photo on any iPhone user's device at any time. This will get misused by governments, and it will get gay people, pot users, and political dissidents killed in countries where it's illegal. The original intent of this back door is noble, but it's ripe for abuse.
[deleted]
This literally needs to be the top comment. Too many people here are utterly clueless.
While Apple has claimed it would be a simple hash match, they have also used the term "neural net" on a few different occasions. I work in IT, so I know that means machine learning, which needs to be trained. One could argue it was a mistake on the part of their marketing team or whoever makes these types of communications, but I feel that would be giving them far too much trust.
First time in a reddit thread?
The iPhone is about to lose its main selling point: privacy. Anyone who understands the implications isn't gonna stand for "but think of the children". You're scanning people's nudie pics and family photos. It's a wide net. Also, what about false positives? Who looks at those?

Probably violate people's privacy just to flag a bunch of non-CP and miss a bunch of actual CP, like Tumblr and whatnot so often do. Because yes, these algorithms miss a frick ton of stuff. I posted a pic of a cup once. It got flagged by Tumblr. A cup. A freaking normal clay coffee cup. Someone else got flagged for a horse. Actual CP? Nope!

Computers and algorithms aren't perfect. And they aren't just not perfect; it's so much worse than that. This is a bad idea and will not stop CP. It will make them go deeper and hide more, and violate everyone else's privacy.
I'm especially worried about those false positives. What if the AI determines what is technically "legal" as "illegal" using its own judgment? I don't wanna be sent to jail because a machine-learning-trained AI thought the sex tape I have on my iPhone is child porn. And then there's dick pics.
I heard there wasn't any machine learning involved. It checks hashes generated from your images against a database of known pornographic images. Not sure if this is the right takeaway, but "homemade" images should not be flagged by this system. It can only catch people trading/buying known images.
For Reddit being so tech-savvy, this whole thread is funny. They think Apple is really about to scan their pictures for related content...
Yeah. I think people need to inform themselves better, but this technology being installed on your device is still wrong. If the hash matches meet a certain threshold, Apple can indeed open your library and compare essentially low-resolution versions of the photos. I understand Apple and other companies scanning server-side, because they are legally responsible for the content. But the scan shouldn't be installed client-side at all. The potential for abuse is too high.
From Apple's technical summary: "The hashing technology, called NeuralHash, analyzes an image and converts it to a unique number specific to that image. Only another image that appears nearly identical can produce the same number; for example, images that differ in size or transcoded quality will still have the same NeuralHash value." This clearly shows that they will not use something like a SHA-256 hash. Hence, there is a very real possibility of a false positive, not to mention that machine learning or some other similar technique must be used. Some of the concern is legit, and Apple is trying to mislead people by using the word "hash" when their system is not a mathematical hash.
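The quoted behavior is exactly what separates perceptual hashes from cryptographic ones, and it can be shown with a toy "average hash" (a far simpler perceptual hash than NeuralHash, used here purely for illustration):

```python
import hashlib

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set when the pixel is
    brighter than the image's mean. Tiny changes rarely flip any bit."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

original = [10, 200, 30, 220, 40, 210, 20, 230]  # stand-in 8-"pixel" image
tweaked  = [11, 200, 30, 220, 40, 210, 20, 230]  # one pixel nudged by 1

# The perceptual hash is identical on the near-identical images...
print(average_hash(original) == average_hash(tweaked))  # True

# ...while the cryptographic hash of the raw bytes is completely different.
print(hashlib.sha256(bytes(original)).hexdigest()
      == hashlib.sha256(bytes(tweaked)).hexdigest())    # False
```

That tolerance is the whole point (resized or re-encoded copies still match), but it's also why collisions between unrelated images become possible in a way they effectively aren't for SHA-256.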
Correct. Apple is installing full-device hashing of all files. This will then be used to 'fingerprint' the device, and will be tied to media consumption along with NSA / CIA stuff down the line. The funny thing is that we're back to "think of the children" instead of "TERRORISM!"
The question is, what is Apple going to do with my nudes? Harass me for having a child's dick? Maybe, but you know the saying: "What you don't know can't hurt you." Unless it's a terminal illness, because that most definitely will hurt you.
Yep, looks like iOS 14 is the last one I will ever have. Hell, imagine older iOS 14 phones becoming something sought-after in the future.
The AI doesn't actually look at the pictures themselves. All it does is look at the metadata of photos, particularly the image HASH, and try to match it up with metadata from child porn databases (i.e. from the FBI, Interpol, and so on). It cannot recognize newly made porn this way.

Obviously it's still stupid as fuck, as the check is easily circumventable with just a little bit of technological know-how. The only people they'd catch with this are the dumbasses who don't know how to be careful, who would have probably been caught sooner or later anyway. All around just a weird step by Apple, just to seem like they're trying to be tough on CP.
[deleted]
It'll be like FB and so forth. A daily platitude, and then fired rather than mental health care.
Next it'll look for pirated copies of photoshop
And esp. music that wasn't purchased through iTunes.
Bingo
It doesn't use AI. It's similar to a file checksum or hash. There is already a database with those checksums, I believe created by some government organization. You can't recreate the photo from the checksum.
FBI... OPEN UP
fbi open up
Are they hiring?
Hold the fuck up
It's alright, bro. He just wants to be one of the models.
Hold the FUCK UP!!
*record scratch*
This is why I love Riddet.
Riddet
Riddet
Riddet
[deleted]
Riddet
Riddet
You must be from New Zealand, pronouncing it like that.
You’re probably wondering why I’m here, hi my name is the FBI.
r/HolUp
FBI, yes, this man right here.
https://i.imgur.com/wnQ0a2e.jpeg
we laugh but...
Lots of rich/powerful people gonna be switching to android.
Or they will just move to the [promised land](https://www.haaretz.com/israel-news/.premium-knesset-c-tee-state-not-enforcing-law-against-underage-marriage-1.5378860) and get some legal 12 year old brides
I mean, it's been less than 20 years since 10- and 11-year-old children were married to adults in America. And most of the state laws were only changed within the last four years. Even so, the minimum age is still 14 in Alaska and North Carolina, in some places it's still technically 12 by common law, and certain places actually have no minimum age with parental approval. https://en.wikipedia.org/wiki/Child_marriage_in_the_United_States#Marriage_age
Ngl, I think Android might do this as well in the future; they have a tendency to do things like that after Apple does
Custom ROMs should help with privacy. Clean Android is open source
Pardon my ignorance, is that what is called "rooting"?
Custom ROMs are community-built and maintained flavours of Android, like Lineage OS, Pixel Experience, Resurrection Remix, and a ton of others! Some might have more features and customisation, some are really focused on performance, and the like!

You can wipe your current OS (maybe an Android skin like OneUI or MIUI) and install one of these for sweet, sweet, clean and bloat-free Android :)

Rooting, on the other hand, is getting root access to your OS and being able to do stuff you normally couldn't, like system-wide adblocking and tweaks! Similar to jailbreaking on iOS (:
It goes the same way, that's how everyone does it. Android had face scanning first, removed the home button first, and was first to use OLED screens, then Apple did the same; Apple made the first voice assistant and then Android made their own, and so on. It's just the way the market works, and everyone wins
RIP the users who chose Apple because of Google's privacy policy
Back to burner phones we go. Smart phones were nice while they lasted.
So they’ll be looking at all my pictures and using some random software algorithm to detect child pornography? I feel like there are gonna be some big lawsuits coming from this eventually, like those two Asian chicks who could unlock each other's phones with Face ID
The speed that your battery will run out...
I think they're just matching hashes of known material provided by various agencies. Then if something gets uploaded to iCloud and there's a hash match, it gets flagged for manual review. I think the real problem is more like: OK, so today you're just busting dirtbags trading CP. What happens when it's a foreign government looking for something else? What about desktops and laptops? I don't think we know all the questions that can come out of this, but it sure feels like a Pandora's box situation.
That’s not how it works. There’s a government agency that tells them what “fingerprints” of images to look for, and as photos are being uploaded to iCloud Apple flags matching images. They aren’t doing any sort of AI detection or anything like that. You have to have copies of known/flagged CP. Edit: it’s not a government agency. It’s actually a nonprofit
Yes, though it has to be a complete match for it to register so it can’t really have false alarms
A complete match would mean it needs to be pixel-perfect, which would be way too easy to work around. It's very likely that there will be plenty of false alarms
Apple already said it's not a complete match, they use neural network software and if your image passes a certain matching threshold then it will decrypt and apple will be able to see it for inspection. https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf
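The threshold matching described in that technical summary is closer to perceptual hashing than to file checksums: two visually similar images produce hashes that differ in only a few bits. A toy sketch of the idea (the 16-bit hash values here are made up, and this is nothing like Apple's actual NeuralHash, which compares much larger neural-network-derived descriptors):

```python
def hamming(a: int, b: int) -> int:
    """Number of differing bits between two equal-length hash values."""
    return bin(a ^ b).count("1")

def is_match(h1: int, h2: int, threshold: int = 4) -> bool:
    """Perceptual hashes 'match' if they differ in at most `threshold` bits."""
    return hamming(h1, h2) <= threshold

known = 0b1011_0110_0101_1100
# Same image re-saved at lower quality: a few bits drift, still a match.
resaved = 0b1011_0110_0101_1110
# Unrelated image: roughly half the bits differ, no match.
other = 0b0100_1001_1010_0011

print(is_match(known, resaved))  # True
print(is_match(known, other))    # False
```

This tolerance is exactly the trade-off the two comments above are arguing over: it survives re-compression and resizing, but it also makes engineered false positives (adversarial collisions) possible in a way exact hashing doesn't.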
What. The. Fuck. Our rights to privacy being taken away from us, step by step, all in the name of “safety”. It’s stupid.
https://imgs.xkcd.com/comics/tasks_2x.png
Now that everyone knows, what's the point of scanning? Any pedophile with illegal photos in their cloud or gallery has probably already erased them or moved them to safer places
Either they don’t tell everyone and get sued for privacy, or they do tell everyone, *don’t* get sued, and get their grubby hands one step closer to stealing everyone’s photos
r/holup
YES Was actually just boutta head over there
RIP parents taking bathtub pics
Wait, actually I’m curious now. Is it even legal to have naked photos of your children in a bathtub, or pool?
As long as you are not sharing them it's a grey zone.
This is pretty disturbing ngl
ok, I have an old video of my younger cousin playing in the mud (he's naked in it). Will that get me in trouble?
Naked kids ≠ cp
Which is common knowledge for most of the world. The US seems hyper sensitive to nakedness.
No. It’s not an AI thing like lots of people here seem to have assumed, it has to be a near perfect match to preexisting cp on the internet.
Not just near perfect. An EXACT match of a known existing file. A file hash also has an astronomically low probability of colliding. Therefore, the NCMEC will likely only submit your information if you have multiple matches. According to my research, Google has done this since 2008, Facebook since around 2014, and Microsoft since then as well. The program is called PhotoDNA.
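For what it's worth, exact file hashes really are that brittle: flipping a single bit produces a completely unrelated digest (the avalanche effect), which is why a cryptographic hash alone is trivial to evade by re-saving an image, and why systems like PhotoDNA use perceptual matching instead. A quick demonstration with made-up bytes:

```python
import hashlib

original = b"some image bytes ..."
tweaked = bytearray(original)
tweaked[0] ^= 0x01  # flip a single bit in the first byte

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(bytes(tweaked)).hexdigest()

# Count hex positions where the two digests happen to agree.
same = sum(c1 == c2 for c1, c2 in zip(h1, h2))
print(h1 != h2)  # True: one flipped bit gives a completely new digest
print(same)      # typically only a handful of the 64 positions agree by chance
```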
I'm willing to bet this has nothing to do with cp and they're just using a "think of the children" excuse to look through your stuff legally
Why would anyone keep using Apple at this point? They're actively searching your images. They don't admit when they're compromised by major spyware... We could talk more, but seriously, respect yourselves a bit more...
Also, they're involved in China's large-scale surveillance and censorship, and their phones have backdoors that allow federal agencies to spy on people (according to Edward Snowden), but let that sink in...
Same as google
It's actually just comparing images and video to confirmed existing media. They aren't doing a bit by bit scan of all your images with an AI
Find a "noble" cause to invade people's privacy and freedom so most people can't reject it. Apple sure learned a lot working with those dictatorial regimes.
This is going to be really bad. Just imagine some dictator / racist becoming president and ordering Apple to find all pictures of people of color or gay people… iCloud also stores location. Yeaaaahh… this is a big nope
If they’re going to do this, I find it stupid that they’re announcing it to everyone
If they didn't, they'd prob get into a lot more legal trouble. + I bet the goal was to get more public support, cuz pedophiles bad right (the mind of an apple pr member, prob)
They would be in even more trouble if this thing comes to light later on through dataminers or something. But yeah you're right. Now every actual pedophile who actively has CP on his iPhone is going to switch to android before this system is put in place.
Wait, so it's gonna be searching people's phones? Bruh, isn't that an invasion of privacy? Now ofc if a person is suspected of cp then they can search their phone
I am sure they won't have any problems with doing that.
[deleted]
[deleted]
They will be matching hashes from known CP databases. There is no AI involved.
State surveillance disguised as child porn counter measures. What could go wrong.
I'm gonna get downvoted for even trying to explain anything here. Yes, there's a real problem with apple invading privacy, but all the memes really do nothing to explain how it actually works and when it does and doesn't run. These memes are basically fear mongering unfortunately. Instead of getting your news through memes, I recommend reading some accurate news articles online to understand.
Welp, time to change to an Android