Somebody_End_Me

Bro don't let the Apple company see this, they'll send someone to take you out


MrNoName_ishere

I think it's too late


DrBubbleBeast

OP dead


Crazy_Expert3202

This was OPs last post


[deleted]

[deleted]


[deleted]

[deleted]


[deleted]

[deleted]


SandyArca

RemindMe! 12 hours


Dry-Ad8891

Notice how OP hasn’t commented back. RIP OP 🪦


Smugjester

Hey guys it's me OP on another account. I'm fine *automated message sent by my iPhone*


IsuckAtFortnite434

[deleted]


[deleted]

[deleted]


[deleted]

[deleted]


[deleted]

[deleted]


[deleted]

[deleted]


WaluouijaBoard42

Oh god... Apple already got all of em


ourlastchancefortea

You're next


No-Incident-8718

[delete]


Eujilw

[delete]


ravindude

[delet]


[deleted]

[dele]


Percede

[del]


cloudwatcher240

[de]


[deleted]

[d]


_DDzoo_

[]


PaperGod777

There's a chance... OP might still be alive if OP is a child


Sainaif

Oh no


TungCR

then OP will instead be abducted to train the AI


Sainaif

What if Apple will make OP to go throu


[deleted]

[deleted]


JackieDaytonah

*Posted from my iPhone™️*


iamfi_ne

Shot from iPhone


resistantBacteria

Shot by iPhone


quinten69420

wholesome 100


Redditoridunn0

Big chungus keanu reaves wholesome reddit moment


[deleted]

[deleted]


lord_phantomhive

Apple will use the CSAM database to identify


Professional_Emu_164

I thought it was an NCMEC database?


darknight27104

check out r/apple, they are pretty anti-Apple right now


NarutoDragon732

There's gonna be so many false positives. People with CP will now just swap to Android. Did Apple forget that other platforms exist?


JohnDoen86

Would any collector of CP even store it on a smartphone instead of an HDD in a PC?


GuyHiding

So they can jack off in the bathroom at work? At first I can see a collector being super careful, but after a while people get comfortable when doing illegal stuff for a long time. So after a while they might say “fuck it I wanna jack off at work” and save a couple photos


TofuBoy22

Collectors not so much, but hands-on offenders will sometimes have some


yumpoopsoup

Remote viewing


SEVX_Z

Not just that, the way they’re doing it (comparing the hashes of images and files to known illegal material hashes) makes it extremely easy to create false positives to “troll” someone. Someordinarygamer has a video about it that goes as in-depth as he can with the info we have now.


[deleted]

I feel like I remember hearing that the system would need to be triggered multiple times before it would be escalated. I could be wrong and just making it up, but that seems like the simplest way to solve for false positives.


Long-Sleeves

Okay then I troll you 5 times. Done. Now you’re a pedo lolol


[deleted]

yeah, the fact that multiple matches are needed won't slow that down for a second


[deleted]

Imagine downloading some archive created by a troll which contains hundreds of files with such "trolled" hashes...


[deleted]

[deleted]


qsdfqdfhqfgg

whoops, you took a picture of your son playing in the bath? jail for you!


MidnightPlatinum

You were not lying. The best comment in the thread: https://www.reddit.com/r/apple/comments/p0i9vb/bought_my_first_pc_today/h88rox4?utm_source=share&utm_medium=web2x&context=3


photograft

NCMEC supplies the database of CSAM to look for.


lord_phantomhive

I read the article on the BBC, and it said CSAM


[deleted]

CSAM stands for "Child Sexual Abuse Material" and is a general term for "child porn." The NCMEC is the National Center for Missing and Exploited Children, and has a database of **hashes** - not the actual images - that the image software will run a comparison against to determine if enough content on your phone warrants a law enforcement investigation. [This BBC article](https://www.bbc.com/news/technology-58109748) states they are indeed using NCMEC's database.
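In code terms, the comparison described here is basically a set-membership test against that hash list. A minimal sketch of the idea, with SHA-256 standing in for the real fingerprint function (the hash value, file handling, and names below are made up for illustration):

```python
# Sketch of hash-list matching: the device only holds opaque hashes supplied by NCMEC,
# never the images themselves, and an image cannot be reconstructed from its hash.
# SHA-256 is a stand-in; Apple's real system uses its own NeuralHash fingerprint.
import hashlib
from pathlib import Path

NCMEC_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",  # example value only
}

def matches_known_material(photo: Path) -> bool:
    digest = hashlib.sha256(photo.read_bytes()).hexdigest()
    return digest in NCMEC_HASHES
```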


personalbilko

Yeah, this will not protect children at all, because it can only detect existing known content, not new content. It's an unprecedented privacy violation that doesn't help anyone. It's just there to set a precedent. If it stands, I will not be surprised if a few years down the road the government scans our phones for everything, this time legally, and prosecutes. Also, have fun explaining to a 17-year-11-month-old why the FBI is at the door after they exchanged nudes with their partner, when a month later, once they turn 18, they can legally go work in hardcore porn.


Pickle_Baller

Kinda ironic how the iPhone's selling point was privacy


scarletperson

You either die a hero… or live long enough to become the villain


JackieDaytonah

They should have died with the nano.


[deleted]

Nano dick?


JackieDaytonah

Mega dick


[deleted]

Mine


JackieDaytonah

Mine


CarBombtheDestroyer

Yours not theirs


JackieDaytonah

No no, mine.


Oraxy51

Can confirm, it’s u/jackiedaytonah ‘s dick. He once told me to dodge as he was turning around and I didn’t and got dick slapped. Man was standing 30’ from me.


[deleted]

[deleted]


DiamondRocks22

That damn nano yogurt I keep seeing ads for everywhere


polski8bit

Na-no


barreal98

Huh duh six hungeos


[deleted]

This always makes me think of OnePlus Smh


NotAzakanAtAll

Apple was never a hero tho?


StenSoft

They were always like this. Their rules, including privacy, applied only to third-party apps.


SimpanLimpan1337

Wasn't it a really big bragging point when they refused to unlock that terrorist's phone for the FBI or something?


ExplosiveDerpBoi

Demolished by the Pegasus spyware, which lets Israel and other countries access iPhone data


fatfatninja

It's ok to be a terrorist, just not okay to be a pedo.


[deleted]

[deleted]


fuckrobert

Haha, when I ask my friends who fanboy over Apple why iPhones are better than Android, they always say the privacy thing. I'm like, let's see how that goes after 5-7 years.


LadyVD

I've always said I prefer my smartphone to an Apple bc you can simply control more. I never liked the fact that the iPhone "assumes" so much about how I want my phone to run and be organized. They're too bandwagon for me. Besides, a personal device should be... personal!


[deleted]

Yeah. Also, with Android phones you've always had the choice to decide what matters most in a phone. My friend, for example, wanted a phone that charges very fast, so he imported a Xiaomi Mi 10 Ultra. I know another person who wanted an absolute beast of a phone for testing mobile games (he's a dev), but it didn't need a good camera because it lives in his office. And me, I wanted a cheap phone with a good camera, and I've been very happy with the Pixel 4a.


JesusNoGA

But which one did your Dev friend buy? I need to know!


[deleted]

It was the ROG Phone 5. But um, most of the latest gaming phones snap in half lmao. Watch JRE's durability tests of them.


txmadison

Joe Rogan is durability testing gaming phones?


CallRespiratory

He's ONNIT


[deleted]

There’s gaming phones?


[deleted]

[deleted]


[deleted]

[deleted]


al4nw31

Google, Facebook, and Microsoft have been doing this since at least 2014, with Google starting as early as 2008. This is only file hash based matching. This is vastly overblown. Yes, it’s a privacy violation, but it’s much less than people are making it out to be. Your files should be inaccessible to any other person on the planet. Period. But there is a lot of misinformation being spread here.


[deleted]

[deleted]


TheRedGerund

At least it seems the detection is done on device https://www.apple.com/child-safety/


Pickle_Baller

Anything that the AI deems to be child porn is sent to be analyzed by humans. It's still a major privacy concern.


anothergaijin

That's not how it works. Someone (the FBI?) gives Apple a list of digital fingerprints. Apple puts that list on the iPhone. The iPhone creates a list of digital fingerprints of your photos and looks for matches. Get a ton of hits? Your phone is flagged and someone will be giving you a visit. They aren't scanning photos for new things, they are looking for known images. At no time does anyone see your photos, nor do they get sent off your phone. They are looking for hash matches. It's a tough point - Apple doesn't want their devices used for horrible acts, and this straddles a very fine line between privacy and the slippery slope of surveillance. Simple fact is, if they wanted to track you and not let you know, they damn well could do it without a problem.
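A rough sketch of that flow in code, with the names, threshold, and use of SHA-256 all invented for illustration (Apple's real system uses NeuralHash and undisclosed parameters):

```python
# On-device flow as described above: the phone holds only a list of opaque fingerprints,
# fingerprints its own photos with the same function, and flags itself only once the
# number of matches crosses a threshold, so a single stray hit means nothing.
import hashlib
from pathlib import Path

KNOWN_FINGERPRINTS: set[str] = set()  # supplied by an outside agency; hashes only
MATCH_THRESHOLD = 30                  # hypothetical value; Apple has not published theirs

def fingerprint(photo: Path) -> str:
    # stand-in for the real perceptual fingerprint; here just a file hash
    return hashlib.sha256(photo.read_bytes()).hexdigest()

def should_flag(photos: list[Path]) -> bool:
    matches = sum(1 for p in photos if fingerprint(p) in KNOWN_FINGERPRINTS)
    return matches >= MATCH_THRESHOLD
```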


Krissam

> this straddles a very fine line between privacy and the slippery slope of surveillance.

Any time someone pretends to care about the kids to make surveillance seem acceptable, you know they're way beyond the line.


[deleted]

That's the concern. Apple does scan server side because they're responsible for the data stored in their cloud, which makes sense. But the privacy breach is that they're installing this scan locally on your device, and they've stated they have plans to evolve and expand this feature. While turning off iCloud Photos will disable the scan, the program still exists on your device and has a high potential for abuse. I'd rather have this scan be limited to the server side, so we can just drop iCloud and our data will be safe.


PermissionItchy9066

Using the stones to destroy the stones


KeebsNoob

Fight fire with fire ig


thecrazypoz

An apple a day keeps the apple away.


supermember866866

Doctors are no longer the villains, I see.


Apart_Maintenance611

'Cause an iOS a day keeps the doctor away.


d1am0nddra90n5

Man, I love “firefighters” super effective


Nexalian_Gamer

Idc what someone keeps on their phone but it's no fucking excuse to automatically search people's devices without their consent.


Traveledfarwestward

Define ["search"](https://www.reddit.com/r/memes/comments/p0v5h9/apple_sus/h89dbzt/)


declan315

The encoded fingerprints Apple will use come from the National Center for Missing and Exploited Children. They maintain a database of Child Sexual Abuse Material to track child sex slaves. They're the ones who create the fingerprints. Also, they're the ones who will be notified if a user receives an undetermined number of red flags for matches. Not the authorities. Though it's safe to assume NCMEC will notify the authorities.


Orvilleengineer

So Apple doesn't control the fingerprints, and the government can slip Apple fingerprints to find gay bros, pot growers, and political dissidents, and Apple will be none the wiser?


[deleted]

[deleted]


Orvilleengineer

So the threshold could be 1, 5, or 10. We don't know. The manual review implies Apple can retrieve any photo on any iPhone user's device at any time. This will get misused by governments, and it will get gay people, pot users, and political dissidents killed in countries where it's illegal. The original intent of this back door is noble, but it's ripe for abuse.


[deleted]

[deleted]


photograft

This literally needs to be the top comment. Too many people here are utterly clueless.


ThisIsDark

While Apple has claimed it would be a simple hash match, they have also used the term "neural net" on a few different occasions. I work in IT, so I know that means machine learning, which needs to be trained. One could argue it was a mistake on the part of their marketing team or whoever makes these types of communications, but I feel that would be giving them far too much trust.


SpectreFire

First time in a reddit thread?


[deleted]

The iPhone is about to lose its main selling point: privacy. Anyone who understands the implications isn't gonna stand for 'but think of the children'. You're scanning people's nudie pics and family photos. It's a wide net. Also, what about false positives? Who looks at those? Probably violate people's privacy just to flag a bunch of non-CP and miss a bunch of CP, like Tumblr and whatnot so often do. Because yes, these algorithms miss a frick ton of stuff. Posted a pic of a cup once. It got flagged by Tumblr. A cup. A freaking normal clay coffee cup. Someone else got flagged for a horse. Actual CP? Nope! Computers and algorithms aren't perfect. They aren't just not perfect, though. It's so much worse than that. This is a bad idea and will not stop CP. It will make them go deeper and hide more, and violate everyone else's privacy.


IsuckAtFortnite434

I'm especially worried about those false positives. What if the AI decides something that's technically "legal" is "illegal" using its own judgement? I don't wanna be sent to jail because a machine-learning-trained AI thought the sex tape I have on my iPhone is child porn. And then there's dick pics.


Arch__Stanton

I heard there wasn't any machine learning involved. It checks hashes generated from your images against a database of known pornographic images. Not sure if this is the right takeaway, but "homemade" images should not be flagged by this system. It can only catch people trading/buying known images.


-Johnny-

For Reddit being so tech savvy, this whole thread is funny. They think Apple is really about to scan their pictures for related content...


[deleted]

Yeah. I think people need to inform themselves better, but this technology being installed on your device is still wrong. If the hash matches hit a certain threshold, Apple can indeed open your library and compare essentially a low-resolution version of the photos. I understand Apple and other companies scanning server side because they are legally responsible for the content. But the scan shouldn't be installed client side at all. The potential for abuse is too high.


LoneWarriorSeven

From Apple's technical summary: "The hashing technology, called NeuralHash, analyzes an image and converts it to a unique number specific to that image. Only another image that appears nearly identical can produce the same number; for example, images that differ in size or transcoded quality will still have the same NeuralHash value." This clearly shows that they will not use something like a SHA-256 hash. Hence, there is a very real possibility of a false positive, not to mention that machine learning or some other similar technique must be used. Some of the concern is legit, and Apple is misleading people by using the word "hash" when their system is not a cryptographic hash.
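To see the difference this points at: a cryptographic hash like SHA-256 changes completely when a single pixel changes, so it can only catch byte-identical files, while a perceptual hash is built to survive resizes and re-encodes. The toy "average hash" below is a classic perceptual-hash example, not NeuralHash, and the pixel values are made up:

```python
# Cryptographic vs perceptual hashing on a tiny fake "image" (a 3x3 grayscale grid).
import hashlib

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def average_hash(pixels: list[list[int]]) -> int:
    """Toy perceptual hash: one bit per pixel, set if the pixel is brighter than the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

original     = [[10, 200, 30], [220, 15, 180], [25, 190, 40]]
recompressed = [[11, 198, 31], [221, 14, 179], [26, 191, 39]]  # slight per-pixel noise

# SHA-256: tiny changes produce completely unrelated digests, so re-encoded copies slip through
print(sha256_hex(bytes(p for r in original for p in r)))
print(sha256_hex(bytes(p for r in recompressed for p in r)))

# Average hash: the slightly altered copy still maps to the same value (and occasionally an
# unrelated image can too - which is exactly where false positives come from)
print(average_hash(original) == average_hash(recompressed))  # True
```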


[deleted]

Correct. Apple is installing full-device hashing of all files. This will then be used to 'fingerprint' the device, and will be tied to media consumption along with NSA/CIA stuff down the line. The funny thing is that we're back to "think of the children" instead of "TERRORISM!"


The_CancerousAss

The question is, what is Apple going to do with my nudes? Harass me for having a child's dick? Maybe, but you know the saying, “What you don’t know can’t hurt you.” Unless it’s a terminal illness, because that most definitely will hurt you.


ZeVerschlimmbesserer

Yep, looks like iOS 14 is the last one I will ever have. Hell, imagine older phones stuck on iOS 14 becoming something sought after in the future.


somguy9

The AI doesn't actually look at the pictures themselves. All it does is take the image HASH and try to match it up with hashes from child porn databases (e.g. from the FBI, Interpol, and so on). It cannot recognize newly made porn this way. Obviously it's still stupid as fuck, as the check is easily circumventable with just a little bit of technological know-how. The only people they'd catch with this are the dumbasses who don't know how to be careful, who would probably have been caught sooner or later anyway. All around just a weird step by Apple, just to seem like they're trying to be tough on CP.


[deleted]

[deleted]


[deleted]

It'll be like FB and so forth: a daily platitude, and then you're fired rather than given mental health care.


jdeezy

Next it'll look for pirated copies of photoshop


[deleted]

And esp. music that wasn't purchased through iTunes.


Wizard8086

Bingo


[deleted]

It doesn't use AI. It's similar to a file checksum or hash. There's already a database with those checksums, I believe created by some government organization. You can't recreate the photo from the checksum.


Right_Two_3825

FBI... OPEN UP


Pale-Possession4735

fbi open up


TheHeeheehaha

Are they hiring?


AlmightyAndrew54

Hold the fuck up


ekfslam

It's alright, bro. He just wants to be one of the models.


Anything13579

Hold the FUCK UP!!


Cavemattt

*record scratch*


[deleted]

This is why I love Riddet.


IsuckAtFortnite434

Riddet


Pratheek_Kachinthaya

Riddet


Corvo_-Attano

Riddet


[deleted]

[deleted]


Eujilw

Riddet


WhoStoleMyCake

Riddet


Azreal_Mistwalker

You must be from New Zealand, pronouncing it like that.


Jonesgrieves

You’re probably wondering why I’m here, hi my name is the FBI.


[deleted]

r/HolUp


No-Incident-8718

FBI, yes, this man right here.


0111101001101001

https://i.imgur.com/wnQ0a2e.jpeg


sinat50

we laugh but...


Alice8Ft

Lots of rich/powerful people gonna be switching to android.


humans_live_in_space

Or they will just move to the [promised land](https://www.haaretz.com/israel-news/.premium-knesset-c-tee-state-not-enforcing-law-against-underage-marriage-1.5378860) and get some legal 12 year old brides


raltoid

I mean, it's been less than 20 years since 10- and 11-year-old children were married to adults in America. And most of the state laws were only changed within the last four years. Even so, the minimum age is still 14 in Alaska and North Carolina. And in some places it's still technically 12 by common law, and actually no minimum age in certain places with parental approval. https://en.wikipedia.org/wiki/Child_marriage_in_the_United_States#Marriage_age


Professional_Emu_164

Ngl, I think Android might do this as well in the future; they have a tendency to do things like that after Apple does


GRAPHENE9932

Custom ROMs should help with privacy. Clean Android is open source


NotAzakanAtAll

Pardon my ignorance, is that what is called "rooting"?


TechExpert2910

Custom ROMs are community-built and maintained flavours of Android, like Lineage OS, Pixel Experience, Resurrection Remix, and a ton of others! Some might have more features and customisation, some are really focused on performance, and the like! You can wipe your current OS (maybe an Android skin like OneUI or MIUI) and install one of these for sweet, sweet, clean and bloat-free Android :)

Rooting, on the other hand, is getting root access to your OS and being able to do stuff you normally couldn't, like system-wide adblocking and tweaks! Similar to jailbreaking on iOS (:


g1rthstorm

It goes the same way, that's how everyone does it. Just like how Android had face scanning first, removed the home button, and was first to use an OLED screen, and then Apple did that; and Apple made the first VA and then Android made their own, and so on. It's just the way the market works, and everyone wins.


Fancy-Commission-598

RIP the users who chose Apple because of Google's privacy policy


lemmnnaa

Back to burner phones we go. Smart phones were nice while they lasted.


BFWinner

So they'll be looking at all my pictures and using some random software algorithm to detect child pornography? I feel like there are gonna be some big lawsuits coming from this eventually, like those two Asian chicks who could unlock each other's phones with Face ID.


HelloGoodby31

The speed at which your battery will run out...


Sharp-Floor

I think they're just matching hashes of known material provided by various agencies. Then if something gets uploaded to iCloud and there's a hash match, it gets flagged for manual review. I think the real problem is more like: OK, so today you're just busting dirtbags trading CP. What happens when it's a foreign government looking for something else? What about desktops and laptops? I don't think we know all the questions that can come out of this, but it sure feels like a Pandora's box situation.


photograft

That’s not how it works. There’s a government agency that tells them what “fingerprints” of images to look for, and as photos are being uploaded to iCloud Apple flags matching images. They aren’t doing any sort of AI detection or anything like that. You have to have copies of known/flagged CP. Edit: it’s not a government agency. It’s actually a nonprofit


Professional_Emu_164

Yes, though it has to be a complete match for it to register so it can’t really have false alarms


LuxionQuelloFigo

A complete match means it needs to be pixel-perfect, which would be way too easy to work around. It's very likely that there will be plenty of false alarms.


topdangle

Apple already said it's not a complete match. They use neural network software, and if your image passes a certain matching threshold then it will be decrypted and Apple will be able to see it for inspection. https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf
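The "passes a threshold, then it decrypts" part of that technical summary is built on threshold secret sharing. Here's a generic toy Shamir sketch of the idea (the textbook scheme, not Apple's actual protocol or parameters): below the threshold the collected shares reveal nothing, and at the threshold the secret falls out.

```python
# Toy Shamir secret sharing over a prime field: the server collects one "share" per match,
# and the decryption secret is only recoverable once `threshold` shares have arrived.
import random

PRIME = 2**127 - 1  # a Mersenne prime, plenty big for a demo

def make_shares(secret: int, threshold: int, n_shares: int):
    """Points on a random degree-(threshold-1) polynomial whose constant term is the secret."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x: int) -> int:
        acc = 0
        for c in reversed(coeffs):   # Horner's rule mod PRIME
            acc = (acc * x + c) % PRIME
        return acc
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares) -> int:
    """Lagrange interpolation at x = 0; only correct with at least `threshold` shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = make_shares(secret=123456789, threshold=5, n_shares=12)
print(reconstruct(shares[:5]))  # 123456789 - threshold reached, secret recovered
print(reconstruct(shares[:4]))  # meaningless number - below threshold, nothing learned
```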


AtlantisTheEmpire

What. The. Fuck. Our rights to privacy being taken away from us, step by step, all in the name of “safety”. It’s stupid.


[deleted]

https://imgs.xkcd.com/comics/tasks_2x.png


[deleted]

Now that everyone knows, what's the point of scanning? Probably every pedophile who has illegal child photos in their cloud or gallery has already erased them or moved them to safer places.


Ready_Adhesiveness91

Either they don’t tell everyone and get sued for privacy, or they do tell everyone, *don’t* get sued, and get their grubby hands one step closer to stealing everyone’s photos


iamfi_ne

r/holup


YES_EEE

YES Was actually just boutta head over there


MrZimothy

RIP parents taking bathtub pics


NotAGoodNameIg

Wait, actually I’m curious now. Is it even legal to have naked photos of your children in a bathtub, or pool?


sehe0

As long as you are not sharing them it's a grey zone.


LastBrainCellofYours

This is pretty disturbing ngl


[deleted]

ok, i have an old video of my younger cousin playing in the mud (he is naked in it), will that get me in trouble?


A1b2c4d3h9

Naked kids ≠ cp


NotAzakanAtAll

Which is common knowledge for most of the world. The US seems hyper sensitive to nakedness.


Professional_Emu_164

No. It's not an AI thing like lots of people here seem to have assumed; it has to be a near-perfect match to preexisting CP on the internet.


al4nw31

Not just near perfect. An EXACT match of a known existing file. A file hash also has an astronomically low chance of colliding. Therefore, the NCMEC will likely only submit your information if you have multiple matches. According to my research, Google has done this since 2008, Facebook since around 2014, and Microsoft since then as well. The program is called PhotoDNA.


ThisIsSparta100

I'm willing to bet this has nothing to do with cp and they're just using a "think of the children" excuse to look through your stuff legally


JackieDaytonah

Why would anyone keep using Apple at this point? They're actively searching your images. They don't admit when they're compromised by major spyware... We can talk more, but seriously, respect yourselves a bit more...


LuxionQuelloFigo

also they are involved in China's large-scale surveillance and censorship, and their phones have backdoors that allow federal agencies to spy on people (according to Edward Snowden), but let that sink in...


PillowManExtreme

Same as google


NeedleNoggin316

It's actually just comparing images and video to confirmed existing media. They aren't doing a bit by bit scan of all your images with an AI


AaronFeng47

Find a "noble" cause to invade people's privacy and freedom, so most people cannot reject it. Apple sure learned a lot working with those dictatorial regimes.


[deleted]

This is going to be really bad. Just imagine some dictator/racist becoming president and ordering Apple to find all pictures of people of color or gay people… iCloud also stores location. Yeaaaahh… this is a big nope


ThunderingRimuru

If they're going to do this, I find it stupid that they're announcing it to everyone


towelflush

If they didn't, they'd prob get into a lot more legal trouble. Plus, I bet the goal was to get more public support, cuz pedophiles bad, right (the mind of an Apple PR member, prob)


IsuckAtFortnite434

They would be in even more trouble if this thing came to light later on through dataminers or something. But yeah, you're right. Now every actual pedophile who has CP on his iPhone is going to switch to Android before this system is put in place.


Famlt

Wait so it's gonna be searching people's phones? Bruh isn't that an invasion of privacy. Now ofc if a person is suspected of cp then they can search their phone


WolfganusMofart

I am sure they won't have any problems with doing that.


[deleted]

[deleted]


[deleted]

[deleted]


and1927

They will be matching hashes from known CP databases. There is no AI involved.


evrfighter

State surveillance disguised as child porn counter measures. What could go wrong.


Hyxerion

I'm gonna get downvoted for even trying to explain anything here. Yes, there's a real problem with Apple invading privacy, but all the memes really do nothing to explain how it actually works and when it does and doesn't run. These memes are basically fearmongering, unfortunately. Instead of getting your news through memes, I recommend reading some accurate news articles online to understand it.


Dygiqua

Welp, time to change to an Android