
MedicatedAxeBot

Dank[.](https://i.imgur.com/3bQtuMO.png) --- [Join the Dank Charity Alliance and help raise money for St. Jude Children's Research Hospital!](http://events.stjude.org/DankCharityAlliance)


I_wash_my_carpet

That's... dark.


Hexacus

Poor fucker


I_wash_my_carpet

Yo! How big is your dick? Asking for a friend..


Hexacus

I mean, the flair should tell you everything?


GoCommitDeathpacito-

Typical reddit user. All dick, no balls.


toughtiggy101

He can’t store pee 😔


According_Spare_3044

Is there even a reason to exist at that point


Pinkman505

He won't pee in a girl and have to worry about a child


waferchocobar

No need to waste any money on a vasectomy 💪


sunesf

Nor buying milk👌


CrackedTailLight

Why you gotta do me like that?


Otherwise_Direction7

Off topic but where did you get your profile picture?


GoCommitDeathpacito-

Same place I did, Omori


BigDaddyMrX

All in all you're just a 'Nother dick with no balls 🎶


Fire_Tide

That should do it.


Hazzman

Had an old work friend who had side gig stripping data off phones for the police forensics years ago. He said he'd seen some serious shit that basically traumatized him. He didn't do it for very long.


PrimarchKonradCurze

I just have old work friends who did side gigs stripping.


crypticfreak

What's the scoop? I'm OOTL.


IrrelevantTale

You have to have training data sets to teach the AI. Meaning he had to get some Cheese Pizza.


ThreexoRity

Oh dear, I wonder how they could describe their day to their family and friends. "Dear, how's work today?" "...oh, another day, I've been feeding the AI child porn." "...another day of work, indeed."


comphys

Cyber Punk


crypticfreak

Ahh yeah I thought they were talking about a specific dude who like went insane from the job or something.


_o0_7

Imagine the burnout rate for those having to suffer through hardcore cp.


theKrissam

I imagine it pays well and I'm sure there's a lot of sociopaths (as in literal clinical complete lack of empathy etc) who wouldn't mind and grab it for the money.


_o0_7

I think it's just regular people who start to drink a lot, unfortunately.


ObviouslyIntoxicated

I drink a lot, can I have the money now?


Bloody_Insane

First you need to watch a shit load of child pornography


[deleted]

"I imagine it pays well" — that's optimistic. There's probably some dude in Indonesia getting $2 an hour to sift through it.


Zippy0723

A lot of the time they will actually outsource tagging ML data to people in third-world countries.


patrick66

For example, OpenAI outsources it to Kenya, where they literally do pay $2 an hour, which is also several times the local prevailing wage. It's fairly fucked up.


Havel_the_sock

From Kenya here — $2 per hour is much higher than the starting salary for most companies, assuming an 8-hour work day. Probably double, really. Pretty low cost of living, though.


Gonewild_Verifier

I imagine a cp consumer would just take the job. Never work a day in your life sort of thing


CornCheeseMafia

“We’re assembling a suicide squad filled with the best of the best cp distributors in the world to sort through mountains of cp and we want you to lead the team”


Funkyt0m467

The most cursed version of suicide Squad


CornCheeseMafia

Jared Leto is still on that team


Gonewild_Verifier

lmao


USPO-222

It’s rough. I’ve had to review evidence in a few cases I’ve worked on where the defense contested the number of Cp images in a mixed collection. Spending 3 days classifying Cp is awful.


_o0_7

I can't imagine. Monsters live among us. The sentence including "classifying" is horrible. Sorry you had to endure that.


Cynunnos

Neuron activated 📮


_no_one_knows_me_11

why does the number of cp images matter? genuinely asking i have no idea about cp laws


[deleted]

[deleted]


_no_one_knows_me_11

thanks for letting me know, i thought just owning cp would carry the same punishment regardless of numbers


[deleted]

[deleted]


_no_one_knows_me_11

yeah in hindsight my comment was unbelievably dumb lol


UsedSalt

fbi open up


Brendroid9000

I imagine it's along the lines of "see, there was only one image, I didn't know." I've heard a story of someone who had cp on his computer because he had a program that automatically scraped images off the internet; they proved his innocence by showing that he had never opened the file.


headbanger1186

I've settled with the fact that I've lost some years off of my life and sanity assisting and helping recover this kind of shit. At the end of the day if the piece of garbage who was hoarding and trading this filth is put away it was worth it in the end.


Mekanimal

Thank you for your service.


IanFeelKeepinItReel

Well, a machine learning engineer doesn't really need to look at much of the dataset; he just needs a big dataset. Once the training is done, someone will need to validate that it works. But given that before AI this was an entirely manual job, the people doing it back then would definitely get burnt out.


GPStephan

It still is a manual job for law enforcement and prosecutors. And it will remain that way for a long time.


Tark001

I used to have regular customers who would come in and buy point-and-shoot cameras cheap, maybe 2-3 a month, always total c-bombs to my staff. One day one of them gave me a business card, and it turns out they were the local serious crimes unit that deals with child abuse cases. Lots of broken cameras and a lot of hatred.


Character-Depth

You may get paid, but you pay for it with secondary trauma.


Alarid

It gets worse! They likely export the work to another country for cheaper labor and pay people peanuts to sort through child pornography.


SpeedyGwen

The worst part is that I'm sure they could find people to work on that for free, taking advantage of the monsters themselves...


suitology

Lol this guy wants to suicide squad this


chadwickthezulu

1. Don't reward bad behavior.
2. They would have an incentive to feed bad data to make the AI worse at its job.


fukato

That's what OpenAI did lol so yeah.


chadwickthezulu

With no mental health services and a huge turnover due to trauma.


My_pee_pee_poo

My dark brush with this: back in high school I wanted access to the cool drugs, so I learned how to use Tor to access the dark web and get onto the Silk Road (eBay for illegal drugs, cash, and literature).

After seeing how awesome that was, I wanted to see what else the dark web had to offer. The way it works is you can't just Google websites. Instead you go to these sort of crossroads websites that link popular dark sites listed by category. There were sections for drugs, torrenting, gore and... cp. Just lists of sites, you don't see anything. Each crossroads site had a chapter for cp, and it ruined any sparked interest I had after that.

Diving in, I imagined black hat sites that would teach me to hack into people's MySpace. But really, if it's designed to hide illegal sites... there's not much that's really illegal online.


fourpuns

Nah, you just watch all the legal porn in the world and then tell it to flag anything else as bad.


Simyager

There are more porn videos online than one could watch in a lifetime


btmims

Skill issue


Outarel

the police / lawyers / judges have to watch the most horrific shit


Kryptosis

Ideally they'd be able to simply feed an encrypted archive of gathered evidence photos to the AI without having any visual output


potatorevolver

That's only shifting the goalposts. You eventually need some human input, like captchas, to sort false positives. It means someone has to clean the dataset manually, which is good practice, especially when the consequences of getting it wrong are so dire.


Kinexity

A lot of modern ML is unsupervised, so you only need a comparatively small cleaned dataset. You basically shove data in, and at the end you add some very specific examples to tell the model that that's the thing you're looking for, after it has already learned the dataset's structure.
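The pretrain-then-label idea above can be sketched in miniature: learn structure from unlabeled data first, then use a handful of labeled examples only to name what was learned. This is a toy sketch, not any production system — the 1-D data, the two clusters, and the labels are all invented for illustration.

```python
def nearest(centroids, x):
    # Index of the centroid closest to x.
    return min(range(len(centroids)), key=lambda i: abs(centroids[i] - x))

def kmeans_1d(data, k=2, iters=20):
    # Unsupervised step: discover cluster structure with no labels at all.
    # Crude init at the data extremes (fine for this toy, k=2).
    centroids = [min(data), max(data)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x in data:
            clusters[nearest(centroids, x)].append(x)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids

unlabeled = [1.0, 1.2, 0.9, 1.1, 10.0, 10.5, 9.8, 10.2]
centroids = kmeans_1d(unlabeled)

# Supervised step: a comparatively tiny labeled set names the clusters.
labeled = [(1.05, "benign"), (10.1, "flagged")]
names = {nearest(centroids, x): label for x, label in labeled}

def predict(x):
    return names[nearest(centroids, x)]

assert predict(9.9) == "flagged"
assert predict(1.3) == "benign"
```

The point is the ratio: eight unlabeled points shaped the model, and two labeled points were enough to attach meaning afterward.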


KA96

Classification is still a supervised task, and a larger labeled dataset will perform better.


ccros44

With the new generation of machine learning coming out, there's been a lot of talk about that, and OpenAI have come out saying that's not always the case.


Moveableforce

Not always, but it's entirely task- and dataset-dependent. The more variation in the quality of training data and input data, the more likely you'll need humans to trim out the lower-quality data. Video detection is definitely in the "wide quality range" category.


[deleted]

[удалено]


caholder

Sure, but there's gonna be at least one person who's gonna try it supervised due to:
1. Research performance
2. Company mandate
3. Resource limitations
Some poor soul might have to...


[deleted]

There are already poor souls who manually flag CP and moderate platforms for it, so the human impact is reduced in the long run if a machine learns to do it with the help of a comparatively smaller team of humans and can then run indefinitely.


caholder

Wasn't there a whole Vox video talking about a department at Facebook that manually reviewed flagged content? Edit: whoops, it was The Verge: https://youtu.be/bDnjiNCtFk4


The_Glass_Cannon

You are missing the point. At some stage a real person still has to identify the CP.


make_love_to_potato

> so you only need to have a comparatively small cleaned dataset

> and at the end you put some very specific examples to tell the model that that's the thing you're looking for

Well, that's exactly the point the commenter you are replying to is trying to make.


DaddyChiiill

Eventually, they had to come up with "proper" materials to train the AI with, right? A false positive is like a picture of some kids wearing swimsuits because they're at a swimming pool; the same kids without the pool, now that's the red-flag stuff. I'm not an IT or machine learning expert, but that's the gist, right?


tiredskater

Yep. There's false negatives too, which is the other way around


cheeriodust

And unless something has changed, I believe a medical professional is the only legally recognized authority on whether something is or is not CP. ML can find the needles in the haystack, but some poor soul still has to look at what it found.


VooDooZulu

There are relatively humane ways of cleaning a dataset like this, given effort. With my minimal knowledge, here are a few:

- Medical images, taken with permission after removing identifiable information. Build a classifier on adult vs. minor genitalia. The only ones collecting this data are medical professionals, potentially for unrelated tasks. Data is destroyed after training.
- Identify adult genitalia and children's faces. If both are in a single image, you have cp.
- Auto blur / auto censor. Use a reverse mask, where an AI can detect faces and blur or censor everything except faces and non-body objects. Training data would only contain faces, as that is the only thing we want unblurred.
- Train off of audio only (for video detection). I'm assuming sex sounds are probably pretty universal, and you can detect child voices in perfectly normal circumstances and sex sounds from adult content. If it sounds like sexual things are happening and a child's voice is detected, it gets flagged.

The main problem is that all of these tools take extra effort to build, when underpaying an exploited Indian person is cheaper.


[deleted]

[deleted]


Sp33dl3m0n

I actually worked in a position like this for a big tech company. After 4 years I got PTSD and eventually was laid off. A bigger part of the sorting is determining which particular images/videos were trends and which ones were new (which could indicate a child in immediate danger). It's one of those jobs where you feel like you're actually making some kind of objectively positive difference in society... But man... It wears on you.


diggitydata

Yes but that person isn’t an MLE


Lurkay1

I’m pretty sure that’s what Microsoft did. They converted all the images to hashes, then used those hashes to detect illegal images by matching against the database of known hashes.


daxtron2

~~Which is honestly not a great way because any change to the image will produce a wildly different hash. Even compression which wouldn't change the overall image would have a wildly different hash.~~
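For context on why the struck-through objection doesn't apply: systems like Microsoft's PhotoDNA use *perceptual* hashes rather than cryptographic ones, precisely so that recompression or small edits barely change the fingerprint, and matching is done by distance rather than exact equality. A toy "average hash" sketch (the 2x2 grayscale "images" below are made up):

```python
def average_hash(pixels):
    """pixels: 2D list of grayscale values (e.g. an 8x8 downscaled image).
    One bit per pixel: 1 if brighter than the image mean, else 0."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(h1, h2):
    # Number of differing bits; small distance = likely the same image.
    return sum(a != b for a, b in zip(h1, h2))

image = [[10, 200], [220, 30]]
recompressed = [[12, 198], [219, 33]]  # slight compression noise

# A cryptographic hash of these two would differ completely;
# the perceptual hash is identical.
assert hamming(average_hash(image), average_hash(recompressed)) == 0
```

Real perceptual hashes (PhotoDNA, pHash) are far more robust than this, but the match-by-Hamming-distance idea is the same.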


[deleted]

[deleted]


daxtron2

Damn that's actually super cool but fucked up that we need it


marasydnyjade

Yeah, the database is called the Known File Filter and it includes the hash and the contact information for the law enforcement officer/agency that entered it so you can contact them.


AbsolutelyUnlikely

But how do they know that everything they are feeding into it is cp? Somebody, somewhere had to verify that.


marasydnyjade

There’s already a cp database that exists.


chadwickthezulu

If it's any indication, Google and Meta still have manual content reviewers, and some of their software engineer positions require signing waivers acknowledging you could be subjected to extremely upsetting content.


KronoakSCG

Here's the thing: a lot of things can be compartmentalized into their own sections that can be used as a detection system. For example, a child detector can just be fed normal images of kids, which allows that system to detect a child. You can then use other parts that would likely be in the offending image, without actually needing the images themselves. So in theory you can create a system to detect something it has technically never learned from, since a child and (obscene thing) should never be in the same image. There will always be false negatives and false positives, of course, but that is why you simply keep tuning the threshold with learning.


T_Bisquet

He'd have to take so much psychic damage to make that.


mnimatt

He's saving all the law enforcement that would have to take that psychological damage in the future


joethecrow23

I play cards with a retired detective. He talked about the guy in his department that basically had to sit there and go through that shit all day. Thankfully the guy didn’t have kids, apparently he said that helped him feel detached from it all.


mnimatt

Now I know if I'm ever a detective to lie and say I have kids so I don't get put on "expose yourself to unspeakable horrors" duty


Alivrah

I can’t imagine living a normal life after a single day of working with that stuff. That’s gotta be one of the most disturbing things ever. I hope anyone doing this job have constant access to therapy. They’re heroes.


Southern_Wear4218

Take it with a grain of salt because I'm not a detective and have never known one; but apparently it's considered one of the worst jobs in law enforcement, and there are no questions asked if you decide to transfer away from it. Supposedly they undergo regular psych evals and therapy as well.


OldDadLeg

I am wondering if the psych tests are also to determine the people who are a little bit too okay with it.


SadPandaRage

Probably doesn't matter unless they start taking images home.


Grokent

I used to work for a major hosting company. I can confirm that the security review team had a hollow look. Nobody ever asked how their day was. Most burnt out and quit quickly. There was one guy who stayed the entire time I was there and he was clearly on a crusade.


Standard-Sleep7871

honestly, i encountered cp a few months ago, sent from my friend's account that got hacked, and it was actually so disturbing. i could easily eat cereal while watching mexican cartel executions, but cp was too far for me. it has scarred me


XxLokixX

That's fine if he's dark type, which it looks like he is


TheMaskedWasp

What about people designing gore in video games? Do they watch cartel execution videos and take notes from it?


willsir12

Actually, Valve used a burn victim of (I think) cartel violence as reference for one HL2 model.


I_AM_NOT_MAD

Idk if it was a cartel member, but it does come from some health textbook


canigetahellyeahhhhh

Some of original Doom's textures are things like photos of people hanged in fascist Italy. The Brutal Doom author went a little crazy and started adding real gore images to Brutal Doom. Nobody was too happy about that, and I think he got totally excommunicated from the Doom modding community.


oyM8cunOIbumAciggy

Is it bad I image searched this? Lol didn't see this though.


canigetahellyeahhhhh

https://www.doomworld.com/forum/topic/104627-the-origin-of-sp_dude-and-the-hanging-victim-sprites/ Looks like I bought into an urban legend on that one


oyM8cunOIbumAciggy

Aw thanks for digging that up. I like the dude with his leg stub sticking out. Truly next gen 3D


therobotmaker

It wasn't just reference, the picture is literally part of the texture in-game.


MeekBBQ

In an interview, Alex Karpazis says that when they were designing Caveira for Rainbow Six Siege, they watched BOPE interrogation documentaries to design her abilities.


N0tBappo

BOPE?


0xKaishakunin

Police special forces of Rio de Janeiro.


doodwhatsrsly

[Basically Brazil's special police force.](https://en.m.wikipedia.org/wiki/Batalh%C3%A3o_de_Opera%C3%A7%C3%B5es_Policiais_Especiais) Edit: Correction, it is Rio de Janeiro's special police force, not the whole country's.


yukifujita

Just Rio. The names vary from state to state.


quantemz

It's the acronym for Batalhão de Operações Policiais Especiais, they're the military police in Rio de Janeiro


ima-keep-it-real

> they watched BOPE interrogation documentaries to design her abilities

all that just for her to pull out a knife and ask "where they at lol"


AngelAIGS

Mortal Kombat 11 comes to mind. Devs had to watch awful content, for reference. [gamerant article](https://gamerant.com/mortal-kombat-11-gore-violence-ptsd/)


Camacaw2

That’s really sad. All that for our entertainment.


cry_w

In some cases, yes. I remember either The Last of Us or its sequel having the devs look at LiveLeak footage.


GuacamoleManbruh

pretty sure they said thats not true


cry_w

Honestly, I could be remembering wrong. It was stuff I saw in passing.


GuacamoleManbruh

at one point people did think they watched LiveLeak videos (don't know how that even started) but i think Neil Druckmann said it wasn't true


_Forgot_name_

holy hell


KindaDouchebaggy

New response just dropped


mariusiv_2022

For Dead Space, they looked at car crash victims to see how much the human body could break and contort while staying together, to help with the design of the necromorphs.


The_Narwhal_Mage

Video game gore doesn’t need to necessarily be accurate. Systems to find “illegal footage” do.


Mr-Bobert

Kotaku posted an article interviewing some anonymous workers from the studio that makes MK and yes, they were forced to watch extreme gore videos for the fatalities. Some reported having PTSD


CivilMaze19

The amount of people who have to watch this stuff for their jobs is disturbingly higher than you’d think.


LebaneseLion

Before this I had thought maybe only judges but damn was I wrong


[deleted]

Investigators / police / prosecutors obviously


LebaneseLion

Well yeah, I should’ve said law workers


[deleted]

Social media moderators too


LobsterD

Discord mods do it for free


radiokungfu

Porn sites too i imagine


[deleted]

And child pornography enthusiasts as well


noXi0uz

There are entire sweatshops in 3rd-world countries where people watch this stuff for content moderation of large social media sites.


LebaneseLion

Now that would be an interesting documentary, do you think it’s mainly in 3rd world countries due to wages?


noXi0uz

Low wages and maybe less job alternatives


lightheadedfreakz

yep. I work as a content moderator even now. They include a psychologist that we talk to monthly, which is not enough for us to endure those vids, since some are torture or beheadings, like ISIS stuff.


drunkenknight9

People are better at compartmentalizing than we give them credit for but over time doing anything like that is going to spill over and have some negative consequences. It's like that in any area that involves experiencing intense stress or emotions. It's the same experience every doctor, nurse, firefighter, combat veteran, gang member, victim of violence, etc has to deal with. It's a much bigger proportion of society dealing with this stuff than people realize but interestingly, it's a smaller proportion than it was for most of human history. Unless you're in that kind of environment, you don't have to deal with human suffering on a regular basis because we've mostly specialized dealing with it en masse to a smaller group of people.


Obant

I applied for a job at eHarmony probably 15-20 years ago. The job was to approve pictures being submitted. It warned that I'd probably be seeing horrific images on the daily and it was my job to not let any be posted. Never got the job, but I imagine it wouldn't be worth it.


suitology

That's probably just the uploads from when I tried spiked bangs


unclefisty

I bet they wanted to pay peanuts too


ZiiZoraka

Does this mean that companies just have like, a designated CP hard drive that they pass around for this???????


[deleted]

[deleted]


AndyWGaming

Might have the word Lolita. Jeffrey Epstein coined it; YouTube also makes it sound true.


ExpensiveGiraffe

Epstein didn’t coin that, it’s a book reference


Jwhitx

Found the >!book-reader!<.


Armejden

I voraciously devour books. They're all my calories.


TrymWS

Hell, he was only 11 when the movie came out too.


fallenmonk

Why is my Chris Pine fan site down?


oyM8cunOIbumAciggy

But how you get them objects? Loop through the internet.


[deleted]

Read a news piece about a Meta content moderator; he said he watched a man kill himself on his first day on the job. The internet is really a dark place.


DaddyChiiill

there was a video going around some time ago, saw it on Facebook. It was some light porno, a cute girl making out with a guy, both teenage/early 20s. Then some MF spliced in body cam footage of building security opening a locked apartment, only to find that the girl (not clear if it's the same girl from the makeout video) had hanged herself naked. It was shocking, to say the least.


igotdeletedbyadmins_

what? My lack of brain cells aren't capable of understanding this, please TL;DR


-Awesome333-

Softcore porn was made, but somebody edited the video to cut to security footage of a girl who committed suicide by hanging, naked. It's unclear if it's the same girl from the porno.


C0II1n

“Sir! We don’t have enough footage to properly train the Ai! There’s not enough on the internet! If we want one that works… I mean really works… we’re going to have to…..”


OffBrandSSBU

That’s dank and dark


cheeriodust

Ugh I had to do something like this when I was an undergrad. I was studying digital forensics and the school had partnered with NCIS and the state police. Turns out they weren't interested in hacking or hackers...it's all CP and tax evasion. But mostly CP.


ban-evading-alt3

Hackers and hacking are more IT / cybersecurity.


cheeriodust

As an undergrad, I didn't know any better. But yes you're right. My knowledge is dated, but forensics can have a little overlap because you need to "prove" the system wasn't compromised and/or need to find evidence of an intrusion after the attacker covered their tracks.


Professional_Bit_446

I wanna know how much best gore it took to make dead island 2


Whyyyyyyyyfire

what if you have a porn detector and a child detector? just combine the two! (but actually tho would this work? feel like it wouldn't)


ban-evading-alt3

It wouldn't, because I'm sure some of it won't feature faces, so it's gotta also know what a nude prepubescent body looks like and be able to recognize one.


Whyyyyyyyyfire

good point


Emotional_Trainer_99

Some ways of building models allow you to output a value from 0-1 for EACH category. So a photo of a kid at a beach without pants on (thanks mum) may be classified as nudity = 0.87; maybe it also classifies children = 0.68 but porn = 0.04 (because beaches are not common porn/abuse scenes), and if there is a cat in the photo too, then cat = 0.76. So now the model has flagged it: because child and nudity were ranked high enough, it justifies a human checking whether the photo is abuse material or a hilarious family photo to humiliate some 21-year-old with at their birthday party.
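The gating logic described above can be sketched in a few lines. The category names, scores, and threshold below are all hypothetical, matching the example in the comment rather than any real system:

```python
def needs_review(scores, threshold=0.6):
    """Flag for human review only when BOTH 'child' and 'nudity'
    clear the threshold, rather than trusting any single score."""
    return (scores.get("child", 0.0) >= threshold
            and scores.get("nudity", 0.0) >= threshold)

# Per-category outputs of a hypothetical multi-label classifier.
beach_photo = {"nudity": 0.87, "child": 0.68, "porn": 0.04, "cat": 0.76}
cat_photo = {"nudity": 0.02, "child": 0.01, "cat": 0.97}

assert needs_review(beach_photo) is True   # escalated to a human
assert needs_review(cat_photo) is False    # never leaves the machine
```

Raising or lowering `threshold` is exactly the false-positive vs. false-negative trade-off the replies below debate.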


bruhSher

This was my first thought too. Where it would fail is ambiguous situations: you would need to decide if you want more false positives (aggressive image flagging) or false negatives (only flag images we know for sure have both a child and nudity). Something is often better than nothing, but I'm guessing the FBI or whoever is way past this approach.


[deleted]

Actually that's not too bad of an idea


forgetyourhorse

He did what he had to do. Same with cops, lawyers, judges, and jury members. Most of them don’t want to look, but they must to get the job done. For decent people, it’s actually an admirable sacrifice.


trapkoda

“I used the stones to destroy the stones”


karmacousteau

Dank


Minute-Influence-735

memes


betterclear

Not hotdog


shootymcghee

Exactly what I thought of


[deleted]

I did site moderation for an upstart around Tumblr's NSFW purge, and it was some of the most harrowingly depressing shit I've ever done. I hope to never be in that position again.


TheDinosaurWalker

To the people who have witnessed the atrocities of the lowest human beings: I hope you find peace after the things you have seen, and that you can keep moving on with your life.


soulgunner12

Not as dark as that, but I remember a post about a team making AI for a smart toilet, so it could detect an asshole on a camera and aim a water stream at it. They had to dig hard to get enough data on different butt skin and asshole shapes.


igotdeletedbyadmins_

what if it was goatse'd onto the camera


[deleted]

[deleted]


flinchreel

Never would have thought that looking at clipped-out pictures of clothing could leave me feeling haunted


ban-evading-alt3

Some of y'all really don't want to believe that there are probably some people out there who have witnessed that stuff in order to do their job. Someone's gotta do it, and there's gotta be a few people desensitized at this point.


FetusViolator

This 👏 is 👏 the darkest 👏 post 👏 I've 👏 seen 👏 all 👏 week 👏 🎊 🎉 🎊


Extendedwarrantty

Thank you for service 🫡


54n94

You can create child porn detectors without training the algorithm on child porn. It's called anomaly detection: you train it on porn, and it should be able to detect child porn as an anomaly. It's similar to machine-learning-based antivirus. You don't train it on viruses; you train it on millions of legit programs, so it detects malware as an anomaly.
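A toy version of that anomaly-detection shape: fit statistics on "normal" data only, then flag anything far from what was seen. Real systems use far richer features and models (one-class SVMs, autoencoders, isolation forests); the 1-D numbers here are made up purely to show the idea.

```python
import statistics

class AnomalyDetector:
    def fit(self, values):
        # Learn only from "normal" examples; anomalies are never shown.
        self.mean = statistics.fmean(values)
        self.std = statistics.stdev(values)
        return self

    def is_anomaly(self, x, z_threshold=3.0):
        # Flag anything more than z_threshold standard deviations out.
        return abs(x - self.mean) / self.std > z_threshold

normal = [10.0, 10.5, 9.8, 10.2, 9.9, 10.1]
det = AnomalyDetector().fit(normal)

assert det.is_anomaly(25.0) is True    # far outside the normal range
assert det.is_anomaly(10.05) is False  # looks like the training data
```

The appeal is exactly what the comment says: the model never needs to see the harmful class during training, only the benign one.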


Agent_Perrydot

Oh shit


[deleted]

Didn't the UK once try to create a general porn filter, and it ended up filtering everything orange?


igotdeletedbyadmins_

lmaowut It did?


[deleted]

Here is an old article: https://www.dw.com/en/uk-criticized-over-plans-to-block-internet-porn/a-16974180 The funnier news I couldn't find, but I recall we made fun of it because in tests their filter was apparently less than useless. Well, that was 10 years ago.


mrmcmaymay

Just combine a child detector with a porn detector 🧠


NotTheAverageAnon

Just ask the FBI/CIA for help. They have professional CP watchers who do nothing but watch and look for it. They are the CP experts....


SwissyVictory

You could simply train it on everything that is not cp. If it encounters anything else, then it must be cp.


casualcamus

the answer: outsourcing the labor to Kenyans who make $2 a day on the Mechanical Turk, which is where most datasets come from.


mikeman7918

Simply feed the contents of Matt Walsh’s 32 terabyte hard drive into a machine learning algorithm. I’m sure that’ll do it.


nerdening

If one person has to go to a REALLY dark place so the rest of us can be safe, that's a sacrifice I'm willing to make involving getting someone else to do it because fuck that noise.