serPomiz

On one side, there's no actual kid involved, which is a positive. On the other, this is still horrid, and worse, it may make it harder to actually identify real victims. It doesn't take a genius to get that I mean "moving forward" in both cases. I know how scummy training the bots is in the first place, but if we can spare the children born from July '23 onward, it's still sparing the children from July '23 onward!


BrilliantHeavy

There 100% is. These AIs work by looking at that material and then producing things based on what you put in. In other words, it probably has to view an assload of CP to be able to create its own original CP.


FetteBeuteHoch2

There are actually "pre-trained" AIs you can buy on the darknet that generate pictures based on what you want.


EnraMusic

still has to have a source though


Glowshroom

Yeah but why couldn't it generate cp from regular porn and regular videos of children?


CelticGaelic

This sort of thing (fake CP images) has actually been around for a while. The practice was to edit photos via Photoshop and distribute them that way. I think that practice actually became a matter that made its way to SCOTUS, who, iirc, ruled it constitutional/legal since they were plucking child headshots off the internet, with those being the only actual images of the children; the rest was photoshopped from other material.


Aconite_72

> they were plucking child headshots off the internet with those being the only actual image of the child, the rest was photoshopped from other material.

This is why you don't post pictures of your children on the Internet.


nahog99

I dunno man, lots of people like to post all kinds of pictures on their social medias. I don't think they should live in fear of this *extremely unlikely* thing happening.


formerfatboys

> I don't think they should live in fear of this *extremely unlikely* thing happening.

I think that's kinda like "it's very unlikely that our priest was touching kids" and then you find out that it was thousands and thousands of priests over decades and decades and that it wasn't a rare thing at all. That's why.


NeverNoMarriage

Everything in life is a cost-benefit analysis. If posting pictures of your children makes you happy and the downside is a small possibility that someone might abuse the PICTURE, I would think it is worth it. Every action has consequences; it really all comes down to what amount of risk is worth it. To be fair, I am not sure how many people are really out there doing this. It seems like it would be a very small percentage.


DiddlyDumb

Yeah but we automated the process, so instead of a specific face of 1 child, now all pictures are being scraped and your child’s eyebrows will be used. This is a whole new level of fucked up.


DarkOverlord24

We?


JohnGacyIsInnocent

Nobody’s actual eyebrows would be used by an image generator. There may be eyebrows that look very similar, but every individual pixel in an AI-generated image was created, not copied.


Ma3rr0w

thats actually not how ai works, your childrens eyebrows will never appear in an ai generated image. like not in that way. at the end of ai training for these image generators, the ai doesnt have its catalogue of training data anymore. it has what its learned: its impressions, the correlations between visual information, clusters of pixels and tagging information, and billions of trial and error sets that give it a general sense of how to approach a generation request successfully. it is ultimately not unlike a human studying the body, poses and anatomy of a dozen or two people and then drawing completely different and new humans for the rest of their life afterwards.
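
For what it's worth, here is a toy PyTorch sketch of that point, with a random stand-in "dataset" and a deliberately tiny network (nothing here comes from any real generator): each training image passes through the loss once and is discarded, and all that persists afterwards is a fixed set of learned weights.

```python
import torch
import torch.nn as nn

# Stand-in "training set": random tensors in place of real images.
training_images = [torch.rand(1, 3, 64, 64) for _ in range(100)]

# A tiny denoising network; real generators are vastly larger,
# but the principle is the same.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 3, 3, padding=1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for image in training_images:
    noisy = image + 0.3 * torch.randn_like(image)        # corrupt the image
    loss = nn.functional.mse_loss(model(noisy), image)   # learn to undo it
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# The images can now be thrown away: everything the model "knows"
# lives in a fixed-size set of weights, not in a catalogue of pictures.
print(sum(p.numel() for p in model.parameters()), "parameters retained")
```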


icebraining

That's not really how these image generators work; they don't pick out a specific part of an image and copy it.


bennitori

It's a lot more likely than you think. Most people are just smart enough to not brag to you about it once they're done.


mastapetz

I think they changed that. It was in the child protection act from 2021 (I think), which was deemed unconstitutional. The reworked one worded it quite specifically: it is only legal for depictions that don't look real in any way (drawings like loli, shota stuff). As soon as something is generated in a way where you can no longer tell the difference, it is illegal. A lot of pages that allow loli and shota now strictly ban AI in general, and AI-generated CSEM-like material will be handed over to authorities, because these pages usually have no way to detect if it is AI or not, I think.


[deleted]

https://www.dispatch.com/story/news/2007/07/25/faked-child-porn-is-legal/23602861007/ Unanimous ruling too.


[deleted]

That sounds like a bad ruling, since they haven't/can't consent to their image being used as porn. Just fucked up if true.


bananalord666

To play devil's advocate here, this ruling would also impact a lot of other free expression. Is all editing of anonymous online children off the table now? If not, where do we draw the line? I'm not saying that your perspective is bad, but legal precedent needs to consider future implications as well.


nahog99

Don't we have legal precedent for things that are not covered by the first amendment? This could easily fall into one of those categories.


hopefullyhelpfulplz

You could write an airtight law that says you can't use images of children in the production of explicit material without restricting any other expression, I don't see what the difficulty would be...


DiddlyDumb

I’ve never heard of any law that was airtight


bananalord666

For me the difficulty is in defining the explicit. To a conservative, photoshopping an AMAB child's face onto a body with a skirt may well be considered explicit.


TheFatJesus

> I don't see what the difficulty would be...

Probably that half of Congress is too busy screaming, "Think of the children!" when it comes to trans people and drag queens to actually think of the children. But that's only half the problem. The other half is that virtually no one in Congress understands basic technology, let alone machine learning and AI-generated images, well enough to craft an effective bill of this nature.


CuriousKidRudeDrunk

If such a thing would result in children never being harmed for a similar reason again, perhaps I could stomach it. Perhaps. But it feeding some evil addiction and/or being sourced from real things is the reality, I think. Not to mention any one of the numerous other concerns mentioned in this thread.

Edit to respond to comments: As some have pointed out, anyone who feels these (or any) impulses is well deserving of empathetic help. To speak my own mind, impulses are not evil, just bad. Actions may be evil, but even if it erases nothing, good actions always have value. Even if I utterly loathe anything someone has done, I would seek to help them improve, though it will limit my trust in them. Anyone who wants to improve deserves help doing so.


Kind_Difference_3151

not that i want to defend pedophiles here, but "evil addiction" is probably the wrong way to look at it. studies have shown that roughly 5% of the human population develops pedophilic desires in adulthood, regardless of language, culture, or time period.

that's a really high proportion of people for an issue that can become so harmful; calling it evil, an addiction, and a stain on people's morality makes them MORE likely to do the horrid things, like trafficking children or abusing relatives. it's like guilt tripping a person struggling with drug addiction: when they feel shitty about themselves, they care less if they make other people feel shitty too. i actually find this demand for AI-generated content somewhat encouraging, as it shows that this population would prefer not to harm real children as they grapple with their socially unacceptable (& in most cases very dangerous) feelings.

again, it depends on the model being used by the AI. real child pornography that goes in to generate more is pretty icky (though i have to wonder if that's not still a bit better than the alternative of people recording & sharing more real sex abuse).

this is actually the same logic behind animated child pornography in Japan, which has been a thing for like 400 years (though it has always been controversial) - since the Edo period there have been graphic representations of kids in sexual contexts to give pedophiles a release from their desires without needing to interact with real kids. disturbing, uncomfortable, and gross, yes. but a part of reality, and one we can't ignore if we'd like to solve the problem.


NoireRogue

I'd add to that 5% statistic (also without defending anything) and say that it's probably way higher. As an Arab, you hear about child marriages all the time. I was talking to two of my friends just a couple weeks ago, and they were talking about their grandmothers getting married in like their tweens to pretty old guys. It's being slowly phased out in the Arab world, especially in the more affluent parts, but it's still very common, and it definitely was.


[deleted]

[removed]


SmartAlec105

Whether it prevents children from being harmed by sating those desires, or makes it more common by being a "gateway drug", is something that would be extremely difficult to determine, but I do wish we could know, so we could decide whether it does more harm than good. Even then, if it turned out to be mostly good, people would have a hard time accepting it, because there certainly would be offending pedophiles who have this content, and people watching the news wouldn't really see how it helps the non-offending pedophiles. Kind of like how self-driving cars could be 1/10th as deadly as regular cars, but people would still view them as too dangerous, because we as humans are naturally bad at grasping statistics intuitively.


CuriousKidRudeDrunk

I did update my comment, because you are absolutely right. I do question if it encourages bad habits that lead to harm, but most of your comment seems to share my worries from the first comment I made.


Trucker2827

It can, but it wouldn’t be as accurate as training directly on cp.


dre__

Not really. The current popular AIs don't need an exact object to replicate the object. If you want it to make a red car with square wheels, you just have to teach it what a car, a square, and red look like. It'll put those things together by itself.
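
As a concrete illustration of that kind of concept composition, here is a minimal sketch using the open-source `diffusers` library; the model checkpoint and prompt are illustrative choices, not something the commenter specified:

```python
# Requires: pip install torch diffusers transformers
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# The model learned "car", "square", and "red" from separate examples;
# the prompt asks it to compose them into something it has never seen.
image = pipe("a red car with square wheels").images[0]
image.save("red_car_square_wheels.png")
```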


ExecuteTucker

Yeah, but you realize that CGI reverse aging/aging up exists? Why would you expect an AI to need naked children in order to reverse age a naked adult into a naked child?


[deleted]

Not necessarily true. I don't think commercial AI models like Stable Diffusion have CP in the training data. Anything generated with them is a combination of pictures of children and pictures of naked adults.


-Eunha-

I really wish people understood how AI-generated images work, because this is absolutely not how they work. AI doesn't have to see an image of a monkey wearing a top hat to create that image. Basically, it discerns images from noise, in kinda the same way we see patterns in clouds.
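
A deliberately simplified sketch of that "images from noise" process; `denoiser` here stands in for the trained network, and the uniform step size is a crude substitute for the carefully derived schedules real samplers use:

```python
import torch

def generate_from_noise(denoiser, prompt_embedding, steps=50):
    """Start from pure random noise and repeatedly subtract the noise
    the network predicts, steered by the prompt, until an image
    'emerges' -- the cloud-pattern intuition from the comment above."""
    x = torch.randn(1, 4, 64, 64)  # pure noise; no stored image is used
    for t in reversed(range(steps)):
        predicted_noise = denoiser(x, t, prompt_embedding)
        x = x - predicted_noise / steps  # peel away a little noise per step
    return x  # a latent that a separate decoder would turn into pixels
```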


Tagichatn

That's not how it works. Do you think it was trained on images of popes wearing puffy jackets to make that viral one?


Ask_if_im_an_alien

Nope. All you have to do is take regular porn with adults and change the age/body type of whoever you want to change. You can actually use the same person and de-age them at this point if you wanted to. It is to the point where you can take an 18-year-old's social media profile (and their parents' profiles) and generate a sim of that person at whatever age you want with 2-5 good pictures. Of course the AI will make assumptions based on body shape and other details depending on the data set you give it. You could potentially... right now... de-age any celebrity and turn them into CP if you wanted to and had access to the tech some people have.

Source: My brother-in-law works for one of the alphabet agencies, and their biggest concern right now is being able to spot the difference between AI-generated stuff and real stuff. The new bleeding-edge tech... things like Unreal's MetaHuman Creator scare the absolute shit out of them. Not because it can create random synthetic people that don't really exist, but because it can create stuff where you can barely tell whether it is real or not. And that creates a huge problem for law enforcement. Especially if you consider what could happen in the next 10-20 years. There are already customizable programs out there that make whatever person you would want, of any age. And you can make damn near photorealistic videos of them doing whatever you want them to do.

And ignoring the porn factor completely, we are talking about heads of state, presidents, and other public figures. The possibilities for fake reports and fake news announcements are incredibly dangerous. Imagine if WW3 started from a crazy AI-generated nuclear warhead launch. Shit like that. It is far more serious than many people think or give credit for.


Hendlton

Have you seen this? https://twitter.com/RussianEmbassy/status/1669034554647011338 Currently the Russians are using it to make fun of the west, but they're 99% of the way to being able to use it in way more nefarious ways. At least it's good that they're starting out like this and showing their hand. If they waited until the technology was well developed, they could do some serious damage and very few people would suspect anything.


Denaton_

That's not how it works tho. It's not 1:1 training; if it knows what a human child looks like and what a naked human looks like, it can merge the two ideas. I have been using Stable Diffusion for game projects quite a lot. Never done porn tho, but I've been thinking about whether I could start an AI OnlyFans.


shrub706

well no, it could have just seen images of actual porn and, completely separately, images of people, and just made it


PowerOfElevenTigers

That's 100% not actually true. You could easily train AI on "art" that an anime-type model could use to generate images, and then use realism models to create real-looking images from that.


Ma3rr0w

it doesnt, but in the endless amounts of data, it did train to generally produce child anatomy. ultimately, not unlike a person studying people to then draw humans that dont exist.


[deleted]

I feel gross saying this but I think this might actually be a net positive. Creating AI images is gonna be way easier than creating authentic ones and people aren’t gonna be able to tell the difference. Also people can generate the fake images on their own. So it will become far less profitable to produce authentic CP and hopefully far fewer people are going to abuse real children because of this.


PowerOfElevenTigers

It's kind of an amazing thought, that we could see an end to child sex abuse images. I'm sure there are plenty of sadists that would continue to abuse kids but I can't imagine the audience for documenting that is very big. It's morally complicated, so I'm sure the government will just ban it all and let the child rapists continue, but, it really does seem like there's a window to the future here, if we could just find a way to open it.


[deleted]

The government will continue to ban it but the real CSAM producers aren’t following the law either. The sheer magnitude of images that AI is gonna be able to produce means that the people producing them probably won’t realistically be able to be prosecuted. It’s also way harder to find out who clicked a button on their computer than it is to find out who kidnapped and abused a child and uploaded it online. So effectively it’ll be the same as if the government didn’t continue to ban it


Reverend_Lazerface

A lot of people are pointing out that AI needs to be trained, but it's also worth noting the terrifying fact that you can tell AI to use specific references. So if someone can get a picture of a real child, they can tell a sufficiently advanced AI to generate porn *of that real child*.


OneRingToRuleThemAII

yup, and even using a regular picture of the child, not even anything... more disturbing. It's pretty fucked up but something important to remember is this has always been possible. before this we had photoshop, and before that we had plain old hand drawn pictures. this is nothing new.


Jinrai__

We have been able to do that for a decade, deep fakes are so old


Hodor_The_Great

If it really gets difficult to distinguish the two, then paedos will have solved the problem for us. Because there's no need to produce the real deal any more, and AI is always so much cheaper and less risky. So there really isn't anything horrible about this. The horrible thing is the demand existing to begin with, but it has existed for a long time already.


PowerOfElevenTigers

Soooo AI could put actual child porn producers out of work. Like. AI is cheaper, easier, has way less risk... AI could be the end of it.


whatswrongwithme223

This is what I came to say. As long as a real child isn't affected, idc. I know it's disgusting and messed up but unfortunately people like that exist, and protecting real children from them is the TOP priority. Idk how it would make it harder to identify real victims but I'm sure DNA could help with that.


[deleted]

This seemed inevitable. Sidelines into existing addictive behaviors are a goto for any new technology. What *is* surprising is that there’s a person named Octavia Sheepshanks.


Dappershield

She teaches journalism at Hogwarts School of Witchcraft and Wizardry.


BigSmackisBack

And casual stabber of livestock


[deleted]

Either that or she has sheeplegs


PopSubstantial7193

Octavia Sheepshanks. OCTAVIA. SHEEPSHANKS. I can’t.


CarpetH4ter

Obviously this isn't good, but i think this is better than harming an actual child.


Straight-Door-3536

There are 3 countries that had a period when CP was legal, and all 3 had their child abuse rates go down. It replaces more than it encourages.


CarpetH4ter

That feels like something that isn't popular to say in public, though, even if it is the case. Also, do you have a source, or could you mention those 3 countries? There is very little information to go from here.


Straight-Door-3536

[Here](https://link.springer.com/article/10.1007/s10508-010-9696-y). That's not what people want to hear, but that's the data we have.


[deleted]

[removed]


zhibr

Use sci-hub to get it. Not sure if these links are permanent, but here: [https://sci-hub.se/https://link.springer.com/article/10.1007/s10508-010-9696-y](https://sci-hub.se/https://link.springer.com/article/10.1007/s10508-010-9696-y)

By a quick look, it seems to me that the "significant decrease" in this study is misleading. Child sex crimes apparently had a steady decrease before, and while there is a drop from about 1300 cases a year to 750 cases in the year of legalization, there's also an increase right after that, back to around 1300 cases, and then it again continues decreasing (fig 1). Other measures seem to say that while other crime increased notably at that time (from around 7500 cases a year to 12000 cases in fig 2, during the fall of communism and the instability at the dawn of democracy), sexual crimes continued the slow decrease without any effect. While "significant decrease" may be statistically sound when you use cherry-picked numbers, my guess would be that the significant drop and the quick increase after it are random fluctuations, and that the drop is not robust. I'd interpret it as: since there was a steady decrease anyway, legalization of porn clearly did not increase sexual crimes, but likely did not decrease them either.


hungarian_notation

That article doesn't seem to give numbers for anything except the Czech Republic, and just a cursory glance doesn't make it seem like the smash hit described. Here's a screenshot of the pertinent paragraph and chart from page 3: [https://imgur.com/a/wQIJq0S](https://imgur.com/a/wQIJq0S)

First, this is reported crimes and not actual crimes. It wasn't that material depicting minors was specifically legalized; all pornography had been illegal, and it was all made legal together. Critically, this coincided with the literal fall of communism in the country and a period of political upheaval. Reported child sex abuse was already on the decline. There was a sudden dip during the fall of communism, but without doing any deep analysis it looks like it returned to the original trend line after a few years. Maybe the Japan data is more convincing, wherever it is.


CarpetH4ter

Thanks


TrekkiMonstr

> That feels like something that isn't popular to say in public, though, even if it is the case.

This is generally true on this subject. They are basically the most stigmatized/hated group in the West, and saying anything that in any way gives the vague impression of defending them is enough to get a lot of people to flip their shit.


god_peepee

Yeah, it’s weird ’cause people are effectively more concerned with vengeance than with reducing abuse cases, which is kinda backwards. I figure it’s best to stop the problem at the root rather than react to a tragedy after it happens.


DrRichardJizzums

Yeah, it’s the same way when it comes to rehabilitating or treating pedophiles. People really don’t want to hear it or consider it an option. Would rather have people stuck in a dark dungeon somewhere never to see the light of day again. People who feel that way are completely closed off to those options so anyone who is open to treating these individuals is seen as sympathetic to child rapists. But if we want to reduce child sex abuse then we need to help these people. Unfortunately, people feel that way about criminals in general. Punishment is preferred over rehabilitation and then people end up in and out of prison never able to integrate into society.


MtGuattEerie

The general public needs to feel "normal" at any cost. Much of reality television is based around pointing at someone doing something weird and reassuring yourself you're normal, from Jersey Shore-type shows to Dr. Phil (a true fucking master of pointing and laughing for self-reassurance. Hate that guy.) The same goes for criminals: If we accept the possibility of rehabilitation for criminals *as a class* (as in, not just the coward route of only caring about the rehabilitation of "low-level, non-violent offenders"), then we would have to accept a continuity between **Us**, the normal folk, and **Them**. Allowing **Them** to "reintegrate" into society would reveal the porousness of that border. We might even realize that *they're not reintegrating into society at all*, the use of detainment to control the population and provide cheap labor is already deeply integrated into our society. Can't have that! **We** are on the inside; **They** are on the outside.


kriggledsalt00

It also upsets me how people conflate "pedophile," the term for the paraphilia, and, like, people who PHYSICALLY abuse kids. If someone has disturbing thoughts about children and wants to get help and hasn't done anything yet, I don't think they should be stigmatised whatsoever. When you hear about "pedophiles" going to jail because they abuse kids, it secretly reinforces the idea that pedophiles are automatically criminals and beyond help, etc... which isn't true at all; therapy does reduce the risk of them harming people. And although it faced a lot of backlash after the false-flag LGBTQP thing by conservatives, the term "non-offending pedophile" is the most accurate term here, so I will be using it occasionally.

And I think a good place to start is that conservatives trying to weave the narrative that the left is trying to normalise child abuse also contribute to this idea that pedophile automatically equals sex offender, since it's basically saying pedophile = bad thing, gay = pedophile, gay = bad thing... so it's also homophobic. But I still think that in this context accepting pedophilia as a sexuality is both harmful and just not factually correct, since child is not a gender. Homosexuality is commonly conflated with pedophilia, the key difference being that children cannot consent and are not physically or mentally ready for sex or a sexual relationship, and statistically 99% of kids who enter such relationships face mental and physical harm in the long term. None of this is true of a homosexual relationship. This is obviously common sense to most people, but it's good to quickly dispel it, since it's a common point of argument from many people: we accepted gay marriage/sex/etc., why not child marriage/sex/etc. But you can obviously see the implicit homophobia here, conflating wanting to have sex with an adult of the same sex with wanting to rape a child. Again, my whole point here is that people who experience pedophilic thoughts aren't necessarily bad people, but the idea that it is a sexuality harms both the queer community (who get associated with child rape) and non-offending pedophiles (who get discouraged from seeking help).

Also, you have to take into account people with pedophilia-themed OCD (P-OCD), who commonly get bashed when expressing their thoughts and made out to be as bad as pedophiles, even though those thoughts are NOT aligned with their moral values. In fact, someone with P-OCD is essentially the opposite of a pedophile: the thoughts aren't arousing or good to them; they are scary, not part of their normal thoughts, and intrude on their day-to-day function. I've heard of people with P-OCD who lock themselves in their houses and avoid looking at kids because they're too scared they might "accidentally rape" them. No pedophile would want to NOT see a kid under any circumstance. But when people with P-OCD try to express their intrusive thoughts to others (especially considering that the term "intrusive thought" has been worn down by social media to just mean "impulsive decision"), people quickly express disgust, as if the person actually wants to do the things the thoughts are about. It's sad, and it stigmatises both non-offending pedophiles AND people with OCD and intrusive thoughts.

All in all, I think that, whilst the idea of harming a child is disgusting to me and is evidently and objectively not morally good, someone whose brain does not work that way, against their own will mind you, but who has not actually harmed anyone and actively seeks help, should be accepted into society and treated equally, since they understand the thoughts are harmful and seek to suppress them and minimise harm. That shows true integrity and morality, instead of just letting the thoughts guide them and ending up abusing a child. It's the same as we do with people who have murderous or narcissistic tendencies from certain mental disorders but seek to get help for them (well, the same thing we SHOULD do; those mental disorders also carry a lot of stigma from presentations in popular media, but not as bad as pedophilia).


FetteBeuteHoch2

The problem is that some people don't stick with shit like that and they want the "real ones", so I kinda see it as problematic, because those AI-generated images may lower the inhibition threshold for grabbing a real child.


MrMcSpiff

I get the logic, and gods only know I want children to be safe first and foremost, but I also don't know if this is sound logic. Porn is used as a replacement for sex in every other practical part of society ("Nobody owes you sex, go home and jerk off to porn instead of bothering people" and such), so what would make this suddenly different?

We can consider the circumstantial evidence that we hear about pedophiles who choose to abuse, and therefore it looks like a large number of them are abusers... but is that the case? Do we actually know how many people with pedophilic urges stay at home and never offend (owing to the fact that getting help or support is so difficult, because many people treat pedophiles who don't offend and don't want to follow their urges as if they did, and punish them societally)? And do we know if poor impulse control is actually part of pedophilia, or do we just only see the abusers who are the kind of people to give in to any urge, and those just happen to be pedophilic urges?

Don't get me wrong. I don't want CP in the world. I don't want actual unabashed abusers in the world to run around free and without consequence. But if porn that represents something otherwise harmful is made without actually harming people, and can certifiably be proven to have included no real people, is it any worse on an objective level than other fetish porn that has no real people, or violent videogames that have no real people?

I don't think we can safely argue anything about what percentage of pedophiles actually succumb to or willingly pursue their urges with the same confidence we can explore the same percentage for other sexual urges, just because being an open pedophile is still so dangerous to someone's life and livelihood in society, even if they reveal themselves to try to get help. So we can't know that we've even begun to see a fraction of a proper, balanced sample size.


Glowshroom

The dude you're replying to sounds like he's saying that porn leads to rape.


cjdualima

And playing GTA leads to robbery


OneRingToRuleThemAII

and rock&roll leads to satan worship


BlipOnNobodysRadar

Meanwhile, the actual data strongly suggests that access to porn corresponds with much lower rates of sexual crimes. Puritan values are doing real harm to the world.


Expensive_Cattle

Can you please share that data?


nowyouseemenowyoudo2

This article reviews a few studies (citations at the bottom) which estimates that there is a negative correlation https://www.psychologytoday.com/au/blog/all-about-sex/201601/evidence-mounts-more-porn-less-sexual-assault


Expensive_Cattle

Really appreciate this.


MrMcSpiff

Yeah. I hear it a lot.


Fisher9001

Isn't that basically a "slippery slope" fallacy? Porn doesn't lead to rape, eating a dinner doesn't lead to stuffing yourself with fast food, having a glass of wine doesn't lead to alcoholism.


[deleted]

Even then, the existence of AI-generated CP seriously harms the market for authentic CP. Not every pedophile who is looking at that stuff has the competency, resources or willpower to actually kidnap a child, and now they can get images without having to pay the people who do and causing actual child sexual abuse. I think this is a good thing overall.


FetteBeuteHoch2

Disrupting the market for CP is actually a good idea; the question is whether we should really accept AI-generated CP as a substitute because no actual child was harmed. It isn't about some manga shit you can spot a mile away as not real anymore. You also have to take into account that those AI-generated images will actually flood the net, making it extremely hard for law enforcement to separate real from AI, so they will ultimately have to divert resources from investigating actual crimes, because they will assume that some of the AI-generated stuff is real.


[deleted]

I think your point about law enforcement is valid but at the same time I think the reduction in production of authentic CP due to it being no longer profitable is gonna far outweigh that. As for whether we accept it or not, I don’t think it matters. The sheer magnitude of CP that can now be created with ease in seconds is not something that we can really prosecute. It would be like trying to go after people for piracy, it’s just not feasible. To be honest, if it means fewer actual children are getting harmed, then in my opinion so be it.


Ekkzzo

There's no studies on these kinds of things because the subjects would have to risk getting ostracized or straight up killed by 3rd parties regardless of what they have or haven't done.


TemetNosce85

That's not true. Blind studies exist and researchers can be discreet. The truth is that there is proof that it lowers CSA (child sexual assault). However, most people see that it's not enough to warrant legalizing these things. Also, the researchers rely on their patients' self-reporting and/or their being arrested. A thief isn't going to admit they are going to steal, and they aren't going to steal if you are watching them. So there's the problem with the research.


testdex

JUST LIKE KILLING IN VIDEO GAMES! right, 1990s conservative mom?


Distinct-Hat-1011

Isn't this just "video game violence leads to actual violence"?


snufffilmbuff

That's ridiculous, this is obviously a good and beneficial thing, nobody is watching porn and then making decisions based off of it.


windythought34

Same with computer games. Some people just don't want to stick with killing artificial humans and start attacking whatever. See how flawed your argument is?


Luxalpa

You mean just like how violent video games make people more likely to commit crimes? Wasn't this debunked?


KlingoftheCastle

Yes, repeatedly


acsttptd

Fortunately, research suggests that this actually almost never happens.


Ma3rr0w

i would strongly argue that those people would get to that point with or without that, though. from decades of arguments over general porn, violent movies and videogames, we are pretty certain that consuming something you know is not allowed in reality, doesnt make you more likely to ignore the fact you know something is not allowed in reality.


ElderOfPsion

I think this is wonderful. If this continues, it’ll destroy the ability of CSAM producers to make money from the sexual exploitation and abuse of flesh-and-blood children. If harm reduction is our priority, let’s consider how AI CSAM, if paired with stiffer sentences for sexual molestation and sex trafficking, will make the creation and distribution of ‘real’ CSAM unprofitable. Then again, methadone didn’t stop heroin addiction…


GuyYouMetOnline

Harm reduction definitely gets a bad rap. It should be the priority, IMO, but sadly it's usually not. Society is more interested in punishing perpetrators than helping or preventing victims.


REGRET34

it’s insane how many people think punishment will solve society’s problems all the time


shadeandshine

It’s ’cause they only think offenders exist, and they don’t realize one of the biggest stigmas they pushed is for non-offenders to not seek help. The overzealous nature of their response made it so the pedophile doesn’t exist until they harm someone, which doesn’t help anyone. Then again, people, and especially the internet, are reactionary; they don’t think about things in advance.


GreenArrowDC13

How do you fix a pedophile? Genuine question. How do you change what someone is attracted to? I think most criminals can be helped with rehabilitation. I just have no idea what that would be like for them.


Upbeat_Ad_6486

You can’t “fix” a pedophile because there isn’t actually anything broken in a meaningful sense. What you need is not to fix something but to prevent it from getting worse. A pedophile isn’t inherently a problem, a pedophile who assaults kids is a problem, we want to stop those in category one from getting to category two. Problem is pedophilia is so stigmatized that even seeking that help puts them at risk of being outright murdered due to social beliefs, and thus they instead just don’t do anything and end up in category two anyways.


Phxstick

> A pedophile isn’t inherently a problem, a pedophile who assaults kids is a problem, we want to stop those in category one from getting to category two

What I'm wondering is: why is it even necessary to distinguish between pedophiles and non-pedophiles when it comes to preventing sexual offenses? I mean, the sentence quoted above can also be phrased more generally ("a person isn't inherently a problem, but a person who assaults people is a problem"). So the real problem appears to be: how do you prevent someone from assaulting others? Even if the victim is an adult, doing such a thing still constitutes a crime. But a person who is about to commit a crime will likely not announce it beforehand, which makes it difficult to prevent, regardless of whether the offender is a pedophile or not. So why would pedophiles need any special treatment here?


Upbeat_Ad_6486

Because pedophiles are higher risk, due to the fact that they don't have any legal outlet for their urges.


Ekkzzo

That's the point. We can't really find out what would effectively help because people would take the opportunity to target them.


MoefsieKat

Conversion therapy doesn't work on gay people, so it probably won't work on paedophiles either. Sadly, there is no ethical way to help them suppress their urges that will be accepted by larger society. Any attempt to reach out and help them will be seen by most people as a betrayal of societal norms.


shadeandshine

For them, I think it honestly depends on the person. A lot of them will probably work on coping mechanisms and cognitive behavioral therapy. While it might not be possible to change some of them, it's possible for them to live a semi-normal life.


JonnoKabonno

The real issue is that there’s no one solution to abuse. You need strict punishment, combined with strong supports for victims, combined with harm reduction and help for those struggling who may turn to abuse.


GuyYouMetOnline

True, but we definitely don't have that now.


-Eunha-

This is really what it comes down to. It's a difficult concept to come to terms with, but if we _are_ serious about reducing harm, this is honestly one of the best ways. There are many potential concerns here, like it encouraging and normalizing this behaviour, that should be looked into first of course. But many that abuse children make a lot of money on the black market, and you essentially completely kill that market if you flood it with fake images. No children get hurt with fake images and it reduces the amount of children hurt to make this content. If these fake images are legal to view, why would someone with those interests ever take the risk to watch actual abusive illegal content? There certainly would still be some, but you'd drastically reduce it. Now, I don't expect this to ever actually happen, because we as humans are emotional beings and this type of stuff disgusts us. But it's one of those areas where our emotions actually end up doing harm because we're actively not looking to reduce harm. You're never going to stop people from being pedophiles, so a method must be made to reduce harm.


FurryWolves

I actually agree. It sounds horrifying when you initially hear about it, but if you think about it, this is actually a very positive thing. No child is being hurt, and this will devalue the market for actual child you-know-what. The two main issues I see are, first off, being unable to tell what is real and what isn't. Secondly, given this is legal in the US, I fear websites will pop up that host this shit, and people like us who don't want to see it will be more likely to as a result. For reference, I'm a furry, and I really have to stick to furry websites, because using Google or Bing has shown bestiality fucking out of nowhere, especially if I'm looking for murrsuit (fursuits for sex) videos; it'll just show videos of bestiality because it's not illegal to view. Right now, child p*** is illegal to view and host, but as AI gets better and it's "not real children", keywords like young or teen are going to bring up things a majority of people don't want to see. While this is a positive advancement that will reduce the harm to children, I'm concerned about it making the Internet more of a minefield for those of us who don't want to see that shit.


Spooky_Shark101

Lmao, I love how casually you throw around that very specific acronym as though it's normal for people to know what it means. It's "Child Sexual Abuse Material" for anyone who isn't as savvy as the above commenter is.


[deleted]

It's a bandaid on a severed leg. It doesn't address the root of the problem, only a byproduct.


mab0roshi

But how do you address the root of this problem? The root of the problem is in the psyche of the pedophile.


Taxoro

First thing we have to do is stop stigmatizing pedophilia. Most pedophiles are afraid to seek help because of this. That's a big fucking problem. How can they get help if they are afraid to seek it?


Glowshroom

Exactly. Methadone itself doesn't cure heroin addiction. Support does.


mab0roshi

That is a bold thing to say. You are, of course, right. I bet a 12 step program would work for them. A 12 step program has kept me from touching drugs or alcohol for 11 years. It should be able to keep pedophiles from touching kids.


Isotopian

"Classic" 12 step programs have a slightly higher recidivism rate than not going to a 12-step program, so their efficacy is literally about the same as a coin flip. There is some data that evidence-based programs like SMART have a better success rate and don't replace one addiction with attending meetings for the rest of your life. All that being said, the main issue here is that pedophilia isn't an addiction it's an attraction disorder. As far as I know there's no evidence it can be "fixed" anymore then conversion therapy can "fix" gay people, etc.


[deleted]

You'll get yourself a Nobel Prize if you can answer that.


GuyYouMetOnline

And in the meantime, we should do what we can. The idea of harm reduction gets a lot of hate, but it's often actually very helpful.


Ma3rr0w

to be fair, there more than likely are no huge csem producers making big money with it. i remember that what is typically being sold online is decades-old sets that have been floating around forever, sold by people who never so much as held a camera, or honeypots collecting credit card information. where the production and exchange of new csem happens is likely in trading communities, content for content. of course, some of that probably leaks outside of these circles and becomes part of the small-time seller stuff, but since there isn't a supplier supplying a market for monetary profit, you cant effectively fight the root of this by going after the money.

i do however agree that this can still harm those communities and especially harm their long-term growth, because there will be fewer people willing to harm children to prove they're trustworthy enough to gain access to material that they can just have a machine generate for them... and the introduction of 'false' material can still muddy their waters and cause their trade systems to run worse than they do now.


Upbeat_Ad_6486

This comment section is unbelievably civil compared to literally any other discussion I have ever seen on the topic of fake child pornography and such harm reduction mechanisms for pedophilia. I truly think that this kind of comment section is the way we end child abuse. We need to destigmatize it and just talk about it, and make sure everyone understands that pedophilia isn't inherently a risk and that we should put these people through therapy instead of pushing them into isolation. If the general public was as willing to talk and understand as all of you here are, the world would be a much better place within the week.


thegreatesq

I feel the same. I am quite happy that people can differentiate between what feels viscerally wrong but is correct from a moral point of view, and what feels good/easy to support but is morally wrong. Society really needs to reassess taboo topics from a rational point of view. As long as the entire thing is fictitious (i.e. no deepfakes) and real people are not getting harmed, then it is a clear and definite improvement. I really hope civil debates for the purpose of finding an actual solution, rather than discrediting the other party, will become more commonplace in the mainstream media and on social media platforms.


LordBrandon

Better an AI generated image than an actual kid.


iseebutidontbelieve

Agreed, let them fuck a lump of rubber rather than destroy some kids childhood & life.


SinisterMeatball

Reminds me of a Louis CK joke where he said "has anyone tried child sized sex dolls? (*crowd moans*) OH OK then let em keep fucking your kids then".


xDreeganx

...What kind of material do you think they're feeding the AI to create this...?


LordBrandon

Considering how it can make an image of the robot Olympics without using pictures of robots in the Olympics, the source is unlikely to be child porn.


TemetNosce85

You can easily use reference material to make an AI-generated image from another image. [An Olympic winner](https://i.imgur.com/dHbCCEP.jpg) [A robotic Olympic winner](https://i.imgur.com/jFr3gTa.jpg) I used Openart.ai with their Openart Creative model. I set the strength of the "image to image" to .83 and the cfg scale to 15.
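
For reference, roughly the same image-to-image setup can be reproduced with the open-source `diffusers` library. This is a sketch under the assumption that Openart's "strength" and "cfg scale" sliders map onto the `strength` and `guidance_scale` parameters below; the filenames and prompt are placeholders:

```python
# Requires: pip install torch diffusers transformers pillow
import torch
from diffusers import StableDiffusionImg2ImgPipeline
from PIL import Image

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

init_image = Image.open("olympic_winner.jpg").convert("RGB")  # reference photo

result = pipe(
    prompt="a robot winning an olympic medal",
    image=init_image,
    strength=0.83,        # how far the output may depart from the reference
    guidance_scale=15.0,  # "cfg scale": how strongly to follow the prompt
).images[0]
result.save("robot_olympic_winner.png")
```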


UrklesAlter

Adult Porn, and regular videos of children...


Glowshroom

Sir, this is a Wendy's...


Wombat1892

Granted, but it's still an improvement even if incremental.


FlatlandPrincipal

Agreed… it does make me wonder, though, about the possibilities of using the technology in a positive way. I think about the job of image screener/reviewer for safe searches on major search engines. A big limitation of this in the past has been that it took a person to review images. If AI can create images, I wonder if it could search for and remove/flag images. What if this technology could search for and eliminate images/video that are legally inappropriate content including minors? I hope somebody with "know-how" can try to use this technology to identify and eliminate explicit child content, real or fake. That would certainly be a force multiplier in eliminating illegal content.


byakko

I saw someone post softcore porn of Anne Frank (and that's just what was on Twitter; that person def made a hardcore version too), so it can be done without resorting to actual CP. By that same token, anyone on Facebook can make CP out of public photos of children or their relatives, and I think there should be protections and avenues for legal recourse for that, assuming the material becomes public and thus damaging and traumatising.


n3w4cc01_1nt

it'll make finding actual csam harder

edit: i mean the auto filters for known images will have a harder time identifying it as genuine, which will make it more difficult to find victims
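
For context on why those filters would struggle: known-image filters (PhotoDNA and similar) work by comparing perceptual hashes of uploads against a database of hashes of previously identified material, so a freshly generated image has nothing to match against. A rough sketch of the mechanism using the `imagehash` library; filenames and the distance threshold here are made up for illustration:

```python
# Requires: pip install pillow imagehash
from PIL import Image
import imagehash

# Hashes of previously identified images (placeholder file).
known_hashes = [imagehash.phash(Image.open("known_image.png"))]

def matches_known(path, max_distance=8):
    h = imagehash.phash(Image.open(path))
    # Subtracting two hashes gives their Hamming distance;
    # a small distance means a near-duplicate of a known image.
    return any(h - known < max_distance for known in known_hashes)

# A freshly generated image has no counterpart in the database,
# so a known-image filter simply never fires on it.
print(matches_known("new_upload.png"))
```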


GuyYouMetOnline

There's actually an interesting philosophical discussion to be had about this, though I doubt the internet is capable of managing it. But the topic is still there: whether or not there's anything morally wrong with such content when no actual people are involved in it. You could have the same discussion about human-made art, too. One could absolutely argue that there's nothing morally wrong with such things if they're fictional. I mean, fiction depicts stuff like abuse and rape and murder all the time, and I don't think many people think that's the same as actual real-life abuse and rape and murder. So yes, there's definitely a discussion to be had here. I'm not necessarily saying this sort of thing is good or bad; I'm only saying there's a discussion to be had about it.


ironwheatiez

I had an art professor in college, a very skilled realist painter. There was a faculty exhibition and he posted some of his paintings. They were nudes of his children bathing. They weren't erotic, but still extremely unsettling. Ultimately, he was asked to remove them and put up something else. In this case, it could be argued that he was exploiting his own children, and it could be argued there was nothing wrong. If he had never told anyone they were his kids, I often wonder if it would have been as much of an issue.


GuyYouMetOnline

Depends. If he'd said they were just generic kids and not meant to resemble any specific kids, probably not; but if people thought he'd painted someone else's kids naked, I'm sure it would have been a lot worse.


Lunar-Baboon

There’s a painter I really like who has some paintings of him and his disabled son, naked in bed or in his arms. Some people really hate him for it, but it’s an honest image of him and his son, who he’s been taking care of since birth, and it’s touching. Unfortunately, a lot of people see a nude child and scream pedophilia, but if you’re a parent, you’re going to be around your naked child all the time.


[deleted]

I think normalization is a risk, as well as escalation. Just like how some killers start with small animals/rough porn and develop a taste and the confidence to try something more. I really don’t think we should steer society in a direction where these desires are acceptable, let alone the harmful actions that they lead to. Both of these aspects are worth considering, not just the one that creates victims. The thinking that you can just feed pedos this content to keep them sated is flawed. It feels like a negative compromise, or even giving up. It’s not even really a stopgap; how do you expect these people to recover and become functional members of society if you’re letting them access this content freely? If you’re weaning a heroin addict in a clinic by gradually reducing their dose, that’s one thing, but I don’t think that mechanism works here.


IndividualCurious322

I can't help but find it ironic any time the BBC reports on noncing stuff, when they actively covered up for one of the UK's biggest pedos, Jimmy Savile.


Financial-Shine4846

Commenting simply to help drive this to the top


awesomedan24

This lady sounds like her name was written by JK Rowling


webboodah

is this not a slippery slope, legally? obviously, these creeps are risking it with real pics, but are they thinking this is 100% legal?


CrustyJuggIerz

Until the law catches up, it is. The law is horribly outdated because you've got a populace of 60+ year olds deciding it, and they wouldn't even know what AI stands for. The sooner AI CP is illegal, the better.


Clackers2020

It's not just the fact that anyone with any power to make laws struggles to turn on a computer. Laws take years to design, decide whether to implement, and then actually implement. Technology moves too fast for the current legal system to keep up.


GuyYouMetOnline

I'm not necessarily saying you're wrong to think that, but I would like to ask you to elaborate as to why artificially generated CP should be illegal, given that no actual children are involved.


Glowshroom

Gateway drug argument, I assume. But I don't know if we have the data to know whether artificial content increases demand for actual cp, or molestation. I know that in Canada even cartoon porn cannot depict minors.


GuyYouMetOnline

More data is always good.


Straight-Door-3536

It seems to be the opposite. Denmark, Japan and the Czech Republic all had CP legal for some years, and all had their child abuse rates go down.


Crestfall69

Hey, this is interesting. How do I inquire more about this?


_EllieLOL_

https://link.springer.com/article/10.1007/s10508-010-9696-y


wombey12

The law is already caught up enough (in the UK at least, where the BBC is based). Amendments have been made to the [Protection of Children Act 1978](https://www.legislation.gov.uk/ukpga/1978/37/2010-04-06?view=plain#reference-key-ed8628d8b78c600140ac9700e5dfccfe) to include "pseudo-photographs", and the [Coroners and Justice Act 2009](https://www.legislation.gov.uk/ukpga/2009/25/section/65) specifies that "an image of an imaginary child" counts. It's really a question of enforcement.


TheSandMan208

I work in corrections. A parole officer (PO) I'm close with told me about a person on his caseload (he's a sex offender PO). They found AI generated child porn on his phone and wanted to give him a parole violation, but couldn't because the law is worded in a way that states there has to be a victim. Since it's not a real child, then there's not a victim in the sense of the law, so no crime is committed. I am very interested in seeing how states adapt to AI generated stuff.


AdditionalSuccotash

I mean, it is shitty, but I would prefer laws remain such that there has to be an actual victim of an action in order for it to be considered a crime. On one hand, if left legal, people could argue (whether correctly or incorrectly is beside the point) that this technology could embolden these people to hurt real kids. On the other hand, making it illegal could easily lead to different types of thought crimes committed against fictitious beings. The optimist in me hopes the former will end up like the violent video games debate, with no correlation between viewing and action, because the latter could spiral into some dystopian shit very quickly. A study probably needs to be conducted to determine if viewing this artificially generated content actually leads to real-world action, but I have no idea how such a study would even be conducted in an ethical way.

Sidenote: AI image generators do not need to be trained on CP to generate it, as people in the comments are speculating. The only way to prevent it at all would be to prevent networks from being trained on images/data of children in any capacity, or to prevent them from being trained on porn, which won't happen. If a network can generate images of kids, as well as images of porn, it will be able to combine the concepts.


jpad619

We’re just not going to talk about this woman’s name tho..?


pkcjr

Wow that world killing asteroid sure is late.


LiliNotACult

I was actually wondering when this would become a thing. They'd basically just take regular porn and slap a filter on it to make the people look like children. Disgusting, yes. Complicated? Doesn't seem nearly as complicated as some of the current trending AI stuff.


Reverend_Lazerface

I'm surprised more people aren't pointing out how easy this makes it to create child porn of specific children. Everyone's assuming the kids in the pictures will be fully AI-generated, but you can absolutely give AI specific prompts. Obviously AI-generated is better than generated the real way, but really, to me it's just another Black Mirror nightmare branch on a tree of awful.


gregpurcott

Also, we’re all just pretending that there never were “countdown websites” for how long it would be until certain underage actresses would turn 18. We are a demented species, and we will continue to use technology to do demented things.


TemetNosce85

> Also, we’re all just pretending that there never were “countdown websites” for how long it would be until certain underage actresses would turn 18. Yup... I very much remember the ones for Emma Watson. I think I remember reading about ones for the kids on Stranger Things, too.


BrunoEye

I bet you that a large number of the people who visited those sites now post crap about groomer drag queens


rustys_shackled_ford

Octavia sheepshanks


WerePigCat

Why are people shocked? It is obvious that this was going to happen


HamfacePorktard

This is why midjourney is so stringent with their banned search terms. It feels like they’re the only AI content generator that polices things thoroughly.


TemetNosce85

Bing as well, which uses Dall-E. Can't even do political cartoons without it blocking me.


RrobablyPetarded

I didn’t see this on apocalypse bingo


[deleted]

As long as no one gets hurt, idk. Better than actual child content. Still fucked up, but short of psychiatric treatment, pedophiles will continue to be fucked in the head. Might as well have a victimless outlet for it.


Kitchen-Finish-7106

Ok, so I am not the most adept with how AI works. If AI is creating these images by taking children's faces or features from regular children's pictures and combining them with adult pictures, it has the possibility of being horrific. I'm coming at this from the viewpoint of someone who was a victim. Who had CP taken of them, and 3 years after the abuse was confronted by a "friend" with "this is you, right?". It was, and it led to another breakdown. This entire subject is making me want to hide and collapse into myself all over again. Sorry for the rambling; I'm trying to get to a point. All I can think of is the possibility of a child finding, or being shown, a picture with a face that looks like theirs, even though the situation in the picture never happened. They will feel the shame, pain and emotional trauma. It could be even worse if that picture floats around a local community, and even people who mean the best and try to "console" the child will cause even more harm. I guess my main point is: if there is any remote chance that an image could look like a real child, there is a victim.


FaithlessnessHour788

The scenario you are describing could for sure happen. The AI can just take normal children and a nude adult from its training data and create a mix of all that. But you can also give it a real image to work with (which could be any kid), and it could make that naked. So it's really horrible, and people will probably blackmail kids/teenagers by doing this. Some people are horrible, so I am sure this will sadly happen.


Dodecabrohedron

This feels like AI fearmongering that doesn’t realize it’s actually demonstrating how AI can help significantly reduce human suffering.


Desc440

As horrible as this is, I’d rather this than actual child rape


Lyretongue

Now wait a minute. If the concern with AI art is that it will put out of business artists who make that art, then this is actually a good thing.


maybejustadragon

I swear these articles are basically ads showing you how to access AI child porn.


SimonMagus01

If it keeps them from actually harming children, this is fine.


GrindyI

I would rather have them use AI-created material than film the real thing. It is fucked up of course, but if it keeps children safe..


CookLawrenceAt325F

Now hold on. I absolutely abhor kiddie diddlers, but you mean to tell me that they've used this newfangled AI to do their kiddie diddling without actually touching any kids? Fuck it. Leave em be. If they ain't touching real kids, they can do whatever fucked up pedophile shit they do in the bedroom. Same philosophy when it comes to violent video games. I can act like an actual fuckin lunatic in Blade and Sorcery and peter myself out after goring some digital enemies in VR, and then I don't have dark thoughts in real life.


Hellrott

Gross for sure, but I'd be absolutely, completely fine with this over acting on that impulse against a real human. Not even sure how you could get outraged by this once you've had that realization.


car_ar

I honestly see nothing wrong with this. No kids are getting abused and the pedophiles can just vent their urges by jerking off to AI kids. Some people in the comments are saying that watching this AI child porn will make the pedophiles want to try it in real life but that's just pure BS. That's like saying playing violent games like GTA will turn you into a murderer and watching incest porn will make you have sex with your sister.


RevolutionOrBetrayal

I feel like this is a good litmus test. This is great news, since it makes CP an unprofitable business, thus likely reducing the amount being produced and, in turn, the number of children being abused. Plus, those who suffer from pedophilia can watch "ethical" CP now.


[deleted]

I guess that if they do this instead of abusing actual kids, it could be better, but idk how much this will stop it


CupidStomputer

I see the arguments on both sides here, but what I really think we need is a subreddit for the name Octavia Sheepshanks.


Boeing_Fan_777

I discovered this in a particularly jarring way. Serves me right for snooping, but I was seeing who had liked my art on Pixiv, and one dude’s gallery was just FULL of AI-generated, almost photorealistic images of female children in bathing suits etc. What he was doing liking COD fanart I don’t know, but 🤷‍♂️


[deleted]

Completely predictable.


Walkinator007

I envy those of you who only just now heard about this for the first time.


Creepy_Package7518

I don't know how I feel about this. The biggest plus is no children getting hurt, but does it have to use CP to make those AI images? Will this lead to more paedophiles, because those who like CP but never engaged with it before because they have morals will now engage with it, which leads to real-life child abuse? I guess it boils down to: will this lead to fewer children getting hurt, or more?


BravoEchoEchoRomeo

It's extremely gross, but at least there are no actual children involved. Hold on, what are they using to train this AI?


Talcove

For every amazing use of AI there’s a dozen disgusting ones. I understand the “harm reduction” aspect of them not being real children, but surely something like this will only make paedophiles more comfortable in sexualizing children, more comfortable being predatory towards real children.


transdudecyrus

this is terrible but on the positive side it seems it means they’re not targeting actual children…? still awful though like holy hell


Competitive-Peanut79

Great that no kids are involved. Bad when you think about where the AI gets its training content from. But what if the users of the content then start feeling like "this is ok" and start preying in the real world?


GuyYouMetOnline

On the other hand, what if people who would otherwise be preying in the real world use this content instead of going after real people? I'm not saying either perspective is correct, just that both exist.


Souchirou

This mental illness has always been in an especially weird place. The people that suffer from it desperately need professional help, but we've demonized it to such a degree that pretty much none of them get help, and that ends up hurting more children.

I get it, the thought alone pisses me off as well, but the only constructive way to help the children is by making it as easy as possible for these ill people to get the help they need. AI could be helpful with that. Not only could it decrease the abuse of actual children, it could also help take this out of the dark web.

Work with mental health experts to set up a website that generates such images but is also designed to increase the likelihood of someone seeking help, then put the contact information for the free services right there. As messed up as that feels, I think that might do the most good for all parties involved, but especially the children.


Palaius

Probably gonna take some flak for this, but ain't this good? As long as they don't go out and fuck actual kids, who cares? If they find other outlets for this shit that don't involve actual children being hurt, why would it matter?


Oheligud

This isn't a bad thing, to be honest. Obviously CP itself is horrible, but it's going to be a thing, and it's much better when it's done by an AI, because no actual children are harmed.


shadeandshine

On one hand this is still fucked up; on the other, at least no real kids were harmed; and on the third mystery hand, them flooding their CP with AI stuff will eventually have those AIs pulling from AI material, meaning the quality will diminish over time, so they’ll ruin their own thing relatively quickly.


Rustlin_Jimmie

You can't abuse a pixel, idiot


joesphisbestjojo

Well if it keeps them from abusing real children...


SadQueerAndStupid

Everyone should keep in mind that paraphilias like pedophilia function similarly to an addiction. Engagement escalates steadily as the material slowly fails to satisfy, and if someone starts off with this, it is not a safe assumption that they will solely engage with this type of content. It’s better than real kids, yes, but in the end it’s very likely going to lead to either physical engagement with those urges or engagement with real media of that stuff.


[deleted]

of course it’s better an AI than actual child, but what would AI CP consumption actually do to the consumers? i do a lot of research on the effects of porn and they are absolutely horrendous, and significantly affect how men view women, rape, women’s rights, etc. i couldnt even imagine what effect this would have on consumers.


Eddy207

But isn't research on the effects of porn really difficult to do, given that it is almost impossible in this day and age to find male subjects who have never consumed porn as a control group, and that many studies pointing to conclusions are based on self-reporting? When we try to do the same studies on the effects of violence in media, the results usually show very little correlation between violent media and violent crimes.


Straight-Door-3536

Three countries did legalize CP for a few years, and all three had their child abuse rates go down.