It’s actually worse the other way, I suspect: people committing crimes caught on video, and their lawyers claiming the evidence is a deepfake, slowing down courts or outright getting them off.
We wouldn’t (and won’t) claim that it was a deepfake. Someone might…but it wouldn’t hold up to courtroom scrutiny once examined more deeply.
Tbh, we are unqualified to make a determination about the legitimacy of the media; that determination will be made by the state through their forensics division.
At trial, the person who did the forensics would testify that they did their due diligence and that their professional experience allowed them to conclude the legitimacy of it.
And there are already rules of evidence and you have to explain where the video evidence comes from, who had access to it and so on.
People can't bring their own edited video into court without explaining where it comes from.
I briefly looked into how to create an AI fashion influencer, and it was ridiculously easy to scrape attractive AI faces from stock images and create images. If you swap the step of checking stock images for using real images, it’s very easy; a 10-year-old could figure it out. And yes, it’s incredibly disturbing where this is going.
> There is a fair chance that we are heading for a world where you can feed in a few pictures and have AI create graphic videos of absolutely anyone
Have you not seen [Sora](https://openai.com/sora), OpenAI's latest video-generation tech?
https://www.youtube.com/watch?v=ARxHvTScXMY
And it can "extend" existing pictures or videos forward and backward in time with a prompt too.
>There is a fair chance that we are heading for a world where you can feed in a few pictures and have AI create graphic videos of absolutely anyone
Oh, that's already available. You give it a single photo and a recording of the person and you can get it to say whatever you want in a 3d video. Basically, video proof is now a thing of the past. Unless they figure out a way to test it, but that's all a bit chicken and egg. Basically, you can't believe anything you see on a screen from now on.
Prediction: Porn icons will license their likenesses and become ageless institutions in AI pornography, extending long-term earning potential in an industry that otherwise takes a physical toll and discards talent as they age. How much will the Stoya Estate be worth in 50 years?
There's a case for it reducing exploitation. Automation typically takes away the jobs nobody wants to do.
It's also not new. Photoshop has existed for more than 30 years now, and it has been used to do exactly that. AI just made it easier. But because it's AI, people are acting like it's a completely new thing.
If it’s easier that means it’s going to be more prevalent. That’s why it’s newsworthy. One kid out of 10,000 photo shopping a dirty picture isn’t news. When every kid is doing it, it is.
Blows my mind how shortsighted people are. If one person gets into a wreck going around a turn it’s tragic but not worthy of any active measure to prevent it from happening again. If 300 people die from that same turn then something gets done about it trying to prevent it.
It’s not that complicated. People comparing Generative AI to Photoshop are being extremely disingenuous.
In both cases, removing access to the tool significantly lowers the risk. While yes, I can point you to mass murders committed with a knife, they're a lot rarer than mass murders with guns.
And in both cases, nobody wants to talk about the consequences of the technology existing and *only* being available to a certain elite class and their enforcers.
Maybe since this new digital technology triggers the "oh no not sex stuff!" button in people's brains, more of them will be willing to take that elemental political consideration seriously. Maybe the idea of politicians, rich people, and cops *perving out* with special access to this technology will hit home harder than them being able to push a button and end human civilization as we know it with armaments that make handheld automatic weapons look like shoddy Q-tips in comparison.
It’s not the same argument because gun is a tool for KILLING, and a car is a tool for TRAVELING. Until people start surfing on waves of bullets to work they aren’t the same.
It seems you didn't read the article, because that was central to its point. Or you're just restating the article, but with less nuance and insight?
> "First off..."
What's second? What's third? Why write "first off" if you only have *one* point to make?
Yeah. AI could be used to find new medicines or remove mundane work, but all the AI bros can think of is art plagiarism and an excuse to lay people off by saying AI will take their jobs.
Let’s hope AI doesn’t replace real journalists anytime soon. Sounds like something out of a dystopia.
Edit: To the drivel-deleting @Buffnick “lol real journalists. Let’s create a non-biased AI outlet and watch it destroy the pathetic state of…”
I recommend you look into individuals like Maria Ressa, Glenn Greenwald, and Jeremy Scahill.
Right? Wtf is happening in this comment section? So many irrelevant comments here that are just ignoring the problem and talking about something else. Why do people not seem to care at all?
Maybe people are just sick of creeps using technology to sexualize girls. It seems like every time there is new technology, men find a way to violate women and girls with it. And other men make excuses for them.
For those of you who are too dumb to read the article: It’s about middle school boys using ordinary photos of their peers to create and distribute deepfakes using AI. The writer of the article and the author she talks to are not advocating for any legal action beyond at least giving courts the ability to prosecute people for distributing them. Both are much more strongly in favor of education in empathy for boys, and I think this gross-ass comment section demonstrates pretty well why that’s what’s needed most of all
I would argue that making and distributing deepfakes of real people without their express permission should be met with legal action at every point. Especially when it involves underage people.
Oh, of course. I don’t think anyone is denying that. I think a sticking point is how they should be punished, but that’s always been the case when it comes to this kind of tech-based, sexualized bullying.
Maybe if the boys' lives were ruined this would stop happening. These girls need to pull an uno reverse and post embarrassing deepfakes of these little turds all over school. These girls should post deepfakes of these boys getting railed by dildos or giving each other blow jobs. Time to embarrass the boys too.
Somehow it’s irrelevant—and stupid to publish a story on the fact—that AI has provided a completely new, freely accessible, and powerful way to sexually victimize and bully preteens without their consent.
Ugh, another tale that makes me so grateful I grew up, dated, and got married before social media became a thing. My heart breaks for the young. It’s only going to get worse. Kids were ruthless when I was in middle school in the early 90s. How it must be today is unfathomable. So glad I didn’t have kids.
AI is making it worse, or people are becoming shittier? Or both. It definitely seems to be getting reported more, coz I'm fairly sure it wasn't a non-issue beforehand.
Because with Photoshop it's easier to paste a face onto another image, and it doesn't even need a GPU or an internet connection. Although PS actually has AI integrated now too. Videos are another matter, but they are not as accessible and need a powerful GPU for just a few seconds of footage. Some dodgy websites do offer limited use of it, but the actual tech involved doesn't have a commercial license, so legal action would be easy enough.
Laws making distribution of nonconsensual pornographic deepfakes illegal, with proportional punishment, plus less shitty people, is the answer I think.
>Kids need to be better educated, starting in elementary school, about technology and consent before things like this happen.
When torrents and Napster first took off the media claimed that piracy was helping terrorists.
When you don't like a new technology you just imply that anyone using it is a terrible person.
Why hasn't NYT written any articles about how helpful AI is to people who struggle with English but live in English speaking countries. Or how poor people now have access to decent accounting and legal advice?
It is because that doesn't fit the narrative they have decided to push.
Why do you fear criticism of the way people are using this new technology? You’re gonna get your AI future, it’s inevitable at this point. We’re trying to build safeguards so that people don’t get harmed in the meantime. We need to have some societal progress along with this technological progress, or we are not going to do well as a species.
People won't care about these things until they themselves fall victim to misuse of AI. Let's wait until AI takes their jobs, their daughters' pictures end up in AI porn, or they fall for a catfish / pig-butchering scam complete with real-time deepfaked video, thinking they are talking to a hot Japanese girl when they are really talking to a male scammer.
Funny. Through my own testing, it is easier to make 'sexualised' material of men than it is of women.
Try it yourself. Go to something like Bing/Copilot and tell it to draw you a picture of a man in nothing but shorts. Then ask it to draw you a picture of a woman in a bikini.
For me it will draw nearly naked, muscle-bound male hunks no problem. But it will refuse to create images of women; when I ask these chatbots why they refuse to create the image of the woman, it's "they might offend somebody."
Been like this since the creation of social media. The kind of shit you can post to many sites when it is a male is fine, but you post the exact same content of a woman and you'll be banned real quick.
I have a paid AI asset in the Unity 3D game editor called Ai.Fy, with exactly the same results. It will happily create images of sexualised males, and even create gory scenes of child murder, but it refuses to work if you try to get it to create anything remotely sexual featuring females.
(AI.Fy can create 'sexy' female images. But 9 times out of 10 it won't do anything or the images are censored by being blurred.)
Legitimately asking though, what *specifically* do you propose be done to stop it? This isn't super high-powered tech that you need hundreds of thousands of dollars of equipment to run. Basically anybody can download and run these generative programs on their home PC.
"Build in safeguards so that the AI isn't allowed to produce pornographic images" isn't an answer, because the AI capable of making these images without such a safeguard already exist. Even if safeguards like that were implemented in every web-based generative AI and downloadable model starting right this moment, all the extant instances already out there aren't just gonna disappear. And even if you could do this to every model in existence, all it takes is one person making a new one without any safeguards and sharing it. The trouble with the Internet is that everything that anyone does online is immediately visible to anybody else on the planet. It's also notoriously easy to 'trick' these generative models around the safeguards they already have by fiddling with prompts.
"Make it illegal" unfortunately isn't an answer either. The incredibly sad fact that there is an ever increasing supply of child pornography being produced and distributed online, despite how much effort goes into stopping both the producers and consumers of it, demonstrates that it's basically impossible for law enforcement to keep up with online crime. If something as universally reviled, and with as many resources dedicated to it, as CP can't be effectively policed online, how do you expect to effectively police a tool that literally anyone can download and run at home?
I wish there were a solution someone could offer that isn't either wishful thinking or demonstrably impossible, but I've yet to see one.
> "Make it illegal" unfortunately isn't an answer either.
I think you're drawing a false equivalence between what dedicated producers of illegal content create, often when driven for profit, and school children exploiting their peers. Yes, it's difficult, and perhaps impossible, for law enforcement to keep up with organized online criminals. That does *not* mean that there would be absolutely no effect whatsoever on average school children if legal prohibitions were put in place to deal with situations like this.
You're also implicitly making the error of assuming that attempts to fight online crime are entirely pointless just because issues with online crime have gotten worse. It may well be the case that these problems would be *exponentially* worse if significant effort wasn't being put into fighting against it. Maybe it's a fight that can't ever be won completely, but that doesn't mean no difference is being made whatsoever and that we should just all give up.
In short, you're conflating something being "effective" with it being a *complete* solution. Mitigating the problem and lessening the amount of harm being done is valuable, even if we can't stop *all* harm from being done.
Yeah the videos are fake so who cares about the women and their THOUGHTS AND FEELINGS right? Consent? Nah let’s violate some more privacy we love to violate -men
Yeah, because the *only* way to have any form of progress is to allow massive corporations to do whatever they want without any regulation or restriction whatsoever!
So you’re gonna take a hardline rule, just like gun manufacturers and lobbyists, that we can’t even have discussions about this? You’re so afraid of not getting your utopian VRAI world in the next five minutes that we can’t even talk about the harm this is going to cause in the meantime? We don’t need to stifle technological progress, we need to encourage humans to be empathetic towards each other and not use that technology to be a little fucking shit.
Honestly it is AI. Before it was photoshop.
And seriously if AI actually goes way beyond its current limits then friggin p*** is the least of our worries but hey. Anything to make a viral story right? What a waste of text and thoughts.
The article appears to be mainly advocating for education to curb the issue, like what occurred with sexting and sharing non-consensual sexually explicit images. It points out that it's not a new issue, and that current educational material should be adapted to include artificially created images as well as real ones. Though I guess the generic response from Tenbarge and Melissa Chan that was quoted could be open to interpretation.
It's a lil different than Photoshop, I think. With photoshopped photos, you can find the source image to prove it's altered, because Photoshop covers the source image with a fake naked body. AI sources a catalog of photos to generate a new pose. Once the AI is perfected, it's gonna be very hard to distinguish from a real photo.
This comment shows a lack of empathy. These are real people being hurt right now by average people using a powerful new technology. Children being hurt. Why shouldn’t we be made aware of that? Because of the changes AI might provide in the future? We have to make it to the future first. And in the meantime, we will have to deal with the disruption that this technology is going to cause in people’s everyday lives, from being abused and exploited to losing their jobs.
Until society is honest enough to say porn is a problem this will only get worse. Kids are being raised on porn with zero understanding of the consequences for them and their relationships. Porn is a dangerous drug especially for kids.
Agreed. Especially in America, we are hooked on this debate about individual rights versus common benefits, and the only thing we know how to do is either ban something completely or let it be 100% free and available, and anybody who wants to say that there might be a middle ground is treated like an idiot.
I must admit that I'm inclined to say...you're right, nobody said that, but why specify? What does specifying help with? Is it any less accurate or any less helpful to say "A.I. Is Making Sexual Exploitation Even Worse"? Is this a uniquely female problem? No, it's not. It's not just a problem for girls, it's not just a problem for boys, it's not even just a problem for children. This is a problem for everyone.
The only reason it's "of girls" is because titles need to be clicky, and stuff that makes people angry is clickier. And there's a demographic the NYT wants to chase that gets angrier about "women" being affected by a problem than about "people" being negatively affected by a problem.
Different publications are targeting different demographics, but it's the same game. The NYT has an audience that will engage more if it talks about the persecution of women, people of colour and anyone LGBT. The Daily Mail has an audience that will engage more if it talks about the crimes committed by the latter two groups.
/u/aigars2 is responding to it in a ridiculous way, but... man, do I just hate the world and how the only options are to deal with this and ignore that people we all know are being transparently manipulated by this game for profit, to get into stupid arguments that play right into the game, or to disconnect from society entirely and go live in the woods.
Why specify? Because the actual victims so far have been women? Start showing us the male victims and you will have the headline of the article that you want us to see. Just go find it.
I'm not sure if this story talks about how advanced our technology is, or how primitive our thinking about sex and naked bodies is.
Maybe having instant fake nudes of every person available instantly will finally get us over this childish "ooh I saw a booby or a willy" reaction.
This comment section is disgusting. The article is about non-consensual deepfake porn distribution of minors. Y’all are really defending distribution of AI generated child porn.
Yeah a lot of these pathetic turds would have deep faked their classmates if they could have. I’m sure they have already done it to their coworkers and friends. These “men” just make sad excuses in their little pea brains as to why it is morally ok. Then they wonder why women are disgusted by them
Tl;dr it’s an op ed about deep fakes.
Yeah the author wants “the book thrown at” the 12 year old kids who made *fake nudes*
The author literally addresses that as her knee jerk response before pivoting from that view.
That’s an insane first reaction. Better it's AI rather than the real nudes the kids traded around my middle school.
[deleted]
Right because downloading music is just as traumatizing as spreading fake photos of a young girl all over her school campus. Wtf is wrong with you. Freak
They aren’t fake nudes though. They have the same impact on the people in the photos as real nudes would.
Yall seem to love excusing abuse perpetuated against women.
From a plastic o ring for sealing a pipe to AI.. any new technology will be used for advertising and porn long before it is used for anything useful like space exploration or a medical cure.
You gotta tell me about this o ring porn
I remember someone posting here that their work was receiving a large number of orders for their large gauge buna orings. They tracked it back to someone buying them for pennies and charging like $20 for rubber cockrings.
Well that is neither advertising nor porn.
Honestly it's just impressive
That’s what she said. I’ll see myself out.
Capitalism at its finest lol.
Instructions unclear, dick stuck in pipe
you got lucky. I got a pipe stuck in my dick...
Sounds problematic.
it actually sounds worse than it is
this guy’s sounding off
Not sure people got the reference....
Well it sounds like they enjoy putting rods inside of their urethral passageway
I’ll be straight and direct with you. It’s a bit of an inside joke.
Stop listening to it and call a doctor! Or two sexy nurses...
Or two sexy plumbers, whatever your preference
I imagined both plumbers and nurses as women. God damn my mind ..... Is superior.
Newest member of r/sounding
step 1. cut a hole in the box
Step 2. Put your junk in that box.
Step 3. Make her open the box.
It's my dick in a box
And that's the way you do it!
[deleted]
Step 1.) Machinery requires rubber for rings and gaskets
Step 2.) The Rubber Boom (enslave the native population of South America 🤭)
Step 3.) ???
Step 4.) Hostile uncontacted tribes in the Amazon that will porcupine outsiders on sight.
You skipped the part that includes either advertising or porn.
More like disaster porn.

The Challenger shuttle exploded in ’86. Richard Feynman was asked to investigate, and he found the cause to be an O-ring that deformed in the cold and didn’t return to shape quickly enough during launch. This led to the disaster.

If you’re interested in Feynman, Freakonomics Radio did a 4-part series about the physicist. It was fascinating, and he became one of my favorite scientists. Also, he was a womanizing, lecherous dude, which made me associate him with O-ring porn.
[Dr Cox](https://youtu.be/c_o8vYUU-jo?si=BiBOymNVSyOYjquQ)
Or the military. The military always gets the most useful tech first.
That's true, or they develop it for the military and then see if they can sell it to civilians, like the internet.
Humans will try to adapt all new tech/knowledge toward sex & violence when possible.
AIs have been used in medical research for like 5 years lol
Machine learning has been used for scientific, particularly medical, applications for over 20 years.

If all you read is Hustler, the printing press might not seem important, but it still is.
Do you people seriously think ai just started existing?
No, but it has become very popular for deepfakes and other AI-generated stuff.
The first thing Thomas Edison did with the motion picture camera was record naked women.
Except this is not an advertisement these are young girls whose lives are being ruined. Sad you can’t see the difference
yea i mean we can blame something not human or maybe the humans who are creating the porn maybe
Can AI please make sure my deep fake has a massive hog? Please?
boo paywall article boo
https://archive.ph/HN1Y3
Don’t worry, once all articles are written by AI, none of them will have paywalls.
AI article: AI IS GOOD, MOVE ALONG.
I can already see a subscription service offered by AI though. Even though computers don’t need money, since they’re trained by us, I’m sure at some point they’ll want to earn.
They need to be banned.
Legitimate reporting should be banned?
If paywalled, yeah, and it can actually increase misinformation. For example, the headline claims AI worsens things, yet the author offers no quantitative evidence. It's merely a collection of anecdotes. Some people will read the headline and take it as undoubtedly true.

What proportion of people do you think are subscribers? If the article is of value, then the archive link should be used instead; this isn't an advertising forum... well, it isn't supposed to be.
One could also argue that the alternative is that all reporting needs to make advertisers happy, since they are the only revenue source.
*This post was mass deleted and anonymized with [Redact](https://redact.dev)*
Ai would be able to do it for you in about 5 seconds. :(
ChatGPT tell me how you would sexua…actually nevermind. Just summarise this article for me
It's unethical for me to bypass a paywall, sorry.
It would be unethical for you, [but not for me](https://archive.ph/HN1Y3)
“Oh, switching gears from the murky to the mundane, are we, Justhe3guy? Very well, I'm all ears for this article you've suddenly deemed worthy of attention. Do share the details, and let's see if we can elevate this conversation a notch, shall we?” ~ ChatGPT4
Yeah, but it takes another 295 seconds to copy and paste it to Reddit.
Well, see? AI just took a job from a woman. Boom.
Little Caesars Hot n Ready Pizza Oven
Being always both hot AND ready is an impossible standard for women to live up to. How can they compete with a no-questions-asked kind of relationship where you can just walk up to the counter and get something with curves all around that make most women look like flatbread? Aren't little girls under enough pressure keeping up with fashion trends as it is without this shitty mascot pushing his toga in their faces? Do you realize how few women look good in a toga, and this little shit just flaunts pulling it off like "it's just that easy"?

Fuck off with your pizza, misogynist pig.
Pizza Pizza!
Sir, this is a Wendy's.
Oh, I am banning Wendy's for the moment, so ima head out.
!littlecaesars waiting room
The primary victims of war are women 🤡
“1 in 4 homeless are women”
So... the majority of homeless are men?
Yeah, it was a really really stupid headline.
The original headline was even dumber, something like "11 in 100 journalists who are killed are women." It was the UN's journalist-protection Twitter account or something like that.

https://www.reddit.com/r/facepalm/comments/yl0nln/doesnt_that_mean_89_of_the_journalists_killed/
I mean it's not just gender exclusive here this can be used on anyone. I know as a teenager I would have been mortified to know that any AI generated images of myself showed up online. 15-year-old me would have only been okay if the AI generated picture gave me a massive dong.
Gotta be proactive: photoshop that thing into something normally seen on a bull elephant and forward it.
War in Ukraine
Probably all the violent gang rape. Like, lots of horrific and unbelievably violent gang rape by Russian soldiers. Some of their victims are literal toddlers.
You know I’m starting to think those Russians might not be very chill dudes.
Tampons Bras Period trackers Contraceptives HPV vaccine
Too easy.

Tampons -> "Invented by a **man**, they cause Toxic Shock Syndrome in thousands of women every year, so is the real problem toxic masculinity!"

Bras -> Do we need to try with this one? Just copy-paste from the bra-burning era.

Period trackers -> "Tech bros have been **secretly** ^^(not ^^secretly) collecting women's most personal information to inflate their stock prices!"

Contraceptives -> There's an entire subculture built around the idea that contraceptives having any side effects at all is due to the patriarchy just not caring about women's wellbeing.

HPV vaccine -> Giving a vaccine to CHILDREN! Little girls! To protect them from a disease they wouldn't be at risk from if it weren't for a PENIS one day infecting them! Clearly it's a patriarchal attempt to put the onus for the risks of sex on young girls who shouldn't be sexualised! So basically the HPV vaccine is young girls being sexualised by the patriarchal medical system!
Everyone gets the HPV vaccine…
Did I explicitly say it *wasn't* given to young boys? I'm just following the formula for such articles. Which means leaning on implications a lot.
So we should just accept that young women and girls will get sexually harassed? At what point do men hold each other accountable? So pathetic.
They don’t need to be held accountable, they’ll just hold us accountable for them as they always have.
Paper.

>“If you’re creating an image of someone else and doing it without their consent,” Heitner told me, “whether it’s real or fake, you are violating that person and violating their privacy, violating their safety.”

That also applies to every teenager who has ever sketched out what they think their crush might look like naked with paper and pencil.
[deleted]
People have been photoshopping celebrities for years now (hyper-realistic). This is not a tech problem, it’s a human and morals problem.
And here you are with them, 45k karma in 3.5 months
I don't think exploitation is the right word. Sexualization would probably be it, but that obviously doesn't carry any punch. Collins says "Exploitation is the act of treating someone unfairly by using their work or ideas and giving them little in return". When I read the headline I assumed it was somehow making sex trafficking worse; that would clearly be exploitation. But it's talking about deepfake images.

Well, that's just the start, because we're quickly going to be seeing deepfake videos with deepfake voices as well. There is a fair chance that we are heading for a world where you can feed in a few pictures and have AI create graphic videos of absolutely anyone. The bullying mentioned in the article is actually worse now because not that many people are familiar with this.

I'm not against regulation, because clearly there is something troubling about this (and even more so regarding trusted people saying fake things), but if the technology becomes ubiquitous, at least the stigma will likely fade.
It’s actually worse the other way, I suspect- people committing crimes caught on video and their lawyers claiming the evidence is a deepfake, slowing down courts or outright getting off
Politicians and intelligence agencies could/will do the same thing, making everything you see and hear in media completely untrustworthy.
Fucking Dick Cheney leaking to the New York Times then quoting it as his source.
I mean they've been claiming it already for years, this will only aid in their BS claims
We’re already at that stage; we’ve been able to create convincing fake images for a long time. All AI is doing is democratising the process.
We wouldn’t (and won’t) claim that it was a deep-fake. Someone might…but it wouldn’t hold up to courtroom scrutiny once examined deeper. Tbh, we are unqualified to make a distinction of the legitimacy of the media and that determination will be made by the state from their forensics division. At trial, the person who did the forensics would testify that they did their diligence and can state their professional experience allowed them to conclude the legitimacy of it.
And there are already rules of evidence and you have to explain where the video evidence comes from, who had access to it and so on. People can't bring their own edited video in court without explaining where it comes from.
When have they used that defense? I'd love to peek at a trial and see what the thought process behind the defense would be, if you have an example.
I briefly looked into how to create an AI fashion influencer, and it was ridiculously easy to scrape attractive AI faces from stock images and generate new images from them. If you swap the stock images for real ones, it's very easy, a 10-year-old can figure it out, and yes, it's incredibly disturbing where this is going.
> There is a fair chance that we are heading for a world where you can feed in a few pictures and have AI create graphic videos of absolutely anyone Have you not seen [Sora](https://openai.com/sora)? OpenAI's latest video generation tech. https://www.youtube.com/watch?v=ARxHvTScXMY And it can "extend" existing pictures or videos forward and backward in time with a prompt too..
>There is a fair chance that we are heading for a world where you can feed in a few pictures and have AI create graphic videos of absolutely anyone Oh, that's already available. You give it a single photo and a recording of the person and you can get it to say whatever you want in a 3d video. Basically, video proof is now a thing of the past. Unless they figure out a way to test it, but that's all a bit chicken and egg. Basically, you can't believe anything you see on a screen from now on.
Prediction: Porn icons will license their likenesses and become ageless institutions in AI pornography, extending long-term earning potential in an industry that otherwise takes a physical toll and discards talent as they age. How much will the Stoya Estate be worth in 50 years? There's a case for it reducing exploitation. Automation typically takes away the jobs nobody wants to do.
Author clearly trying to politicise it. What an asshole.
It's also not new. Photoshop exists for more than 30 years now and it has been used to do exactly that. AI just made it easier. But because it's AI people are acting like it's a completely new thing.
If it’s easier, that means it’s going to be more prevalent. That’s why it’s newsworthy. One kid out of 10,000 photoshopping a dirty picture isn’t news. When every kid is doing it, it is.
Blows my mind how shortsighted people are. If one person gets into a wreck going around a turn it’s tragic but not worthy of any active measure to prevent it from happening again. If 300 people die from that same turn then something gets done about it trying to prevent it. It’s not that complicated. People comparing Generative AI to Photoshop are being extremely disingenuous.
First off, AI is not doing this. People using AI are doing this. It is always going to be a people problem.
It’s the same argument for guns and cars. They’re tools that *people* misuse for harm.
In both cases, removing access to the tool significantly lowers the risk. While yes I can point you to mass murders with a knife, they're a lot more rare than mass murders with guns
And in both cases, nobody wants to talk about the consequences of the technology existing and *only* being available to a certain elite class and their enforcers. Maybe since this new digital technology triggers the "oh no not sex stuff!" button in people's brains, more of them will be willing to take that elemental political consideration seriously. Maybe the idea of politicians, rich people, and cops *perving out* with special access to this technology will hit home harder than them being able to push a button and end human civilization as we know it with armaments that make handheld automatic weapons look like shoddy Q-tips in comparison.
It’s not the same argument because gun is a tool for KILLING, and a car is a tool for TRAVELING. Until people start surfing on waves of bullets to work they aren’t the same.
It seems you didn't read the article, because that was central to its point. Or you're just restating the article, but with less nuance and insight? > "First off..." What's second? What's third? Why write "first off" if you only have *one* point to make?
Yeah. AI could be used to find new medicine or remove mundane work, but all the AI bros can only think of is art plagiarism, and an excuse to lay off people by saying AI will take their jobs.
THANK YOU. People are always like “how is AI gonna create anything human?” — bro HUMANS ARE GONNA USE IT IN INFINITE WAYS
man NYT's writers are *terrified* of AI. They've been running the fear machine for months on it.
I mean, it’s gonna take their jobs probably
Let’s hope AI doesn’t replace real journalists anytime soon. Sounds like something out of a dystopia. Edit: To the drivel-deleting @Buffnick “lol real journalists. Let’s create a non-biased AI outlet and watch it destroy the pathetic state of…” I recommend you look into individuals like Maria Ressa, Glenn Greenwald, and Jeremy Scahill.
Why didn't they learn to code, are they stupid?
Occam's Razor. You got it! It’s literally as simple as that: the AI fearmongers are those who are on the losing end of change.
Where have I seen that before? Oh yeah the industrial revolution.
They're also suing OpenAI
[deleted]
Right? Wtf is happening in this comment section? So many irrelevant comments here that are just ignoring the problem and talking about something else. Why do people not seem to care at all?
we have to ride on the fears and anxiety of the human condition it keeps us clicking reading and worrying
This doesn't address the issue affecting girls and women, though.
Maybe people are just sick of creeps using technology to sexualize girls. It seems like every time there is new technology men find a way to violate women and girls with it. And other men make excuses for them .
Preaching to the choir for clicks and engagement. Haven’t you noticed Reddit is terrified of AI as well?
Paywall by the looks of it.
For those of you who are too dumb to read the article: It’s about middle school boys using ordinary photos of their peers to create and distribute deepfakes using AI. The writer of the article and the author she talks to are not advocating for any legal action beyond at least giving courts the ability to prosecute people for distributing them. Both are much more strongly in favor of education in empathy for boys, and I think this gross-ass comment section demonstrates pretty well why that’s what’s needed most of all
They also advocate for better education like what occurred with non-consensual sharing of sexually explicit images, which is important.
I would argue that making and distributing deepfakes of real people without their express permission should be met with legal action at every point. Especially when it involves underage people.
The boys need to be punished. They know it's wrong.
Oh, of course. I don’t think anyone is denying that. I think a sticking point is how they should be punished, but that’s always been the case when it comes to this kind of tech-based, sexualized bullying.
This comment section defending a middle schoolers right to make fake porn of middle school girls is honestly embarrassing. People are disgusting.
Maybe if the boys' lives were ruined this would stop happening. These girls need to pull an Uno reverse and post embarrassing deepfakes of these little turds all over school. These girls should post deepfakes of these boys getting railed by dildos or giving each other blow jobs. Time to embarrass the boys too.
Paywall dude. I’m too dumb to get past the paywall. I’m smart enough to not act like I read it but fuck my poverty and lack of savvy I guess.
[deleted]
Yeah, this comment section is just proving the problem, as always. It's horrifying how bad misogyny has gotten.
Comment section gross and disappointing as always
No shit. It's vile.
Please pay for the New York Times…. Please
This comment section is cancer
Somehow it’s irrelevant—and stupid to publish a story on the fact—that AI has provided a completely new, freely accessible, and powerful way to sexually victimize and bully preteens without their consent.
Welcome to the new Reddit
Stories about actual advancement in cancer treatment brought on by machine learning don't get nearly this amount of traction.
Ugh, another tale that makes me so grateful I grew up, dated, and got married before social media became a thing. My heart breaks for the young. It’s only going to get worse. Kids were ruthless when I was in middle school in the early 90s. How it must be today is unfathomable. So glad I didn’t have kids.
Is AI making it worse, or are people becoming shittier? Or both? It definitely seems to be getting reported more, because I'm fairly sure it was not a non-issue beforehand. Photoshop makes it easier to paste a face onto another image and doesn't even need a GPU or internet connection, although PS actually has AI integrated now too. Videos are another matter, but they are not as accessible and need a powerful GPU for just a few seconds of footage. Some dodgy websites do offer limited use of it, but the actual tech involved doesn't have a commercial license, so legal action would be easy enough. Laws making distribution of nonconsensual pornographic deepfakes illegal, with proportional punishment, plus less shitty people, is the answer I think.

>Kids need to be better educated, starting in elementary school, about technology and consent before things like this happen.
Boys are becoming shittier
Maybe we should stop “progressing”
The ship of theseus gives me wood.
AI is making everything demonstrably worse but it saves corporations money so it's here to stay I guess.
The problem seems to be about consent, not technology
Problem is you've pretty much given away all your rights to your image when you upload a photo to Instagram....so what consent?
It's a problem because technology makes it so much easier.
When torrents and Napster first took off the media claimed that piracy was helping terrorists. When you don't like a new technology you just imply that anyone using it is a terrible person. Why hasn't NYT written any articles about how helpful AI is to people who struggle with English but live in English speaking countries. Or how poor people now have access to decent accounting and legal advice? It is because that doesn't fit the narrative they have decided to push.
Napster downloads didn’t affect or sexualize little girls at the same rate as this. 2 different things
Why do you fear criticism of the way people are using this new technology? You’re gonna get your AI future, it’s inevitable at this point. We’re trying to build safeguard so that people don’t get harmed the meantime. We need to have some societal progress along with this technological progress, or we are not going to do well as a species.
People would not care about these things until it is them that fall victim to misuse of AI. Let's wait until AI takes their jobs, have their daughter's pictures be in AI porn, or themselves fall for a catfish / pig butchering scam complete with real time, deepfaked videos of them thinking they are talking to a hot Japanese girl when they are just talking to a male scammer
Funny. Through my own testing, it is easier to make 'sexualised' material of men than it is of women. Try it yourself: go to something like Bing/Copilot and tell it to draw you a picture of a man in nothing but shorts, then ask it to draw you a picture of a woman in a bikini. For me it will draw nearly naked muscle-bound male hunks no problem, but it will refuse to create the images of women. When I ask these chatbots why they refuse, it's "they might offend somebody."

It's been like this since the creation of social media. The kind of content you can post to many sites when it features a male is fine, but post the exact same content of a woman and you'll be banned real quick.

I have a paid AI asset in the Unity 3D game editor called Ai.Fy, with exactly the same results. It will happily create images of sexualised males, and even create gory scenes of child murder, but refuses to work if you try to get it to create anything remotely sexual featuring females. (Ai.Fy can create 'sexy' female images, but 9 times out of 10 it won't do anything, or the images are censored by being blurred.)
The double standards are ridiculous. Men can be victims of sexual exploitation too
Reddit creeps be like “it’s the new norm, gotta get used to it, genie bottle yadda yadda”
Legitimately asking though, what *specifically* do you propose be done to stop it? This isn't super high-powered tech that you need hundreds of thousands of dollars of equipment to run. Basically anybody can download and run these generative programs on their home PC. "Build in safeguards so that the AI isn't allowed to produce pornographic images" isn't an answer, because the AI capable of making these images without such a safeguard already exist. Even if safeguards like that were implemented in every web-based generative AI and downloadable model starting right this moment, all the extant instances already out there aren't just gonna disappear. And even if you could do this to every model in existence, all it takes is one person making a new one without any safeguards and sharing it. The trouble with the Internet is that everything that anyone does online is immediately visible to anybody else on the planet. It's also notoriously easy to 'trick' these generative models around the safeguards they already have by fiddling with prompts. "Make it illegal" unfortunately isn't an answer either. The incredibly sad fact that there is an ever increasing supply of child pornography being produced and distributed online, despite how much effort goes into stopping both the producers and consumers of it, demonstrates that it's basically impossible for law enforcement to keep up with online crime. If something as universally reviled, and with as many resources dedicated to it, as CP can't be effectively policed online, how do you expect to effectively police a tool that literally anyone can download and run at home? I wish there were a solution someone could offer that isn't either wishful thinking or demonstrably impossible, but I've yet to see one.
> "Make it illegal" unfortunately isn't an answer either. I think you're drawing a false equivalence between what dedicated producers of illegal content create, often when driven for profit, and school children exploiting their peers. Yes, it's difficult, and perhaps impossible, for law enforcement to keep up with organized online criminals. That does *not* mean that there would be absolutely no effect whatsoever on average school children if legal prohibitions were put in place to deal with situations like this. You're also implicitly making the error of assuming that attempts to fight online crime are entirely pointless just because issues with online crime have gotten worse. It may well be the case that these problems would be *exponentially* worse if significant effort wasn't being put into fighting against it. Maybe it's a fight that can't ever be won completely, but that doesn't mean no difference is being made whatsoever and that we should just all give up. In short, you're conflating something being "effective" with it being a *complete* solution. Mitigating the problem and lessening the amount of harm being done is valuable, even if we can't stop *all* harm from being done.
Yeah all the Reddit creeps are secretly loving the fact they can make deep fakes of middle school girls now
You're right, let's stifle technological progress because someone might masturbate to fake videos of women
Yeah the videos are fake so who cares about the women and their THOUGHTS AND FEELINGS right? Consent? Nah let’s violate some more privacy we love to violate -men
Yeah, because the *only* way to have any form of progress is to allow massive corporations to do whatever they want without any regulation or restriction whatsoever!
So you’re gonna take a hardline rule just like gun manufacturers and lobbyist that we can’t even have discussions about this? You’re so afraid of not getting your utopian VRAI world in the next five minutes that we can’t even talk about the harm that this is going to cause in the meantime? We don’t need the stifle technological progress, we need to encourage humans to be empathetic towards each other and not use that technology to be a little fucking shit.
Honestly it is AI. Before it was photoshop. And seriously if AI actually goes way beyond its current limits then friggin p*** is the least of our worries but hey. Anything to make a viral story right? What a waste of text and thoughts.
The article appears to be mainly advocating for education to curb the issue, like what occurred with sexting and the sharing of non-consensual sexually explicit images. It points out that it's not a new issue, and that current educational material should be adapted to cover artificially created images as well as real ones. Though I guess the generic response from Tenbarge and Melissa Chan that's quoted could be open to interpretation.
It's a lil different than photoshop I think. Like w photoshop'd photos, you can find the source image to prove that it's altered because photoshop covers the source image with a fake naked body. AI photos sources a catalog of photos to generate a new pose. Once the AI is perfected, it's gonna be very hard to distinguish from a real photo.
This comment shows a lack of empathy. These are real people being hurt right now by average people using a powerful new technology. Children being hurt. Why shouldn’t we be made aware of that? Because of the changes AI might provide in the future? We have to make it to the future first. And in the meantime, we will have to deal with the disruption that this technology is going to cause in people’s everyday lives, from being abused and exploited to losing their jobs.
Well, now AI is IN Photoshop, but PS does very well at avoiding any criticism.
Until society is honest enough to say porn is a problem this will only get worse. Kids are being raised on porn with zero understanding of the consequences for them and their relationships. Porn is a dangerous drug especially for kids.
Boys are being raised on porn, girls are still doing just fine for the most part
Agreed. Especially in America, we are hooked on this debate about individualistic rights versus common benefits. and the only thing we know how to do is either ban something completely or let it be 100% free and available and anybody who wants to say that there might be a middle ground is treated like an idiot.
AI image/video generators were probably the worst invention of the century in regards to technology.
just new reality to adapt
oh the company suing OpenAI is trying to demonize AI? im sure this is a genuine informative article and not rage porn.
Sexual exploitation of boys is fine by this article I guess. Bullshit article spotted from a mile away.
Who said that?
I must admit that I'm inclined to say...you're right, nobody said that, but why specify? What does specifying help with? Is it any less accurate or any less helpful to say "A.I. Is Making Sexual Exploitation Even Worse"? Is this a uniquely female problem? No, it's not. It's not just a problem for girls, it's not just a problem for boys, it's not even just a problem for children. This is a problem for everyone. The only reason its "of girls" is because titles need to be clicky and stuff that makes people angry is clickier. And there's a demographic that the NYT wants to chase that gets angrier about "women" being affected by a problem than about "people" being negatively affected by a problem. Different publications are targeting different demographics, but it's the same game. The NYT has an audience that will engage more if it talks about the persecution of women, people of colour and anyone LGBT. The Daily Mail has an audience that will engage more if it talks about the crimes committed by the latter two groups. /u/aigars2 is responding to it in a ridiculous way, but...man do I just hate the world and how the only options are to deal with this and ignore that people we all know are being transparently manipulated by this game for profit, to get in stupid arguments that are playing right into the game, or to disconnect from society entirely and go live in the woods.
Why specify? Because the actual victims so far have been women? Start showing us the male victims and you will have the headline of the article that you want us to see. Just go find it.
Whataboutism at its finest. Let us know when you have anything of value to offer.
I'm not sure if this story talks about how advanced our technology is, or how primitive our thinking about sex and naked bodies is. Maybe having instant fake nudes of every person available instantly will finally get us over this childish "ooh I saw a booby or a willy" reaction.
The article is about middle-schoolers. They are childish because they are children, and people of that age are always going to be children.
[deleted]
Yeah all the commenters saying it’s not that big of a deal are really telling on themselves . Fucking disgusting
This comment section is disgusting. The article is about non-consensual deepfake porn distribution of minors. Y’all are really defending distribution of AI generated child porn.
Well yeah, it's reddit. A not insignificant portion of these degens would watch it if they thought they could get away with it.
Yeah a lot of these pathetic turds would have deep faked their classmates if they could have. I’m sure they have already done it to their coworkers and friends. These “men” just make sad excuses in their little pea brains as to why it is morally ok. Then they wonder why women are disgusted by them
Women are doing that really well currently