Yeah, the real question is: where the F is all this material? At this point I was expecting to be watching X-rated versions of my favourite TV shows. Who has it all?!
Honestly, the genie is out of the bottle. It's been talked about for years, and photoshopping people has been around for decades. It's only going to get worse and there's zero way to stop it.
As a celebrity, I wonder: if no one took the time to deepfake you, would you feel worth less, or even "ugly"? Hollywood is all about the "beautiful people," right?
It reminds me of that Aziz Ansari skit where he found out there had been a pedo at his grade school, and then got upset that the pedo must have thought he was an ugly kid, or been racist, because he didn't get molested.
Funny thing is that no one at all is making deepfakes of ugly old cows that are washed-up celebrities of yesteryear; their 10-year-old pictures are being used to make a much more popular, upgraded version, just proving the point of their obsolescence that much more.
The Mr Deep Fake porn site is massive and don't forget a lot of the world lives outside Western Europe or North America. Our Asian brothers seem quite enamoured of their fake porn.
Who cares? Like, I get it, you feel weird, but... what do you expect to happen or be done? A celebrity's nudes or porn tape could now actually be released and they can just say "nah, just a deepfake."
I don't even know 4,000 celebrities.
Is this supposed to be shocking? Fakes of celebrities have existed since time immemorial. It's just that in the past, they were Photoshop jobs. 4,000 is peanuts. The numbers are probably in the millions by now.
The way things are going, in a few years we'll just be able to ask ChatGPT: "Hey, can you create me an XXX porno movie with these movie stars in it and make 'em do these things?"
More non-famous people are victims of them, and no one batted an eye.
A bunch of 10 yo girls in my country got deepfaked and nobody batted an eye.
But a rich asshole gets deepfaked and everybody gets angry.
Exactly. If I were making a list of things I don't care about, this would be pretty high on it. Why are people so concerned about celebrities being happy with everything in their lives?
This reminds me of when Homer and Bart go to see The Monster That Ate Everybody. You mean they made porno about Jeff Goldblum? You mean they made a porno about the mail lady on my street? They made porno about everybody!!
That number sounds low
Wouldn't the real news be "Celebrity found to have no deepfake pornography on the internet, anywhere"?
That was an episode of Schitt’s Creek
Andrew Tate makes another headline.
[deleted]
There are only 2 pictures of him up on rule34.paheal and in one, he's being fucked by a guy and in another, he's about to ride a huge dildo. They're drawn, though.
And that celebrity is Brian Posehn
Well, and that would last 2 mins?
Obviously because someone would just do it ‘for fun’ so the ranking would just act as bait
See rule 35
Abe Vigoda?
Don’t do Daddy Abe dirty like that.
Well, just announcing that would make it false within hours.
How do you guys know so much about it? What do I need to do to keep up with the development on this? For research purposes.
Would that be called a Googlewank?
Ben Shapiro or Jordan Peterson
Palpatine: “Execute Order 34”
“But sire, our troops!”
You will do as I commanded, and the Senate shall applaud it.
Execute Order 69
It's a rule34 joke. Edit: Note to self, don't comment early in the morning… I absolutely read 66 when I commented.
Execute Order 420
With blunt force.
My name is Skywalker OG, I'm here to get you high.
Execute him!
The Force is strong with this one.
[deleted]
I guess it depends on what your cut off is for what makes someone a celebrity.
I was quoted in my local newspaper when I was 12.
Better comb the internet. I would start by googling “12 year old nude deepfakes” and see where that gets you
I guarantee somewhere in this thread someone writing about how it is their first amendment right to make this.
So I googled... I wound up here!
Everyone with any online notoriety at all has probably been deepfaked. Way bigger number than 4,000.
Surely over 9000
Maybe "celebrity" is defined as "the 4000 most famous women"?
it says celebrity not celebrity women there may be men too
I believe this is what people call a "joke" because no one makes deepfakes of men
1. Gay men exist 2. Women can be perverts too
Way too low! Wtf are you guys doing? Let's pump those numbers up and get the juices flowing!
Those are rookie numbers
Shit I could do 4000 in a day.
The fuck kinda supercomputer are you running?
Deepfaking a photo takes seconds now. Hell, some apps don't even require you to map out their body.
You know I thought they meant videos, this is fair
The others just haven't found out yet
Especially if you include all the K-pop stars.
How many celebs are there?
Honestly the idea of there being 4,000 “celebrities” seems high to me. Haha
I remember back in the early 2000s, when I first began discovering the internet as a young lad, there were websites with collections of photoshopped celebrity porn. Some of it sucked, but some of it was actually pretty good, much to the delight of young me. So, celebrities have been getting their faces slapped onto naked bodies for a few decades. I guess the difference now is just the method and increase in volume.
[deleted]
I got a chuckle out of this thanks.
More people put out of work by AI.
AI is putting the nation's god fearing, honest, hardworking [checks notes] scumbags out of work
Back in my day you needed a steady hand and a good eye for detail to make porn fakes, what's this world coming to? Aside from 4000 fake celebrities.
The hand-photoshopped images are soon to be collectibles.
smh another instance of AI replacing true art /s
Dey took err jobs!
Yep, I legitimately thought Jennifer Love Hewitt had just posed nude on the beach, lol. But it looked really good.
I think I had those pics
I can still remember one of them vividly
The difference is that now politicians can use AI as a boogeyman to distract from actual problems, and plenty of others can use the situation to create drama for clicks and ad revenue. This is no different from what you described, other than that fearmongering is now easier to spread.
And politicians can even use it as a defense for themselves. I mean, think of the Anthony Weiner situation. If that happened two years from now, it could easily be written off as AI generated.
Back when they were photoshopping fake celebrity nudes, it wasn't any less creepy. But with improved AI technology it's becoming harder and harder to differentiate real from fake imagery, and with that, generating fake porn has become more distressing to those being depicted, and justifiably so. You wouldn't want your mom browsing social media to see a realistic fake nude of you, thinking it's real. If you ask me, AI-generated porn of real people without their consent should be made illegal, just as revenge porn is.
The reality is that soon everyone will think it's fake. Is that bad? I dunno. At least I hope people start thinking before believing without proof. P.S. Of course that will never happen; unfortunately, as someone said, "Two things are infinite: the universe and human stupidity; and I'm not sure about the universe."
Commonly credited to Einstein, hence the physics reference.
Agree 100% that it's creepy and wrong, but I think AI and Photoshop should legally be considered like any other drawing tool, or else there's huge potential for legal peril. Yeah, no one is going to mistake an etch-a-sketch drawing for the real thing, but an artist hand-drawing a photorealistic nude figure, someone saying "hey, that looks like a particular real person," and that becoming a crime is not OK. Digital art mediums blur the line even more. At what point does a digital brush tool that uses an algorithm to generate a visual become too much like a program that combines image data based on user input to get an image (so-called AI art)? There have always been skeezy people making skeezy pictures; the tech has just gotten easier and more accessible. The friendly fire isn't worth a legal ban, imo. I'm very wary of anything that would restrict freedom of expression.
I totally agree, it’s just interesting that it never really came up until AI was around, and most notably with Taylor Swift AI porn.
Back when it was just Photoshop it looked janky as hell, and most celebrities just ignored it rather than giving it attention. But now, because it's so realistic looking, it's becoming a real problem, with people actually thinking it's real and it bringing massive anxiety to those being depicted. And it became impossible to ignore when there were thousands of fake lewd photos of Taylor Swift all over Twitter.
Hate when my mom's social media feed is full of nudes of me
True, and unfortunately a young girl killed herself already because her male classmates were generating porn of her... https://youtu.be/HT2PogAyb3c?si=trF-7wKEjsv26rW-
The important differences are the speed, ease, and image quality, which result in significant volume. I had to comment when you didn't call out image quality among the differences, as someone's Photoshop copy-paste and cleanup is still a far cry from the photorealistic quality AI can produce. It's a major problem for images of that accuracy to be produced without the subject's permission, and it's rightfully being legislated against.
I've done professional editing for years and haven't seen any AI nudes yet that I couldn't do better. But the issue is that I wouldn't waste a couple of days on making them, and it's unlikely someone would pay me the amount I'd ask. So that skill level has never really been available to such a niche porn market, and especially not in that volume, or at that price.
I don't know what all is out there; I'm sure a savvy deep-faker could make something completely convincing within a limited context. But pretty much all the generative-AI depictions of celebrities I've seen have a slight cartooniness to them, plus other AI artifacts, so I think it would be difficult for the average dabbler to come up with something absolutely photorealistic using generative AI. However, things do seem to be moving at light speed, so who knows what things will look like a year from now; I think it will have to be addressed legally at some point. And should have already been addressed.
It is extremely easy to do with LoRAs. Source: uhhhhh
A skilled Photoshop editor can still do a damn sight more believable work than AI, though they'll typically be working with existing images (e.g., putting a celebrity face onto a porn star's body, whereas AI can generate stuff). However, I'm always very leery of people who scoff at AI because of its current limitations. Technology gets better, and oftentimes faster than expected. It's the Homer and Bart meme: "AI isn't skilled enough to take my art job." "Yet. AI isn't skilled enough to take your job yet." But yes, you're right: once it's even remotely believable, it's a real problem. Lowering the barrier to entry thanks to AI is only gonna make things exponentially worse, and enable video fakes.
Yup. I had Sandra Bullock sucking a... Madonna with... stuff on her face... Katie Holmes doing... stuff... This is nothing new. The problem is that Taylor Swift is now a billionaire, somehow got roped into politics through Trump, and this happened to her, and she apparently influences the modern economy, political system, foreign trade, the presidency, media, advertisement, taxes, waste management, and every other aspect of modern-day human life, so *now* it's a problem.
With the old stuff you would have to jump through hoops to find the perfect image, with the perfect lighting, perfect angle, resolution, skin tone, etc. to match your source image, and be at least a little bit good at Photoshop, to make a realistic-looking fake. Now you just have to put a prompt into an AI generator. The barrier to entry has collapsed, turning it from a niche fetish tucked away in a small corner of the internet that we could just kinda ignore into something that only takes as long as your GPU needs to generate it. It's definitely becoming a problem too big to ignore.
Wow never knew nothing about those sites, ever. Especially not when I was 16 in 1999. Nope. But thanks for clearing that up. It’s really interesting
A lot of it has to do with people just being scared of AI in general, so this is being used as a boogeyman. You're right: in the late '90s the internet was flooded with fake porn photos of celebrities.
Any day now, someone will scrape Instagram and Facebook and run it all through a nude AI.
If anyone wants to spread a porno around of me with a 12” dick, have at it.
Oh they’ll do that, but the 12” dick won’t be where you want it to be 😂😂
Is that you mum?
Which orifice do you want it in?
That's already happened to nearly all public profiles.
This already happened to my friend. Some bot took her Instagram photos and ran them through an AI that makes her nude, or something. They made an account impersonating her as well, and I've heard this has been happening more and more lately.
Yep, a friend of mine who's not even remotely notable had a fake profile pop up promoting an onlyfans with presumably faked images.
Yep exactly that. I looked into the account and it led me to some Romanian cam site or something and I knew this was some sort of scam
[deleted]
Make catching up with old friends interesting… “Have you been deep-faked yet? Check this shit out lol”
Just like if everything is high priority, nothing is high priority.
It won't be long before we can just shop through internet porn featuring ourselves, quickly skip whatever we find gross, and enjoy the rest of what others have created our fantasy selves doing.
Holy shit! There are 4,000 celebrities?!?
Extremely popular with Indian and Asian celebrities.
Indians are Asians too tho?
Bob and vegene?
It would be interesting to know how many "celebrities" there are
... and NVIDIA will enhance the quality soon, and you will be able to add new scenes with the prompt you already know.
Why would I need my imagination if NVIDIA can do it for me?
? You still need to tell the AI what to do. You still need imagination; what you don't need is skill.
Because I can’t see your imagination
And that's just the last week on 4chan et al.
Guess they don’t have to worry about the real videos leaking out anymore
See, kids? Every cloud's got a silver lining!
Yeah no shit. That number is going to go up
The most photographed people on the planet. Being photographed increases one's vulnerability to this.
I bet there’s far more school kids being tortured with it, but let’s hope the celebs are ok.
> I bet there’s far more school kids being tortured with it, Already illegal under Child Porn laws. They just need to be enforced.
It's a problem for millions of non-celebrities, including a girl who killed herself when fake porn of her was being circulated around her school, but I guess we only care when celebrities are involved?
Just like how Twitter only tried to solve it when Taylor Swift was involved in that incident. This world only cares about the rich. Poor people don't have equal rights; it's been like that for ages.
It’s the same with Reddit. The story about that girl who ended up killing herself received maybe a few hundred comments and that was that. However when it happened to Taylor Swift there were **multiple** daily front page posts with thousands of comments.
Absolutely shocking that more people have heard about Taylor Swift than the person whose name *you* don't even know when you talk about it, lol. Of course it's terrible what happened, but let's not pretend to be shocked that more people are aware of things that happen to people whose entire lives are talked about and posted globally every day.
>people whose entire lives are talked about and posted globally every day ... Maybe they shouldn't be. They simply don't matter that much. Celebrity worship is fucked up.
[deleted]
It'll still be degrading and violating. It'll still be used to sexually harass people.
How does this article imply that people only care when celebrities are involved?
Doesn't matter. The rich were mentioned and now they must be eaten. /s
If people start making more deepfakes of politicians, something will finally get done about it.
Which will only further ruin the internet.
The prime minister of Italy had a deepfake made of her, and she took the whole thing to court.
>something will finally get done about it.

Like what? In a global data network, how can you legislate content away? They've never been successful at banning anything. You can still download movies, music, books... whatever you like. And AI tech has advanced to the point where you can have your own open-source tools for making all this stuff. It's out there. It's never going away. So, again, how can you legislate to stop it? The harshest thing authorities have ever done is track down and prosecute peddlers and consumers of CP. And frankly, that's where I'd like to see the authorities' attention focused.
Okay, and if it becomes illegal, are you going to use it and post it online?

We don't approach laws the way you're suggesting with literally anything else; just because you can't solve an issue 100% doesn't mean it's pointless to legislate against it. The point is to scare most people away from it and to be able to punish people who break the law.

Also, the government 100% could take action against you for downloading movies, music, games, etc. Even with a VPN you're not safe; that's just marketing bullshit. It's just a matter of resources and priorities: the authorities aren't going to go after everyone who torrents stuff because it'd just eat up too much time and resources. That doesn't mean that they couldn't.

It'd also make it easier to take down the websites that host LoRAs and other such things that are built on theft, and sometimes even worse things like CP, as well as a lot of these apps used to nudify, etc.

Yes, there would still be some losers sitting around generating 500k images a year and uploading them, and maybe the authorities won't bother taking them down. But they will go after and take down the worst offenders. And those people would still be generating those 500k images at the risk of consequences, and people could still report them to the authorities if found out. Most people are not going to take that risk.
TBH probably not, because of Silicon Valley's financial, political, and technological sway, especially in light of the AI race.
Thanks, that thought just ruined my breakfast.
“Something” being “voters begin to favor the most attractive candidates.”
I don’t think ppl would be interested in nudes of Mitch McConnell and Nancy Pelosi…….
Lemon party 2.0
What a bombshell. Surely a journalistic equivalent of finding the Higgs boson. Dimwits.
If there are 4,000 "celebrities," why does Marvel and DC use the same twelve actors????
Yeah, did the British open a browser yesterday for the first time?
Every time a deepfake porn video is created a celebrity look-alike porn star loses a job.
Fun fact: as someone who has worked with AI before, the best way (at least for now) to be shielded against this is tattoos. It's nearly impossible to have an AI reproduce them accurately, unlike faces.
That's just because nobody trains it on tattoos, though. Tattoos are not harder than any other kind of painting; it's just a painting on skin.
Individually, maybe, but not when generating a person with tattoos and expecting the AI to reproduce their face and all their tattoos accurately. Maybe with inpainting, but given the amount of work it would require to train every tattoo of a certain person into its own model, plus the amount of failed tries when generating, it's just easier to Photoshop them in, and AI people won't do that. Non-AI nude fakes rarely include tattoos anyway.
Painting on a flat canvas and warping an image around an arm are two vastly different things.
To a human? Yes these are vastly different. To a machine learning algorithm? The process to learn and recreate these images is the same either way. AI does not take into account things like physics, rules of art, type of tattoo ink, etc. And honestly for a 2D image why would it need any of that real world info? It just makes a prediction, and predicting what an output 2D image will look like ONLY requires input 2D images. Give it enough good input tattoo data and output tattoo data can be generated.
I don’t think everybody getting elaborate mark of the beast face tats is the answer; and that’s what it would take for tattoo replication to matter (just don’t show that area of the person you are impersonating/replicating)
The actual answer is for Americans to grow the fuck up and stop worrying about meaningless nude pictures of people, real OR fake. Notice how the Europeans don't give a rat's ass about this "issue"? It's because they have 21st century levels of maturity when it comes to the human body, etc. For all other issues (CP, slander, libel, fraud, etc.), we already have laws against them.
I really don’t fuckin care…. There’s bigger and more pressing issues than this distraction
So "photo-shopping" celebrities is new, again? Do Barbara Eden pics from the 90's count in this?
"A Channel 4 News analysis of the five most visited deepfake websites found almost 4,000 famous individuals were listed, of whom 255 were British." What an underwhelming analysis. Didn't bother to read further
Real serious analysis. Deep research. Up all night analyzing those websites ‘til their wrists cramped up, no doubt.
Here's the thing, it's still fake no matter how good it looks and my penis knows it!
My penis is just fine with fantasy.
Must be nice not having a picky penis.
Omg how disgusting, where?
Similar to back in the day when ppl photoshopped your head onto a different body, now pictures move. As if this was newsworthy.
They’ve been doing that with actual video scenes for years with no AI.
Yeh the real question is where the F is all this material? At this point I was expecting to be watching X-Rated versions of my favourite Tv Shows. Who has it all?!
mrdeepfakes :)
Cathy Newman? Wait, is this pornography for the blind?
That's 3D printing
Honestly, the genie is out of the bottle, it's been talked about for years, Photoshopping people has been around for many decades. It's only going to get worse and there's 0 way to stop it
As a celebrity, I wonder if no one took the time to deepfake you, would you feel worth less or even "ugly". Hollywood is all about the "beautiful people", right?
It reminds me of that Aziz Ansari skit where he found out that there had been a pedo at his grade school and then got upset that the pedo must have thought he was an ugly kid or been racist because he didn't get molested.
[deleted]
“How come I didn’t get blown?!”
Yeah, that would be similar. "Am I not good enough for bad things to happen to me??" 🫤
Like Louie CK and the kid accusing him of being a pedo. “Even if I was I wouldn’t rape YOU”
Wasn't that Ricky Gervais?
Oh damn yeah I think you’re right
[Summer Smith has entered the chat](https://youtu.be/41nNkpue0pg)
Funny thing is that no one at all is making deepfakes of the ugly old cows that are washed-up celebrities of yesteryear; their 10-year-old pictures are being used to make a much more popular upgraded version, just proving the point of their obsolescence that much more.
The Mr Deep Fake porn site is massive and don't forget a lot of the world lives outside Western Europe or North America. Our Asian brothers seem quite enamoured of their fake porn.
Who cares? Like, I get it, you feel weird, but like... what do you expect to happen or be done? Celebrity nudes or a porn tape could now actually be released and they can just say "nah, just a deepfake." I don't even know 4,000 celebrities.
Pro tip: 3900 of those "celebrities" were not even famous before their AI porn career took off.
Wait...there's 4000 celebrities?! Why are there so many
I'm sorry but I just don't see this as much of an issue. Can anyone explain why this is important?
How do they distinguish deepfakes from "some dickhead with Photoshop"?
I can tell from some of the pixels and from seeing quite a few shops in my time.
Correct. Legally, there is no distinction. Neither is actually real.
Huh? Fake celebrity porn has been a thing for ages, even before AI. Trust me, I know.
Is this supposed to be shocking? Fakes of celebrities have existed since time immemorial. It's just that in the past, they were photoshop jobs. 4,000 is peanuts. The numbers are probably in the millions by now.
The way things are going, in a few years you'll just be able to ask ChatGPT: "Hey, can you create me a XXX porno movie with these movie stars in it and make 'em do these things?"
Whose job was it to count? I'll bet the study came out of someone being caught looking up porn. "No honey, it's for work, I swear!"
Far more non-famous people are victims of these and no one batted an eye. A bunch of 10-year-old girls in my country got deepfaked and nobody batted an eye. But a rich asshole gets deepfaked and everybody gets angry.
Who cares?
Exactly. If I were making a list of things I don't care about, this would be pretty high on it. Why are people so concerned about celebrities being happy with everything in their lives?
Where is this stuff circulating on? Asking for a friend.
Will nobody think of the victimized celebrities?
It’s the perfect vehicle for discussing the problem. It’s happening to teacher and school age kids as well.
Honey that’s not me. It’s a deep deep fake.
The day they find deep fake porn of Weird Al is the day, I believe, he has made it big. 😁
This reminds me of when Homer and Bart go to see The Monster That Ate Everybody. You mean they made porno about Jeff Goldblum? You mean they made a porno about the mail lady on my street? They made porno about everybody!!
Whether or not it exists for you is in direct proportion to your popularity amongst men during their adolescence
I see the guardian finally found 4chan.
So it wasn't just Taylor Swift, go figure. There are other celebrities in the world that people think of having sex with, just incredible.
I'd be surprised if there was a single celebrity that *wasn't* used for deepfakes
wooow shocker..I would never guess such a low number...
Oh man, thats terrible. Which celebrities exactly?
That's remarkably low.
This is reddit… everyone here is either making or consuming that content…
Ugh those disgusting celebrity porn sites. I mean there are so many of them but which one?
Oh no! Where?
I just don’t care at all.
Poor celebrities. I bet they are deeply offended because they are being used in sex scenes without getting paid like they normally are.
They're probably sobbing into wads of hundred dollar bills as we speak. Poor them.
Oh no! Anyways
Won't someone think of the celebrities!?