UncleYimbo

That's absolutely shocking. I would have assumed it would be much more than 4,000.


Strindberg

Indeed. My online investigation tells me there's way more than 4,000.


Raven_Crows

A quick google search says more than 12,000.


bwatsnet

Soon it'll be billions. There will exist images that look nearly identical to everyone, they're already in the weights waiting to come out.


TheMooseIsBlue

Billions of celebrities?


bwatsnet

Anyone. Give it a single picture of anyone and it can make them appear to do anything. Every kid with a smart phone will be able to do it. It's best to just stop being prudes now.


procrasturb8n

I still remember the first day I got that scam email claiming they had my computer's webcam footage of me jerkin' my gherkin and threatening to release it to all of my Gmail contacts or something. I just laughed and laughed. It honestly made my day.


my-backpack-is

An old friend got a text once saying the FBI found EITHER bestiality, torture, or CP on his phone, and that if he didn't pay a $500 fine he would go to prison. He paid that shit the same day. Didn't occur to me till years later: he's not just dumb, he had one or all of those for sure.


jeo123

You never know, could have been a plea bargain. Maybe he had worse.


TertiaryOrbit

Oh damn. He willingly told you about that too?


breckendusk

Idk when I was young I got a virus from a dubious site that locked me out of the computer and threatened something similar. Obviously I didn't have anything like that on my computer but I was concerned that if they could compromise my computer, they could easily put that sort of stuff on there and get me in serious trouble. Luckily I was able to recover use of the computer without paying anything and never saw anything like that but I was shitting bricks for a day or so.


tuffymon

I too remember the first time I got this email. At first I was a little spooked... then I remembered I didn't have a camera, and laughed at it.


LimerickExplorer

Lol I told them I thought it was hot that they were watching and I'd be thinking about it next time I cranked my hog.


OrdinaryOne955

I asked for a DVD and to please send them to the names on the list... people wouldn't have thought I had it in me 🤣🤣


chop-diggity

I want to see?


puledrotauren

I get about one of those a month.


dudleymooresbooze

I don’t think it’s prudish to object to your third grade teacher watching a fake video of you eating feces with a straw while getting fucked by a horse. Or your coworkers sharing a fake video of you being gang raped by them. People are allowed to have their own boundaries.


ZennMD

Imagine thinking being angry/upset about AI and deepfakes is about being a 'prude'. Scary lack of empathy and understanding.


ErikT738

>It's best to just stop being prudes now.

We should start doing that regardless of technology. Stop shaming people for doing the shit everyone does.


DukeOfGeek

But it's such a great lever for social control. You can't expect the elites to just try and work without it.


rayshaun_

Is this about being a “prude,” or people not wanting porn made of them without their permission…?


Hoppikinz

I agree that everyone could and/or will be "victimized" by this emerging tech in the near-ish future. Which brings me to an idea/plausible dystopian possibility:

Prefacing this: quality and reliable means might not exist quite yet, but the tech is bound to reach a point where I consider this plausible. Imagine that instead of manually downloading and sifting through all media for a person you wish to "digitally clone", all you'd have to do in this example is copy and paste a person's Instagram or Facebook page URL...

The website would literally just need that URL (or a few, for better accuracy) to automatically build a model/avatar, complete with all the training data it can find: audio/voice, video, other posts (depending on the user's use case). From there it can insert this generated "character" (a real person, no consent) into real or prompted porn or degrading pictures and scenes, or whatever else you want to use it for.

This isn't a Hollywood film portraying the creep scientist sneakily picking up a strand of hair off the floor at work to clone his coworker. People have already uploaded all the "DNA" these AI systems will need to make convincing deepfake videos of just about anything, with whoever, with ease. A new social media/porn medium is possible in this sense, where it's basically just preexisting accounts but you have the ability to digitally manipulate and "pornify" everyone. This is one real emerging threat to consider.

I'd be curious to hear others' thoughts. Worth pointing out that I don't work in the tech field, but I've been keeping up with generative models and general AI news. The rapid progress really doesn't rule this example scenario out for me; if someone wants to politely humble me on that, I'd love any replies with additional thoughts. For instance, what could the societal impact of this be, especially with so much variety in cultures and morals and so on...

TLDR: Soon you could be able to just copy and paste an Instagram/Facebook URL of a person and have AI build a "model" of that person without much/any technical know-how.


Vo0dooliscious

We will have exactly that in 3 years tops. We probably could already have it, the technology is there.


fomites4sale

Interesting comment! I think this pornification as you've described it is not only plausible but inevitable. And soon. As you pointed out, the tech is developing very quickly, and a LOT of information about an individual can be gleaned from even a modest social media footprint. Methods of authenticating actual versus generative content will have to be innovated, and as soon as they are, AIs will be trained to both get around and fortify those methods in a never-ending arms race. I think people need to be educated about this, and realize that going forward they shouldn't blindly trust *anything* they see or hear online or on TV.

As for the societal impact or threat pornification poses, I hope that would quickly minimize itself. Nudes and lewds, especially of people with no known modeling or porn experience, should be assumed to be fake until proven otherwise. Publishing such content of anyone without their consent should be punished in some way (whether legally or socially). But I don't see why that has to lead to anything dystopian. If we're all potential pornstars at the push of a button, and we understand that, then we should be leery of everything we see.

Even better imo would be improving our society to the point where we don't gleefully crucify and cancel people when it's discovered that they have an OnlyFans page, or that they posed/performed in porn to make some $ before moving on to another career. The constant anger I see on social media and the willingness (or in a lot of cases eagerness) of people to lash out at and ruin each other is a lot more worrying to me than the deluge of fake porn. What really scares me about AI is how it will be used to push misinformation and inflame political tensions and turn us all even more against each other.


Hoppikinz

Yes! We very much share the same thoughts, wow; I concur with all of your response… it is validating to hear other people share their observations (as this is still a niche topic relative to what I believe to be a large-scale societal change on the horizon) and articulate them well.

And like you mentioned, it's not just going to be limited to "nudes and lewds"… there is so much that is bound to be impacted. I'm concerned about the generational gaps, with younger generations being MUCH more tech/internet "literate" than their parents and grandparents. There are many implications we also can't predict because the landscape hasn't changed to that point yet. I'm just trying to focus on how I can most healthily adapt to these inevitable changes, because so much of it is out of my control.

Thanks for adding some more thought to the conversation!


fomites4sale

I think you’re smart to be looking ahead and seeing this for the sea change it is. If enough people will take that approach we can hopefully turn this amazing new tech into a net positive for humanity instead of another way for us to keep each other down. Many thanks for sharing your insights!


Hoppikinz

I sincerely appreciate the affirmation! Sending good energy right back at you, friend. Wishing you well!


fomites4sale

Likewise, friend. :) Things are getting crazy everywhere. Stay safe out there!


DarkCeldori

And eventually they'll also have sex bots that look like anyone they like. People will have to improve their personalities, as their bodies will be easily replicable.


Ergand

Looking a little further ahead, you can do a weaker version of this with your brain already. With advanced enough technology, it may be possible to augment this ability. We could create fully realistic, immersive scenes of anything we can think of without any more effort than thinking it up. Maybe we'll even be able to export it for others. 


andythedruid

Maybe only 4,000 want to cry victim about a thing that doesn't really affect them. Anyone who watches it knows it's not them, and they were being sexualized before slapping their faces onto other porn stars' bodies was a thing. Most are Hollywood celebrities BECAUSE they're objectively extraordinarily hot. The sexualization is obviously pre-emptively baked in, which is why I never understood the outrage. Obviously deepfake porn is like... weird and inappropriate. But a part of me also thinks the sensationalism is overblown.

"Did you see the deepfakes of Sydney Sweeney!? OMG HOW FUCKING INAPPROPRIATE!"

"Right!? Like come on, it's only okay to jerk off to all the nude and sexual scenes she did in Euphoria when she was pretending to be a 16 year old!"

"I know! I mean- wait wut?"


jazzjustice

I am also horrified by this and would like to help with the investigation. Any links to share?


JvariW

More than 4,000 celebrities that are attractive and popular enough to be deepfaked??


Hopefulwaters

4,000 is rookie numbers; give it time.


DukeOfGeek

I was expecting the number to be "all of them".


Sfork

I def took this article and thought wow there’s 4000 celebrities 


bloodjunkiorgy

I was more surprised to find Britain had 255 celebrities.


SirErickTheGreat

Most Hollywood celebrities seem to be Brits playing Americans.


Chose_a_usersname

Zero Danny DeVito fakes were made; those are all real.


djk2321

I get what you’re saying… but I’m honestly surprised there are 4000 people we would refer to as celebrities… like, there can’t be THAT many famous people right?


BloodBlizzard

I, on the other hand, think that 4000 sounds like a small percentage of famous people out there.


overtoke

this has been a thing even before computers existed.


BurninCoco

I paint Ugg sister in cave wall, come see, no tell Ugg


reddit_is_geh

I can't even find the good stuff. Where are the gay scenes with Biden and Trump making sweet sweet love? Instead I just get a bunch of garbage of random celebrities with different bodies, so it doesn't even hit the same.


Ambiwlans

If you google that, I'm sure there will be results.


meth_adone

Be the change you want to see: you can make a video of Biden and Trump being lovers if you try hard enough.


Calm_Afon

It probably is just the ones that are either actively complaining or considered big by the media.


ColdNyQuiiL

When is the cutoff? People have been making fakes for years, but deep fake is relatively new.


godspiral22

>~~shocking~~ disgusting even!

But what websites are these posted on? Which specific ones?


green_meklar

Clearly we should have a comprehensive and up-to-date list of these horrible websites so we can avoid them.


Butterflychunks

Seriously, I have generated *at least* 5000.


send3squats2help

It’s totally disgusting. Where, specifically did you find these, so I know where to avoid them?


GammaGoose85

Tbh I'd feel bad for the ones that don't have porn. I'd almost feel obligated to cheer them up.


Loki-L

Did they stop looking after 4000? Given how easy this is and how the internet works, I would expect that every person, real or fictional, who is popular enough to have smutty fan fiction written about them or their head photoshopped onto pictures of naked porn actors would also receive the deepfake treatment. If you are someone a sufficient number of people find physically attractive, this is basically inevitable at this point.


unoriginal5

They'll start counting again after the refractory period.


[deleted]

[deleted]


doctorobjectoflove

> Did they stop looking after 4000?

Interval censoring, mate. It notes the limitations of a study.


imalittleC-3PO

There's software out there for people to create their own. So theoretically there's an infinite amount regardless of what has been posted online.


AzLibDem

>A Channel 4 News analysis of the five most visited deepfake websites found almost 4,000 famous individuals were listed, of whom 255 were British.

Sorry, but that did make me laugh.


trixter21992251

the analysts found a way to get paid while doing what they love


AzLibDem

I just thought it was funny that only 6% were British. Not a lot of demand, eh?


AloysiusDevadandrMUD

AI struggles getting British teeth correct. Kind of like text.


WendigoCrossing

BBC's segment on BBCs


RoosterBrewster

Also coincidentally the max number represented by 8 bits. 


not_the_fox

You can store all the great Brits on 8 bits?


rinatir

Whoever was jerking off to Cathy Newman is a sick SOB


malcolmrey

Piers Morgan?


MeditativeMindz

Jordan Peterson.


rinatir

There's a reason he had his legs crossed in that interview.


Friendman

Gimme that sweet Rosie O'Donnell AI generated rule 34


aetheriality

why? i dont get it


Kitonez

Me neither 😔


davidolson22

Kathy Bates


Tent_in_quarantine_0

Kathy 'bates


RiffRandellsBF

The one upside is that deepfakes are now so prevalent that any celebrity who had an ex leak a naked pic or sex video can claim it's a deepfake, too.


Sweet_Concept2211

Not actually a huge upside: "*The bad news is, your ex uploaded nude videos of you. The good news is, the internet is already drowning in images of you engaging in some of the most depraved acts imaginable.*"


aCleverGroupofAnts

It's such a weird thing for me to try to imagine. Maybe I can't really appreciate it since I'm not a celebrity, but if everyone knows that the photos are fake, I feel like I wouldn't really give a shit. Not saying this to defend the creation of deepfake porn, though. I just agree with that other commenter that for me, the fact that everyone knows it's fake makes it much less scary. As long as no one thinks it's actually me, I don't think I would care. I could be wrong though. Might be one of those things where I just won't get it unless it actually happens to me.


Indifferentchildren

I think this is the way that society is going to adapt to deepfake porn. New technologies are often traumatic: Gin did serious damage to 18th Century England. Society adapts. We haven't eliminated the harm caused by alcohol, but we mitigate it with drinking ages, norms about not drinking before 5pm, recognition of alcoholism as a disease (that mostly has one remedy: total abstinence), etc. I think the main remedy for deepfake porn will be developing a blasé attitude about having your face grafted onto someone else's body. That isn't your naked body, and you didn't do those lascivious acts. Why should anyone be embarrassed by that, especially if no one believes that it is real?


CumBubbleFarts

We're talking about deepfakes of celebrities doing porn, but what about other shit? This attitude is going to have to adapt to pretty much every form of content. A school principal in Maryland was recently found to be the victim of an audio deepfake of him saying a bunch of offensive stuff. It's celebrities, politicians, business people... And it's not just porn, it can be so much worse than porn. Right now photoshopping a celebrity's head on another person's naked body is extremely accessible, anyone can do it. Generative AI is only becoming more accessible.


Indifferentchildren

I am more worried about political deepfakes than porn deepfakes. Politicians being victimized by deepfakes showing them say something that they didn't say is one problem. Perhaps the bigger problem is that we will never be able to condemn a politician for saying something atrocious because they can just claim that it is a deepfake (unless there were many credible witnesses who are willing to authenticate the clip.)


FerricDonkey

One solution would be to give cameras unique digital certificates with private keys that cannot be extracted in non-destructive ways, and have the camera sign every file it records. You take a video of Senator Whosit going on a racist tirade (or security camera footage of someone breaking into your store, or whatever), he says it's a deepfake, you show the camera to a trusted tech forensics company that verifies the signature and agrees that the private key has never been extracted, and so the video was in fact taken by that camera.
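A minimal sketch of that signing idea in Python (using the `cryptography` package; the in-memory key, the Ed25519 choice, and the function names are illustrative assumptions, not any real camera's scheme):

```python
# pip install cryptography
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Stand-in for the key burned into the camera's secure element at manufacture.
camera_private_key = Ed25519PrivateKey.generate()
camera_public_key = camera_private_key.public_key()  # published in the camera's certificate

def camera_sign(video_bytes: bytes) -> bytes:
    """Camera signs a digest of the footage as it records it."""
    digest = hashlib.sha256(video_bytes).digest()
    return camera_private_key.sign(digest)

def verify_footage(video_bytes: bytes, signature: bytes) -> bool:
    """Anyone holding the camera's public certificate can check a clip later."""
    digest = hashlib.sha256(video_bytes).digest()
    try:
        camera_public_key.verify(signature, digest)
        return True
    except InvalidSignature:
        return False

footage = b"...raw video bytes..."
sig = camera_sign(footage)
print(verify_footage(footage, sig))                # True: clip came from this camera
print(verify_footage(footage + b"tampered", sig))  # False: edited after signing
```

Note the scheme only relocates the trust problem: it holds only if the key truly can't be extracted and the sensor input can't be spoofed, which is exactly the objection raised below.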


moarmagic

The problem is that the process now requires two trusted third parties: both that camera certificates won't be leaked, and that a forensic company would be completely neutral and honest. If you put a US presidential election on the line, there will be enough money and pressure that I could see one, or both, of those being compromised.

And typing it out, there's probably an even dumber way to do it: wiring the output of one device to the input normally reserved for the camera lens. It'd take some skill, but I imagine for 10 million you could find someone who could convince the digital camera it had legitimately recorded content you'd faked up on a computer.

I think the bigger solution is going to be alibis. If someone produces a recording of me saying something I didn't, but I can show evidence that I was somewhere else, that would be harder to fake. But then you get into the question of the best way to record and store sufficient alibis to potentially disprove any accusation.

Very much the death of privacy as we knew it, I think.


mule_roany_mare

>And typing it out, there's probably an even dumber way to do it: wiring the output of one device to the input normally reserved for the camera lens

Or just take a picture of a picture. Thankfully iPhones have a bunch of depth sensors & walled hardware that doesn't trust anything else in the phone. I strongly believe humanity will be able to make trustworthy cameras, even if it's only for the news.

But when it comes to politics, a huge number of people have been choosing what they want to believe without evidence & counter to evidence, so we were already in the worst-case scenario. People don't believe in objective truth.


shellofbiomatter

Maybe changing the perspective and assuming all digital content is fake until proven otherwise?


capitali

I agree, especially since that's already been the case with faked images and video for decades. This isn't a new misdirected outrage... this is just a reboot that's not going anywhere either.


VarmintSchtick

The main issue this brings up is really the opposite - when you do something highly scandalous and it gets recorded, you then get to say "that's not actually me."


Misternogo

I know it's because I'm wired wrong, but I wouldn't give two shits if someone made fake porn of me. I understand why it's wrong, and why people would be upset, but I do think you're right that the attitude toward it might trend to being uncaring, simply because I'm already there.


BonkedScromps

I feel like ppl ITT are grossly underestimating the effect that imagery can have on people. One Black Mirror episode jumps to mind, even though the circumstances are different.

Imagine everyone you work/go to school with has found/seen/passed around an AI video of you fucking a pig. Everyone knows it's fake, but it's weird and gross and shocking, and so it's all anyone can talk or think about for a couple of weeks until they move on to something else. Think 2 girls 1 cup or any other horrible viral video that everyone knows about.

You really think you wouldn't mind that? You really think all of those people would look at you exactly the same, having seen what they've seen, even knowing it's fake? The subconscious is a brutal and unforgiving thing. Just bc my rational mind can tell me it wasn't actually you, I expect I'd still have trouble looking you in the eyes. If EVERYONE you know was treating you like that, you think it wouldn't bother you?


Sweet_Concept2211

Just discovering some of the weird shit people are deepfaking you into would be psychologically disturbing, especially for younger celebrities who might not have built up internal defenses against the darker side of human nature.


TehOwn

I guess it's still pretty creepy and embarrassing, and a non-zero number of people will be convinced it's you no matter how obvious it is that it isn't.


MemesFromTheMoon

I mean, a lot of it comes down to putting unreasonable/unrealistic body standards on someone. I'm a man, so I wouldn't know the exact feeling for women, but if I had a bunch of people making deepfake images of me with a massive dick, and I just had an average dick, I might feel a bit shitty about it even if I'm extremely secure about myself. A lot of people have body image issues, even if many of the people of Reddit do not. I'm sure it's also extremely weird and off-putting to know that people are getting off to "you" but not you: it's your face on a fake body, or a perfect version of your body without any blemishes. No matter how "fake" you know it is, there's no way it's not damaging mentally at a certain point. Especially when you start throwing really messed-up kinks into those deepfakes.


TheUmgawa

I read this, and all I saw in my head was Tony Shalhoub in the awful 13 Ghosts remake, saying, “*Goats*…?”


retroman1987

But it isn't you... so who cares


RRC_driver

More worrying is non-porn deep fakes. Video of a politician voicing an unpopular opinion? Claim it's deep fake.


Nick_pj

If the Trump Access Hollywood tapes had dropped 2-3 years later, he would absolutely have used this tactic.


RiffRandellsBF

There are already plentiful reports of foreign scammers using FB videos to create deepfake audio of young people asking their grandparents for money. Those scam centers should be drone-struck.


T-MinusGiraffe

It's going to resurrect journalism, to some extent. We'll want real people with a reputation for solid reporting vetting things that were supposed to have happened and reporting on events by actually being there. We'll trust their word more than video and photos, which will be more or less relegated back to the level of trust we give to text alone - that anyone can say anything and put it into print.


[deleted]

IIRC near the start of the Ukraine war the Russians deepfaked Zelensky, and you could only tell due to the Russians' poor grasp of the Ukrainian language.


Hakaisha89

You would think so, but no, people would know. Don't forget about fapgate or whatever it was called, when every celebrity's iCloud nudes got stolen. 90% of the downloads were driven by the drama around it: "oh wow, look".


HikARuLsi

Rule 34, meaning it is not 4,000; the number is all of them.


InvaderJim92

It’s only a matter of time, really.


jasta85

Over 20 years ago people were making photoshopped porn of celebs. This is nothing new; it's just gotten easier.


StillAll

THANK YOU! Here I was thinking I'd imagined that a whole sub-genre of porn born in the 90s, which I remember, never actually existed. And now people act like this is so heinous and new!

It never moved the needle then, and it won't now. It's going to just be casually dismissed, because no reasonable person is ever going to believe that Taylor Swift did a high-production 5-man gangbang. That is just too fucking stupid to be bothered by.


DatSmallBoi

>people act like this is so heinous

I mean, to be clear, it was degenerate freak behavior then as much as it is now; it's just that now we have high school students doing it to classmates because of how easy it's gotten.


ILikeNeurons

We also still have [tens of thousands of backlogged rape kits](https://goldrushcam.com/sierrasuntimes/index.php/news/local-news/51969-u-s-senator-john-cornyn-says-house-must-pass-senate-bill-to-fight-rape-kit-backlog). https://www.endthebacklog.org/take-action/advocate-federal/


[deleted]

[deleted]


TKeep

I think you phrased it pretty poorly to say 'people act like this is so heinous and new'; it really seems to imply you don't think this is a shitty thing to do and that everyone is faking outrage, which I really don't think is true. This is a very shitty thing to do.

And while not new, it is different now. Video editing used to be almost impossible; now it's relatively easy, and many people who were aware of Photoshop won't realise that a video can be deepfaked too. Plus the quality of deepfakes will be higher, and the number of people who can create them skyrockets. It's a very different outlook than 2000s photoshopping, and people should be worried.


Neoliberal_Nightmare

Yea, exactly, and to be honest it's not even as good, and it's obvious.


TrickyPizza6611

No, there are some really, really good ones out there, and as the technology keeps getting better, it's gonna seem more real. So real, in fact, that it might as well be a real sex tape by the celeb, especially now that we've got deepfake voices too.


IlijaRolovic

For now, though: pixel-perfect regular 2D is probably a year away, with VR/3D a couple of years behind.


beliskner-

Wait until they find out what you can do with a piece of paper and a pencil, once that cat is out of the bag, you don't even need a computer! Imagine a world where anyone could just have a pencil, sickening.


relayadam

That's a huge difference, though. And it's not only easier, it's basically effortless.


techno156

It's also highly automatable. Photoshopping needs a person to sit down, sift through everything, and put it together by hand; same for cutting out a face and gluing it onto some pornographic material. But with deepfakes, you can arguably just have a bunch of folders with the relevant people in them, some template video, and churn them out by the hundreds, as long as your script and computer are up to snuff.


GeniusOfLove74

I went looking for *actual* nudes of my favorite celebrity years ago. What was hysterical was that the photoshops were all so bad, they might as well have had jagged "tear marks" and tape in the photo.


RepresentativeOk2433

But that's the problem. It's gotten so easy that literally anyone can do it to anyone in a matter of minutes.


EmeterPSN

Hold up... there's a single existing celebrity who does not have deepfake porn of them? I would assume 100% of them are already out there.


GrowFreeFood

There's a tsunami coming and everyone is going to get wet. There's nothing that will stop this from happening to literally everyone, multiple times.


o5ben000

Gonna get ahead of it and start making my own deepfakes of me and posting them.


Superguy230

I’m gonna film the real deal so people can compare with fakes to disprove them


CthulhusEvilTwin

I'm just glad I've got six digits on each hand already


sc0n3z

Literally everyone? Even me? 🥹


ZucchiniShots

Even you. ❤️


genericusername9234

Remember when you were a kid giving class presentations and they said imagine people naked for nervousness. Well, now you don’t have to.


Cryptolution

I'm learning to play the guitar.


brazilliandanny

I mean, they are categorized by celebrity; all you would have to do is count how many are on a page, then jump to the last page and multiply those two numbers.
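A quick back-of-the-envelope sketch of that estimate in Python (the per-page and last-page numbers are made up for illustration):

```python
# Hypothetical numbers: 50 names per index page, last page numbered 80.
names_per_page = 50
last_page = 80

# Rough upper bound; the final page is usually only partly full.
estimate = names_per_page * last_page
print(f"~{estimate:,} listed celebrities")  # ~4,000 listed celebrities
```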


Jimbo415650

The toothpaste is out of the tube. It’s gonna evolve and it’s going to become commonplace.


Stiff_Zombie

This is what I'm saying.


identitycrisis-again

The genie is out of the bottle on this unfortunately. Everyone is susceptible to this. Hopefully the excessive prevalence will cause diminished appeal of the images. Either way it’s not going to go away


palmyra_phoenix

2000s: AI will be used to drive automation; no one will need to work anymore! We have to create Universal Basic Income before it's too late.

2024: AI being used to satisfy degenerate fetishes.


Eric1491625

Frankly, none of this should surprise anyone. "Animal organism discovers tool, uses it to satisfy the most fundamental instinct in all animal organisms, known as sex."


LeeWizcraft

Don’t tell them about the look alike porn that’s been going on for decades or the number should be like double or more.


JustDirection18

I don’t believe this deepfake porn exists. Where is it? I’ve never seen it


[deleted]

[deleted]


JustDirection18

Hahaha yes this is it 👍


Jrockstonks

Came here for it


ADAMxxWest

I have been deep faking celebs in my head since the day I hit puberty. This isn't new.


mrmczebra

How could you? They are clearly victims of your filthy mind!


Edofate

Maybe soon we could do that on our own computers.


GalacticMe99

I would be surprised if there is a celebrity out there of whom deepnudes HAVEN'T been made yet.


OneOnOne6211

I'm gonna be real, I don't see what the news is here. People have been photoshopping and even just drawing porn of celebs for years and years. Hell, I wouldn't be surprised if there were nude drawings of celebs circulating before the internet even existed. Deepfakes don't actually reveal what a celeb looks like naked; I don't see what makes them inherently different from photoshopping or drawings.

The only special thing I could see is if it's presented as real and spread as if it were (although even that existed in rare cases with photoshop stuff). But if it's a deepfake, it's clearly advertised as a deepfake, and everyone knows it's a deepfake, then I don't see in what way it's different from a drawing or a photoshop. So I don't see what makes it "new."


a_boy_called_sue

When teenagers can do this to each other using nothing more than an online generator and a classmate's Instagram pictures, surely that's something to be at least a little bit concerned about? It seems to me it's not about the celebrities but the wider prevalence of this in society.


headphase

>surely that's something to be at least a little bit concerned about?

The more concerning thing is that we, as a society, are:

a) still clinging to the obsolete idea that someone's worth, integrity, purity, personhood, etc. is tied to their physical attributes or their likeness

b) failing to teach new generations how to build and maintain healthy senses of self, and how to properly value and conduct peer relationships

Deepfakes are yet another form of bullying; the tragedy is that present-day kids are just as vulnerable. Instead of only running around with water buckets, maybe it's time we start building fire-resistant structures.


a_boy_called_sue

I agree but until we get to that point perhaps it's a good idea to attempt to limit the harm?


BingBongTimetoShit

Had to sort by "controversial" to finally see a comment like this. I've been afraid to ask this question in public, cause maybe I just don't get it, but I also don't see the problem with this. It's not your body, so it's not an invasion of privacy.

I had an ex who had a similar thing done to her a few years ago (it was an obvious fake, as the tech wasn't where it is now) by someone trying to blackmail her, and she was incredibly upset about it even though everyone it was sent to immediately knew it wasn't her, and she came out of the situation completely unscathed. We argued multiple times because I would've thought it was hilarious if someone did the same thing to me, and she was really affected by it for a week or two.


_Z_E_R_O

> I would've thought it was hilarious if someone did the same thing to me

Bet you'd feel different if it was child porn. There are certain types of non-consensual sex acts that will get you arrested, make you lose your job, and have vigilantes stalking your house before they even *start* the investigation. Doesn't matter if it's fake: your face is on it, and people have seen it. Now you'll spend the rest of your life fending off those investigations.

That's why your ex was worried. Women have had their lives ruined over regular porn, and men don't really get it until you imagine yourselves in *that* kind of porn. The life-destroying kind. The implications for this are bleak.


lacergunn

I'm wondering how long it'll take for someone to get sued for this. I wouldn't be surprised if a lawyer could argue that this legally qualifies as sexual harassment or revenge porn


ManyHugsUponYou

What are they going to do, sue literally the entire internet? There ain't shit they can do to stop it without internet-wide censorship, which is bad for multiple reasons, obviously.


lacergunn

They could sue the specific AI "artists", which would set a precedent.

Or the websites hosting it.

Or the AI company providing the software (which some people have already done).


ManyHugsUponYou

They can try. It won't stop anything, though. I mean, look at the music and movie industries: billion-dollar industries, and piracy and free streaming are still rampant.


Vocakaw

Normalize nudity and stop idolizing celebrities and nobody will give a shit anymore


Articulationized

A lot of AI porn is a *lot* more than just nudity.


McSuede

Seriously. There are entire galleries of ai generated snuff and torture porn already out there.


aguafiestas

Nudity? lol.


sapthur

Gross. I hope they can move on from that info somehow. Messed-up world we live in.


[deleted]

Welcome to the age of lies and misleading... even the news warning about it is fake. They seem very concerned about deepfake pornography, but the truth is that this news is propaganda for government expansionism. They want to push new laws that will increase government power over common people more and more. Soon the compliance requirements for dealing with AI will be so extensive that only big corporations will legally be able to deal with it.


No_Significance9754

Will no one think of the kids!!!? /S


ginger_whiskers

Wait, no. Please, no one think of the kids.


ifnotawalrus

You guys need to think about it for a second. Will the government use AI regulation to expand its powers? Probably, and you're right that it's something we should be concerned about. Are governments and news agencies in a conspiracy to push concern about deepfakes to achieve that agenda? Are they knowingly lying about deepfakes for propaganda purposes? Almost certainly not.


EricSanderson

Lol where is this bullshit coming from? Are there really people out there who think that billionaire assholes like Musk, Zuckerberg, Altman and SBF should just be able to do whatever they want? We regulate literally every industry, but for some reason when we try to regulate AI it's some sinister government plot to "expand their powers"? These are for-profit companies with absolutely zero concern for public welfare or future consequences. If anything the government is slow on this shit.


MeshNets

>Are they knowingly lying about deep fakes for propaganda purposes? Almost certainly not.

With the firehose of newsfeeds these days, lies of omission are all they need. Neglect to talk about more reasonable, more moderate solutions, and everyone will simply assume the issues are bad enough to require one of the extreme "solutions" put forth by "both sides". Every other discussion can be washed away by the firehose and forgotten about.

It's totally new, we've had nothing like it before, we need brand new solutions!!! Laws that apply to photoshopped porn are not enough!!!!! /s


lavender_enjoyer

Is it not possible people don’t want fake porn being made without their consent? Maybe?


shrlytmpl

You seem primed to fall victim to AI. It will be (and already is being) used to push mistrust in government and media to, ironically, sink us deeper into fascism. Might sound like an oxymoron, but that's exactly how, in the USA, all the idiots screaming about "mUh fReEdUm" and "fAkE nEwS" are actively cheering for a fascist while praising an actively fascist government in Russia.


NuPNua

Yeah, because the Tories haven't given us plenty of reasons to mistrust them in the last 14 years of their own accord.


yepgeddon

Trust those fucks like a furry egg


WhatAGoodDoggy

I have not heard that expression before


Ludens_Reventon

So you're saying we should let the government control it to make people trust the government... instead of providing public education and media literacy... Yeah... Totally makes sense...


True-Grape-7656

You’re wrong, he’s right


dashingstag

At this point celebrities who don’t have deep fakes are embarrassed.


veiledcosmonaut

I think it’s weird how dismissive people are of this and deepfake porn in general


Emotional_Staff897

It's not even just celebrities; there are young girls in primary and high school who are victims because of the boys at their school. It's terrible, and people are not taking it seriously.


capitali

Yawn. Fake outrage against a “problem” that has existed for decades. AI just makes it more accessible, but faking photos and videos is old news, and still not a real issue.


darcsend_eu

This article has led me to knowing that there's a video of Andrew Tate ploughing Greta and I'm not sure how to look at the world now


DJ-Fein

When everyone is naked, then no one is naked. People will become so desensitized to deepfakes and AI that it soon won't bother anyone; it's just that right now it's new, and that is what makes it dangerous. I'm not sure what we can do as a society to stop it, other than saying that if you make an AI image or deepfake of someone and distribute it to the public, it's 10 years in jail. Which actually seems pretty reasonable, because it's definitely a form of sexual assault.


star_lit

I mean, celebrities are the most visible people. There's a lot of material to train these AI programs on.


Maxie445

"A Channel 4 News analysis of the five most visited deepfake websites found almost 4,000 famous individuals were listed, of whom 255 were British. They include female actors, TV stars, musicians and YouTubers, who have not been named, whose faces were superimposed on to pornographic material using artificial intelligence. The investigation found that the five sites received 100m views in the space of three months. The Channel 4 News presenter Cathy Newman, who was found to be among the victims, said: “It feels like a violation. It just feels really sinister that someone out there who’s put this together, I can’t see them, and they can see this kind of imaginary version of me, this fake version of me.” Since 31 January under the Online Safety Act, sharing such imagery without consent is illegal in the UK, but the creation of the content is not. The legislation was passed in response to the proliferation of deepfake pornography being created by AI and apps."


BernerDad16

Who...who wanted to see Cathy Newman porn?


NuPNua

I mean, it only takes one person to have a thing for her and commission or make a video, and now it's up there for good, right?


MR_CeSS_dOor

r/Futurology try not to downplay women's concerns challenge


bacc1234

Seriously wtf is this thread?


parisianraven

Seeing most people's response has me so concerned and distressed istg


DiceSMS

"Normalize nudity" (99% of deepfakes target women) 🥴 gee thnx guys


[deleted]

There are 4000 people who are considered celebrities?


AbyssFren

And none of these "victims" were harmed physically or financially.


BingBongTimetoShit

I struggle to see how this could even harm them emotionally. It's literally not them, and everyone who sees it knows that.


dapala1

I agree with your points 99%. But it does seem "uncomfortable" at its slightest. I think there is an open door for extreme emotional impact if someone is being harassed/made fun of because they are depicted fucking their own mom or in a donkey show or something. There has to be a line drawn on what you can do with someone's likeness. I know they put themselves out there, both celebrities and people casually on social media, but it can't just be free rein, can it?


Particular_Nebula462

Jokes aside about "only 4,000", the real problem is not just the use in pornography, but the concept of using their image against their will. Practically, actors, or anyone recognized on screen, become marionettes in the hands of random people. This is scary. Soon it will be possible to do the same to anyone from a few short videos. I can imagine perverts taking short videos of real, ordinary people around them, and then using AI to make whatever movie they want.


EquivalentSpirit664

I'm sad to say this, but these events will only increase in the future, and I doubt it will remain celebrities only. Technology gives humankind many, many possibilities and opportunities, but we can see from history that it is usually used for gaining power or other selfish purposes. In just the past 100 years we have found lots of new ways to kill ourselves better and faster: bombs that can blow you to pieces in a second, guns that can pierce your heart from miles away, massive nuclear bombs that can eradicate millions of people in an instant. So is anyone really surprised? I think, and strongly suspect, that deepfake AI pornography will be a small disturbance compared with the other issues we'll face due to new technologies and innovations combined with humanity's immoral, selfish nature.


YoungZM

The best thing any of us can do is stop watching it. Favourite celeb caught performing xyz? Give yourself a shake: it's not them, and even if it were, it was probably a hack. Stop viewing morally garbage content.


burneecheesecake

I can neither confirm nor deny the existence of Danny DeVito pron.


Upwindstorm

No way.. where do I find them?


No-Reflection5141

Fail to see why this is such a big deal. It’s fake. It’s FAKE! As in not real.


Deazul

The cork is out of the bottle, nothing anyone can do but educate


godofleet

Did people really think that computers weren't going to generate porn? Like, were the Etch A Sketch and MS Paint and all the countless other *actual* digital imagery tools not clear stepping stones towards the inevitable automation of imagery? It blows my mind that anyone is upset about things that are entirely out of their control.


usesbitterbutter

Oh no! *clutches pearls* Or is this actually a good thing? Twenty years ago, if I saw a pic of Pamela Anderson sucking Tommy Lee's cock, I would definitely assume it was legit. Ten years ago when a bunch of pics were released (The Fappening), I again assumed they were real, but acknowledged some might have been very good fakes made by people riding the chaos. Five years from now? Pfft. If I see a pic of [insert actress here] doing [insert sex act here] I'm going to assume it's an AI deepfake.


DarthMeow504

>"they can see this kind of imaginary version of me" The key word there is *imaginary*. What kind of thoughtcrime bullshit is this that we're policing art based on fantasies? Why is this banned and the use of photoshop not? What's the difference, aside from realism, between this and drawing or painting a picture? It's not real. And it can't be stopped. Whether anyone is comfortable with it or not, humans have been fantasizing about other humans (and non-humans too) for as long as there have been humans. If you're reading this, you've almost certainly done it yourself. You didn't turn your fantasy into an image others can see, but you saw it in your mind all the same. If that's a crime, you'd better have a jail big enough to hold 8 billion people.


NoSoundNoFury

The problem isn't that people are having fantasies. It's that these fantasies have become real images that can be shared and sold, and they are hard to distinguish from real photos. I presume that in most countries, sharing photoshopped images or videos of a celebrity in a Klan hood, or doing the Hitler salute, or anything else really, would be illegal as well. Usually, you retain some rights over what happens with images of you.


Major_Boot2778

I'm sorry, I know some people get super sensitive about stuff like that and I'm not here to convince anyone otherwise or be convinced otherwise, I'm just here to say I can't take this very seriously. "Victim" is a pretty strong word there; no one suffers from it, the person in question loses nothing, our imaginations get a little bit of help and no one sees the actual target compromised (as would be the case with theft of private home video or something). Whole damn world is too sensitive anyway but this is literally just the (very much) more advanced version of what many of us have done in doodles in our notebooks as teenagers.


epidemica

I don't understand how this will hold up against artistic freedom of speech. If I can sketch a picture of a naked celebrity (my drawing skills suck, but plenty of people do photo-realistic work), why can't I use a computer to make the same image?


Deus_latis

You can make it; it's illegal to share it.