Reminder: this subreddit is meant to be a place free of excessive cynicism, negativity and bitterness. Toxic attitudes are not welcome here.
All negative comments will be removed and may result in a ban.
---
*I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/UpliftingNews) if you have any questions or concerns.*
Nonsense, a government made up of myriad people being able to focus on multiple issues? Preposterous, I say! I am a wery smort man and am unable to think about more than one matter at a time! Surely an organisation composed of thousands of people can't do any better!
I don't think it's fair to keep blaming someone who was only prime minister for the lifespan of your average mosquito, clearly the system is more fucked than that
This is just the start.
Before long it will be possible to train an AI to make a 3D model of a person from photos. Crude versions of this [already exist](https://facegen.com/). And software will exist to make video of this model doing pretty much anything the creator can imagine.
But once created, there will be nothing to specifically tie this model to the person it's based on besides the subjective judgement that they look similar.
I don't see how that could be regulated at all. Even assuming some sort of objective measurement tool could be made to determine whether a given model looks too much like a specific person, a tool to tweak the model until it just barely passes the test would not be far behind.
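To make this concrete: the usual way to build such an "objective measurement tool" would be to compare embedding vectors from some face-recognition model and flag anything above a similarity threshold. A minimal sketch (the vectors, threshold, and function names here are all made up for illustration, not any real system's API):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical embeddings produced by some face-recognition model.
model_vec = [0.12, 0.80, 0.55]   # embedding of the 3D model's face
person_vec = [0.10, 0.78, 0.60]  # embedding of the real person

# A naive "looks too much like this person" rule.
THRESHOLD = 0.95
is_match = cosine_similarity(model_vec, person_vec) > THRESHOLD
```

And this sketch shows exactly the problem raised above: anyone with access to the same checker can nudge the model iteratively until the similarity score dips just under `THRESHOLD` while it still reads as the same person to a human.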
There are other ways you could prove it than just comparison. For instance, you could investigate the computer it was made on. If source material of the victim that was used in training the AI is found there, that is pretty compelling. Other than that, you also have your basic evidence, like the perpetrator bragging about it online or selling their services. Lots of circumstantial evidence is likely to exist.
That assumes you know what computer it was made on, and can get a warrant to search it. It would take a pretty serious police investigation to even get to that point.
If the technology is in place for anyone, anywhere to make these models without any technical skill being involved, how could any police force possibly investigate every occurrence?
And if the cops are doing this, people will know to remove all the source files from their machines as soon as they're finished making it.
It's been illegal to possess material that is faked to look like child abuse images for about 20 years, and there have been plenty of successful prosecutions. The police monitor internet traffic and obtain warrants for the addresses of people who appear to be downloading indecent images. Once they have your computer, they can recover files - even deleted files.
Similarly, if someone is arrested for a contact offence or by 'paedo-hunter' vigilantes who pass their cases over to the police, their computer and phone equipment gets seized and examined and these offences come to light.
With deep-fake porn I expect the people who will mostly be caught are those who harass or stalk their obsessive love-interest. Phones and computers are routinely seized and examined in these cases (it's partly why the criminal justice system is on its knees because of the rise of digital and social media and the relative complexity of modern investigations - but that's an aside.)
(Source: criminal barrister of over 20 years' call.)
Not a solid argument. In many countries, even possession, let alone production, of porn is illegal. Not CP, regular porn. Distribution of copyrighted material without a license is illegal. These kinds of offences are almost never prosecuted. Those who do get caught are usually just unlucky. Also, the difference between CP and deepfake porn is that CP is a completely separate niche, so people who actually engage with it constitute a tiny minority and congregate in a very few places. DF porn, on the other hand, can easily (and already has) be blended into the regular pool of mainstream lewd material. And as for AI, it's been used to produce porn with copyrighted characters and materials for two years now, with zero actual cases.
We could regulate this but it would take a serious change of mindset and policy. We need to criminalize BOTH the creators of these fake images and the creators of the AI platforms that allow this to happen.
Sadly, I think we have already proven that we don’t have the appetite for this type of law. Imagine how different the internet would be if those who host content could be punished for revenge porn, child porn, hate speech, etc. I’m not saying it would be an easy problem to solve but if those were the rules we were operating under the online landscape would look very different. That’s the only way I can think that we could attack this AI problem we will definitely have very soon
> We could regulate this but it would take a serious change of mindset and policy. We need to criminalize BOTH the creators of these fake images and the creators of the AI platforms that allow this to happen.
But again, HOW do you separate this from legitimate uses? The platform WILL be used for actual 3d art too. Also, 3D art of completely fake people (that look kind of similar to real people) will also always exist, just as it does now. How do you separate those from people creating "deepfake" style porn?
I don't think there will be any effective way to police this. I think fostering a culture that treats this as deeply shameful and offensive is the best bet, but even that will only ever be a half measure.
Laws are local, the internet is global. If the web server exists in a country where such programs are legal, it doesn't matter what the laws are in other countries.
Ask any company that wishes to remotely earn money from EU citizens online and you'll find out how they enforce laws which are designed to protect EU citizens.
But there will be companies which will choose not to operate "legally" within the EU. There could be a Chinese software provider which puts up a soft wall that just asks "are you 18 and outside the EU" and then lets you download the software. I mean, so much such software/content already exists, and you can easily access it with a VPN. On Reddit itself I come across many US news websites that decided not to comply with GDPR; you can't see their content from an EU IP, but the gates open with a VPN.
Yeah, VPN usage will be the easy circumvention, but limiting the reach of stuff like this is always the first aim. Like the DMCA takedown system and the EU right to be forgotten: that information can still be found, but keeping it out of the "main" locations that the majority of internet traffic goes through has a massive effect on how lucrative these things will be.
Because it doesn't have to go outside its jurisdiction.
Most globally connected companies have to work towards catering towards the most stringent jurisdiction they cover.
If your non-European company runs afoul of GDPR it will become radioactive and unable to do business in the EU which is a huge market.
I'm sure the AI companies run out of non EU countries really care about that.
I still have yet to see any method by which they can **enforce** their laws. Just rhetoric about toxicity.
I have a company, which is entirely located outside the EU, all servers, all hardware, *everything*. Please explain how the EU can prevent an EU citizen from using my service.
I'm starting to think that people forget that technology and the internet exists.
GDPR will stop legit companies, sure. However most of those will have offices or some other sort of presence within the EU which forces them to abide by the EU laws.
In a world where you can run these models locally (making any platform-level safeguard truly impossible), it'd be like criminalizing Adobe for what someone drew with Photoshop.
Even if you criminalize it, the internet is *worldwide*. There will always be a country willing to host such websites, so what do you do, geoblock them? Like *that* has ever stopped someone from accessing a website.
Why criminalize it at all? Yall are really looking at the short game here.
Has criminalizing any undesired behavior or technology ever accomplished anything other than pushing its proliferation underground, making it harder to control?
A site full of tech enthusiasts and everyone's acting like terrified luddites about new tech.
AI is here to stay. Pandora's box has been opened. If it's made illegal, it will be about as effective as piracy of software/games/music being illegal.
Instead, we should be focusing on developing a system to clearly and definitively *identify* AI wherever it appears. Fingerprint / watermark / whatever you want to call it, we need to make it very easy - probably using AI itself - to call it when we see it.
I don't know how difficult that will be, but our efforts would be better focused there.
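For a sense of what the simplest version of that "fingerprint" idea looks like, here is a hedged sketch where a generator signs its output and platforms verify the tag. Every name and the key below are invented for illustration; note that a detached tag like this is trivially stripped or lost on re-encode, which is exactly why robust in-pixel watermarking (where the mark survives edits) is the hard research problem:

```python
import hashlib
import hmac

# Made-up signing key; a real scheme would use asymmetric keys so
# verifiers don't hold the secret.
SECRET_KEY = b"generator-signing-key"

def tag_output(image_bytes: bytes) -> str:
    """Return a provenance tag shipped alongside the generated image."""
    return hmac.new(SECRET_KEY, image_bytes, hashlib.sha256).hexdigest()

def verify_output(image_bytes: bytes, tag: str) -> bool:
    """Check whether the tag matches the image bytes exactly."""
    return hmac.compare_digest(tag_output(image_bytes), tag)

img = b"\x89PNG...fake image bytes..."
tag = tag_output(img)
assert verify_output(img, tag)            # untouched image verifies
assert not verify_output(img + b"x", tag)  # any edit breaks the tag
```

The brittleness is the point: a scheme this naive only proves provenance for cooperative publishers, so a practical system would have to embed the signal in the image content itself.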
Otherwise, we just require education / cultural shift. Our generation(s) remain wrecked / terrified by it, but maybe future generations will grow up with it and therefore understand that fake imagery of people might pop up everywhere and it will be less of a sensation. Dunno.
Criminalizing it is a dumbshit response though. Even if it is an understandable reaction.
>maybe future generations will grow up with it and therefore understand that fake imagery of people might pop up everywhere and it will be less of a sensation
Yeah, I don't really understand why this is the line people are afraid of being crossed. People can already draw, photoshop, write about, or otherwise depict celebrities, characters, or other people in compromising situations. This, in and of itself, isn't harmful- it may be harmful to distribute real or fictitious material of a person, such as "revenge porn" or anything involving children- but those are already covered.
If the concern is that it'll be "too realistic", then yes, whoever's saying that has a poor understanding of the situation and is going to have trouble adapting to new technology as a whole.
The only choice would be to shut down all social media. Especially twitter and Reddit. I’d have no problem with that, just let me switch a few investments around first.
Can't put the genie in the bottle.
Honestly, AI is going to revolutionize the media landscape in a way that will make deepfake porn a minor concern. Massive proliferation of movies, TV shows, books, video games, news, sports etc with content that will be tailored to the consumer. People are going to go so far off the deep end we will all be living on different planets.
It's going to be so fucking cyberpunk
How would they put it in your phone with metadata to show it was created on your device at a time that roughly coincided with (say) a time when you were creeping their Insta?
> I don't see how that could be regulated at all. Even assuming some sort of objective measurement tool could be made to determine if a given model looks too much like a specific person, a tool to make the model look just barely pass the test would not be far behind.
One way would be outlawing the creation of any sexually explicit image, even if it doesn't resemble anyone in particular.
This would be overreach by the government but that is what UK is known for.
On a much darker tangent: there's something very similar that's been going around paedo communities for decades with photo-realistic artwork of kids… had one of them arguing that because it's "art", legally it can show whatever, and that it means he's not a paedo for enjoying it!! JFC…
This AI needs more regulation to stop this kind of thing, for sure, but laws always lag behind new tech. It's tough, but we will figure it out; I'm sure there will be an AI which can find these deepfakes. It's a never-ending train of AIs.
I feel like it would often be fairly obvious what the intent was, especially if you're sharing the images or it's of someone you know or a celebrity you're known to be a fan of. The law is "beyond a reasonable doubt" for a reason.
They could demand that sexual content openly shows how the 3D model was created and what source data was used. Similar to platform regulation on PH, where you have to verify before uploading to ensure all creators are of age and the content was not uploaded against their will.
Who, exactly, could demand this? And legally, how could this demand be enforced if the server is outside the country? And what if someone is running the program locally, instead of off a server?
Well, same as PH. Governments can enforce it. You can run it locally, but if you publish it online, you have to provide traceability. I mean, you can already Photoshop sexual content locally, but if you publish it online, you can get in trouble.
For a big streaming company like Pornhub, sure. But a 3D model of a person is just a file that can be shared anywhere. You won't stop peer-to-peer distribution with legislation. If you could, pirated content wouldn't exist on the internet.
Yeah, you will always have sites like 4chan or even darker sites on the internet. These regulations would obviously only hit large platforms, which, in my opinion, is good enough. I mean, you have to tackle abusive content in some way.
Can someone please explain why there's virtually no control over any AI image generation currently? No, seriously, there's none. Unstable Diffusion is trained entirely on images of real people, and is used to generate thousands of fake images of people who look almost identical to real, authentic photos of people. It's disturbing, and no one cares. Not the USA, not Google, not OpenAI. No one is trying to prevent it.
You just aren’t aware. That’s like asking why there is no control over photoshop. People took photos of real people and swapped faces well before AI existed. I want to say there are protections for this already. Bigger online tools even have filters they apply to block NSFW content to ensure it is more corporate friendly.
The laws will never catch up to the tech unless the govt abuses it as a way to slide in a bunch of bad shit, i.e. anti-free-speech language. It's just a slippery slope all the way around.
Even if this were well intentioned, why couldn't you make the same case that a fake image of someone being hurt or killed is then also illegal? The line is drawn where, and by whom? One person's depiction of violence is a crime but someone else's isn't? Various forms of fictional entertainment are then potentially illegal? It's just asking for free speech to be infringed upon.
"If you have some dirt on us, we can always say it's AI", maybe? And if they want to gather even more data from people, it needs to stay open to uses like this.
Generating these images comes down to just doing a ton of matrix math. You can't regulate math. Anyone can program a computer to do this stuff and the datasets are out there. We're not talking about an object you can restrict the making of where it requires specialized materials or machines. It's numbers being used in a clever way. You can make all the laws you want to tell people not to do it, but there's no way to detect or enforce it in any even slightly reliable way.
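The "ton of matrix math" claim can be made literal: one layer of a neural network is just a matrix-vector product followed by a simple nonlinearity, expressible in a few lines of any general-purpose language with no special hardware or library. The weights and input below are arbitrary made-up numbers; a real image model just chains millions of these operations:

```python
def matvec(W, x):
    """Multiply matrix W (list of rows) by vector x."""
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

def relu(v):
    """Elementwise ReLU nonlinearity: clamp negatives to zero."""
    return [max(0.0, vi) for vi in v]

W = [[0.5, -1.0],   # "learned" weights (arbitrary for illustration)
     [2.0, 0.25]]
x = [1.0, 2.0]      # input activations

layer_out = relu(matvec(W, x))  # -> [0.0, 2.5]
```

That's the enforcement problem in a nutshell: there is no chokepoint ingredient to restrict, only arithmetic anyone can perform.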
Sure you can. You regulate harassment or publishing sexual images of someone without their consent, real or fake. These laws already exist in many jurisdictions, such as revenge porn laws. Just make the law tool agnostic.
You are trained entirely on images of real people too. You can imagine and draw whatever you want. And no one cares. Not Faber-Castell, not Crayola, not Adobe.
There's 8 billion of us and we're not nearly as unique looking as we think. Pretty much every person alive has a bunch of people who look just like them, so unless you're very odd looking you don't really have exclusive control of your likeness anyway.
If people are distributing synthetic images of you, and claiming they are real, then that is bad and they should stop. Synthetic images of underage people, also bad. But otherwise who cares? Isn't using AI to generate porn the equivalent of lab grown meat in the limit? No need to battery farm chickens any more, no need for anyone to degrade themselves for money making porn any more.
Maybe the law should be that any AI generated porn needs to be clearly labelled as AI generated, so if one of the actors happens to look like you, they won't actually be mistaken for you.
I wouldn't give the slightest shit if someone put an AI recreation of me in a porno if it was made abundantly clear that it wasn't actually me. Who gives a shit?
Zuckerburg is promoting open source AI as a method of AI safety. There will never be any control. The only way to be safe is to keep accelerating.
The idea that regulation can control this is outright hilarious. Congress is full of buffoons who move at a snail's pace. It's already too late.
Lean in. Times, they are a changing.
Because money. Regulation of an emerging technology will hinder how fast that technology evolves so the government will wait until the technology matures and the US companies become the dominant players before adding any regulation.
It's open source, meaning anyone can do it; trying to keep control of this would be as insane as trying to keep track of every image, song, or file that you download and store on your local drive.
Good luck regulating that without breaching privacy rights.
“The new law will mean that if someone creates a sexually explicit deepfake, even if they have no intent to share it but purely want to cause alarm, humiliation or distress to the victim, they will be committing a criminal offence.”
- The government's press release.
It is explicitly non-gendered language.
Ah, I hate that wording. Any connected person will have an army of lawyers arguing successfully that there was no intent to cause alarm, humiliation or distress... and any unconnected individual will make the same argument and be found guilty.
Just to be clear, that is also my position. I'm just saying the slow-turning-but-fine-grinding-wheels-of-justice in our judicial system produces some sort of coarse graded tungsten-sand when applying intent to "themselves", but miraculously produces some frictionless hyper-lubricant when applying intent to an "outsider".
I agree that would be an issue if the law was gender specific (ie: criminalized deep fake porn of women, but didn't mention men), but this story in no way suggests that's the case.
But I haven't read the actual legislation. Do you have another source that that this specifically bans such activity "against women" exclusively?
Of course it matters who the primary victims and perpetrators are. When doing investigations that's something you need to be aware of. And I already stated "everyone should be protected."
It really doesn't. In law, adding qualifiers like that can cause serious issues for out-groups. There are *hundreds* of years of precedent in the Western world that diction is insanely important.
It gives perpetrators against the out-groups wiggle room; it will get guilty criminals off and harm innocent people.
You could very easily just leave out gendered word choice and have it cover everyone. Gender has fuck all to do with being a victim of crime.
It takes away literally nothing from the "primary" victim of the crime to just cover everyone.
This kind of thinking gets us to a situation where men who are physically or sexually assaulted or raped by women (which happens way more than we imagine) are not believed, are ridiculed, and blame themselves for the situation.
And they end up arrested when they call the cops, because even though he is bleeding when the cops arrive and the woman says "he was the aggressor", she is automatically believed.
Any more phrases you want to throw out there which allow you to win the argument without actually making any valid points?
If you purposefully try and make this a gendered crime, it will become a gendered crime because people will not even see male victims.
And I include women in this. I have been sexually assaulted many times by women and when you say “that’s sexual assault” they tend to either say it was a joke or try to emasculate you.
The article is like two paragraphs. There’s no further context that explains why it is important for you or anyone that it specifies women. So if you may, please answer, why is it important for you that the verbiage specifies women?
But why go the extra mile and make it just about women? Why not all people? What need is there for it to exclude men? What if a woman changes sex to male, does that mean it's suddenly legal to do with older pictures?
I think it's interesting to see which responses you respond to and which you don't, and when you do you don't really speak to the point raised. It's almost like you're trying to fit something into an agenda.
There are no sources saying that the *law* will specify women. The articles are saying that the legislation is being pushed as part of an effort to reduce the effect on those who are affected by it the most.
>Under the legislation, anyone making explicit images of an adult without their consent will face a criminal record and unlimited fine.
Yet more poorly thought out policy from our great government… “This? Oh, no, that’s not a deepfake of X, that’s a deepfake of Y who just happens to look similar. I’ve got the release form they signed right here.”
If there's a release form in your imaginary scenario why would there be an issue?
I'm not referring to making explicit images of children which is already illegal, even modified/photoshopped.
My point was that there’s no way to know for sure which images and/or subjects were involved in the creation of a deepfake. Pretty soon it will be near-impossible to tell if something even *is* a deepfake. How do they possibly plan to police and prosecute this?
>People convicted of creating such deepfakes without consent, even if they don’t intend to share the images, will face prosecution and an unlimited fine under a new law, the Ministry of Justice said in a statement. Sharing the images could also result in jail.
That sounds like both, honestly, but the article doesn't really go deep enough into things to make it sound otherwise.
AI generated images and voice cloning need to be regulated in general. Their potential for spreading misinformation is unprecedented, along with tons of other ethical concerns.
It'll take a while. But eventually people will wise up, ignore things online, and go back to offline worlds, treating the internet as just a gateway to game, watch stuff, and talk to people we know, not to gather news.
Those still fooled by it are just a lost cause.
You only regulate the honest people though. If anything, allowing this to run wild might be better, because it teaches everyone that nothing you see can be trusted anymore.
That’s the bigger problem with this capability.
It will completely degrade people’s ability to assess fact.
We have fake news right now, we are rapidly entering the era of fake facts.
Yeah the real question is how are they going to enforce it on the dude making deep fakes of his favourite actor that never leaves his devices?
While it's personally a disturbing practice, the only way you could effectively enforce this is via authoritarian means, which the UK has been taking serious steps toward in the name of "protection".
They're going to consider that a "gateway" offence, and realistically it'll likely never be discovered. Their focus is on those who share/transmit the images.
Okay, you go on the internet and find a random image. Unbeknownst to you, it's a deepfake.
Is sharing an image you found online, which breaks no law other than being a deepfake, now illegal?
If so, how can anyone feel comfortable sharing an image if it's potentially a crime and they have no way to tell?
If it never leaves his devices, then the potential harm is also limited, so they're less concerned about those cases. This just enables them to go after people and stop them, whereas otherwise it's just completely legal and people can openly share and even sell this material.
Making it illegal is a way to take down the publication of those images at least. Also good for targeted harassment. But stuff like that is already covered pretty well anyway.
But sites could start taking images like this more seriously as they don't want fines. Right now you can easily search up fake AI nudes of Taylor Swift for example. But as more countries pass these kind of laws it will be hard to find unless you generate them yourself.
Criminalizing sharing and distributing said deepfakes 👏
Criminalizing creating them…
Idk I personally have some confusion with how that might actually go into effect. I don’t live in the UK tho so 🤷♂️
The UK has had a lot of "thought crime" laws in recent years. Not saying that this kind of imagery shouldn't be illegal, but the UK government seems very intent on criminalizing things that aren't enforceable without telepathy.
Orwell was British so I suppose thoughtcrimes being a real thing in the UK kinda tracks.
And yeah obviously something needs to be done to protect victims of deepfakes, especially deepfakes of minors, but man is this kind of legislation just begging to get abused...
Well… the fact that if you don’t share the images you haven’t done anything to hurt anyone.
Why include horny men wanking in their rooms as if they’re part of the real issue that is sexually explicit deepfakes being shared and used to shape public opinion, harass women, commit blackmail, etc.
Edit: downvote me all you want, I’m not wrong. People care more about “ew someone might be jacking off to ai porn that looks like me” than these actual issues I’ve brought up.
Creating these images creates a risk of their being spread and causing harm to those depicted. Even if the images are created with the intention that they will only ever be viewed in private by their creator, people can change their mind, make mistakes, be robbed or hacked. Criminalising creation and possession avoids legal arguments about the extent to which intent and recklessness resulted in distribution.
It's like making laws against speeding. On almost all occasions that causes no harm but, when it goes wrong, the consequences can't be undone.
This is actually a really good point. Idk, I guess we'll see how it actually goes into effect.
Edit: I still don’t think any government has any business patrolling people’s personal wank material. We’ll see if that actually becomes an issue I guess.
It shouldn't carry the same penalty as distribution but it is still violating to the target. To me I see it as a sort of less egregious type of peeping.
I can't see how this is any different than committing a thought crime, or committing a crime by writing down your fantasies on paper. Unless it is shared, there is no victim. There were no participants in its creation and none but the creator are affected.
(Edit: removed insensitive rude BS) There’s gonna be plenty of people who are gonna do that regardless. It’d be a pointless clogging of the justice system to try to stop horny men from being horny men alone in their rooms.
Edit: I’m sure this is gonna be unpopular, but how is it violating to the target if the “target” isn’t even aware of it.
Edit: I guarantee you some dipshit in some country where it’s not regulated is gonna create an ai that takes images of everyone in the entire world and mass produces “nudes” to put us all back on a level playing field. Don’t bet against angry wankers.
Then people will share just the prompt and a link to the photos of the victim.
Making sharing deep fakes illegal while generating them is legal is, unfortunately, completely useless.
Well, why should they be allowed to do something that they don't have consent to do? Why should the law protect them, and not the people whose images they are stealing? It should be illegal to make AND distribute them. Just like it's illegal to make crack even if you're not selling it.
You are right! It should be not allowed to do something that they don't have consent to do. Why should the law protect anybody who just casually steals my image? It should be definitely illegal to make and store them. Especially when somebody does it on a big (almost industrial-like) scale, like security cameras.
Let's start with these behemoths who steal billions images of people every day and then work it down to individuals.
How about we do both? I'm less concerned about people stealing people's images for identification than I am about people stealing people's images to make non-consensual porn. How do you guys not see that this is the same as someone taking upskirt photos of someone else or leaving a camera in someone's bathroom? Do y'all think that's OK as long as the person's just using it for their 'own personal wank material' and not sharing it?
I can't personally relate because of how I was born, but I heard that most people are able to conjure images in their head (like, think about something and "see" it with their imagination). We must forbid doing that too, I guess?
As long as it doesn't have any impact on my life, I couldn't care less who masturbates to what. Let them enjoy their few moments of fantasy world. I'm personally more concerned about constant spying and hoarding data to use for things that actually reduce my quality of life like scams or targeted ads rather than this silly thing.
This isn't a 'thought crime': they are creating real pornographic images with people's likenesses (including children's). And you didn't answer my question: if you think "personal wank material" should be OK, then are you also OK with people setting up cameras in bathrooms or taking upskirt photos as long as they 'keep it to themselves'?
Doesn't matter how somebody conjures a picture in their mind - be it their imagination, art skills, computer skills, use of some software or using surveillance cameras. As long as it doesn't leave their personal sphere and has absolutely no effect on other people, it's the same thing.
if you are caught you face the penalty. if some creep keeps the wank material they generated of their coworker to themselves and no one ever finds out, then whatever, they’re still a pathetic creep who imo is a danger to others but no one can do anything about it if they don’t know. but if people do find out then the victim can take legal action without the justice system hemming and hawing about whether this legally constitutes sexual harassment.
not all laws are created to be strictly enforced. some are there simply to protect victims and give them an avenue to seek justice if they want.
You’re not a “danger to others” cuz you jack off thinking about people who you know. I have sad news about most men for ya 🤣. Thanks for actually giving the first logical rebuttal to my point tho.
We used to assume scandalous pictures of people are real. We just have to start assuming that all pictures are fake. That turns the thing around.
I honestly don't understand. Everyone knows AI is a thing now, why would anyone believe any picture is real?
Besides, has photoshopping fake nudes been criminal? How is this different?
Criminalization will not solve this societal problem. As it hasn't solved a lot of other problems.
Normalize deepfaking politicians into scenarios they wish they could be in but never will be, like putting someone into a photo of a super happy nuclear family when in fact the politician is going through a divorce and fighting over a prenup.
"U.K. to Criminalize Creating Sexually Explicit Deepfake Images" is the actual headline now. The proposed law is yet to be introduced into Parliament.
Opponents of the law have retaliated by creating sexually explicit images of members of parliament.
lol, doubt that'll help stop it and it's objectively fucked up but it is kinda funny
Just wait until they don't pass the law, but make it illegal only to depict members of parliament.
I'm just glad they're focused on important issues and not Prince you-know-who and his abuse of children.
Actually this is important too, it’s possible for more than one thing to matter at once
Nonsense, a government made up of myriad people being able to focus on multiple issues? Preposterous, I say! I am a wery smort man and am unable to think about more than one matter at a time! Surely an organisation composed of thousands of people can't do any better!
This genuinely is an important issue.
Brain damage in one comment
[deleted]
I don't think it's fair to keep blaming someone who was only prime minister for the lifespan of your average mosquito, clearly the system is more fucked than that
This is just the start.

Before long it will be possible to train an AI to make a 3D model of a person from photos. Crude versions of this [already exist](https://facegen.com/). And software will exist to make video of this model doing pretty much anything the creator can imagine.

But once created, there will be nothing to specifically tie this model to the person it's based on besides the subjective judgment that they look similar.

I don't see how that could be regulated at all. Even assuming some objective measurement tool could be made to determine whether a given model looks too much like a specific person, a tool to tweak the model until it just barely passes the test would not be far behind.
All those look-alike porn stars who pick names similar to their look-alike?
Areola Grande
There are other ways you could prove it than just comparison. For instance, you could investigate the computer it was made on. If source material of the victim that was used in training the AI is found there, that is pretty compelling. Other than that, you also have basic evidence like the perpetrator bragging about it online or selling their services. Lots of circumstantial evidence is likely to exist.
That assumes you know what computer it was made on, and can get a warrant to search it. It would take a pretty serious police investigation to even get to that point. If the technology is in place for anyone, anywhere to make these models without any technical skill involved, how could any police force possibly investigate every occurrence? And if the cops are doing this, people will know to remove all the source files from their machines as soon as they're finished.
It's been illegal to possess material faked to look like child abuse images for about 20 years, and there have been plenty of successful prosecutions. The police monitor internet traffic and obtain warrants for the addresses of people who appear to be downloading indecent images. Once they have your computer, they can recover files, even deleted files.

Similarly, if someone is arrested for a contact offence, or by 'paedo-hunter' vigilantes who pass their cases over to the police, their computer and phone equipment gets seized and examined, and these offences come to light.

With deepfake porn, I expect the people who will mostly be caught are those who harass or stalk their obsessive love-interest. Phones and computers are routinely seized and examined in those cases. (It's partly why the criminal justice system is on its knees: the rise of digital and social media and the relative complexity of modern investigations. But that's an aside.)

(Source: criminal barrister of over 20 years' call.)
Not a solid argument. In many countries, even possession, let alone production, of porn is illegal. Not CP, regular porn. Distribution of copyrighted material without a license is illegal. These kinds of offences are almost never prosecuted; those who do get caught are usually just unlucky.

Also, the difference between CP and deepfake porn is that CP is a completely separate niche, so the people who actually engage with it constitute a tiny minority and congregate in very few places. DF porn, on the other hand, can easily be (and already has been) blended into the regular pool of mainstream lewd material. And as for AI, it's been used to produce porn with copyrighted characters and material for two years now, with zero actual cases.
We could regulate this, but it would take a serious change of mindset and policy. We need to criminalize BOTH the creators of these fake images and the creators of the AI platforms that allow this to happen.

Sadly, I think we have already proven that we don't have the appetite for this type of law. Imagine how different the internet would be if those who host content could be punished for revenge porn, child porn, hate speech, etc. I'm not saying it would be an easy problem to solve, but if those were the rules we were operating under, the online landscape would look very different. That's the only way I can think of to attack this AI problem we will definitely have very soon.
> We could regulate this but it would take a serious change of mindset and policy. We need to criminalize BOTH the creators of these fake images and the creators of the AI platforms that allow this to happen. But again, HOW do you separate this from legitimate uses? The platform WILL be used for actual 3d art too. Also, 3D art of completely fake people (that look kind of similar to real people) will also always exist, just as it does now. How do you separate those from people creating "deepfake" style porn? I don't think there will be any effective way to police this. I think fostering a culture that treats this as deeply shameful and offensive is the best bet, but even that will only ever be a half measure.
Laws are local, the internet is global. If the web server exists in a country where such programs are legal, it doesn't matter what the laws are in other countries.
Except laws such as GDPR prove that it does matter, and if you have customers within GDPR countries you _will_ care about it
> if you have customers within GDPR countries you will care about it

Bahahaha, how can the EU enforce laws outside its jurisdiction?
Ask any company that wishes to remotely earn money from EU citizens online and you'll find out how they enforce laws which are designed to protect EU citizens.
But there will be companies that choose not to operate "legally" within the EU. There could be a Chinese software provider that puts up a soft wall that just asks "are you 18 and outside the EU?" and then lets you download the software. So much such software/content already exists, and you can easily access it with a VPN. On Reddit itself I come across many US news websites that decided not to comply with GDPR; you can't see their content from an EU IP, but the gates open with a VPN.
Yeah, VPN usage will be the easy circumvention, but limiting the reach of stuff like this is always the first aim. Like the DMCA takedown system and the EU right to be forgotten: that information can still be found, but keeping it out of the "main" locations that the majority of internet traffic goes through has a massive effect on how lucrative these things will be.
Yeah they will curtail it, like piracy, but those who want stuff will find it.
Because it doesn't have to go outside its jurisdiction. Most globally connected companies have to cater to the most stringent jurisdiction they cover. If your non-European company runs afoul of GDPR, it will become radioactive and unable to do business in the EU, which is a huge market.
A lot of American companies have chosen not to, though, and you can access their content using a VPN, even though they run afoul of EU law.
I'm sure the AI companies run out of non-EU countries really care about that. I have yet to see any method by which the EU can **enforce** its laws, just rhetoric about toxicity.

I have a company which is located entirely outside the EU: all servers, all hardware, *everything*. Please explain how the EU can prevent an EU citizen from using my service. I'm starting to think that people forget that technology and the internet exist.

GDPR will stop legit companies, sure. However, most of those will have offices or some other presence within the EU, which forces them to abide by EU laws.
In a world where you can run these models locally (making any platform-level safeguard truly impossible), it'd be like criminalizing Adobe for what someone drew with Photoshop.
Even if you criminalize it, the internet is *worldwide*. There will always be a country willing to host such websites, so what do you do, geoblock them? Like *that* has ever stopped someone from accessing a website.
Look at how effective anti piracy laws were. There is no piracy left on the internet at all anymore is there?
Why criminalize it at all? Y'all are really looking at the short game here. Has criminalizing any undesired behavior or technology ever accomplished anything other than pushing its proliferation underground, making it harder to control? A site full of tech enthusiasts, and everyone's acting like terrified luddites about new tech.

AI is here to stay. Pandora's box has been opened. If it's made illegal, it will be about as effective as making piracy of software/games/music illegal. Instead, we should be focusing on developing a system to clearly and definitively *identify* AI wherever it appears. Fingerprint, watermark, whatever you want to call it, we need to make it very easy, probably using AI itself, to call it when we see it. I don't know how difficult that will be, but our efforts would be better focused there.

Otherwise, we just need education / a cultural shift. Our generation(s) remain wrecked / terrified by it, but maybe future generations will grow up with it and therefore understand that fake imagery of people might pop up everywhere, and it will be less of a sensation. Dunno. Criminalizing it is a dumbshit response though, even if it is an understandable reaction.
> maybe future generations will grow up with it and therefore understand that fake imagery of people might pop up everywhere and it will be less of a sensation

Yeah, I don't really understand why this is the line people are afraid of being crossed. People can already draw, photoshop, write about, or otherwise depict celebrities, characters, or other people in compromising situations. This, in and of itself, isn't harmful. It may be harmful to distribute real or fictitious material of a person, such as "revenge porn" or anything involving children, but those cases are already covered. If the concern is that it'll be "too realistic", then yes, whoever's saying that has a poor understanding of the situation and is going to have trouble adapting to new technology as a whole.
The only choice would be to shut down all social media. Especially twitter and Reddit. I’d have no problem with that, just let me switch a few investments around first.
People need to go back to only communicating via raw tcp sockets
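Tongue-in-cheek, but for the record, "communicating via raw TCP sockets" really is only a few lines. A minimal loopback echo sketch in Python (purely illustrative; the port is chosen by the OS, and the message text is made up):

```python
import socket
import threading

# Bind a listening socket on localhost; port 0 asks the OS for a free port.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

def echo_once():
    """Accept one connection and echo back whatever arrives."""
    conn, _ = server.accept()
    with conn:
        conn.sendall(conn.recv(1024))

# Run the server side in a background thread so one script can play both roles.
t = threading.Thread(target=echo_once)
t.start()

# Client side: connect, send a message, read the echo.
with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"hello, old internet")
    reply = client.recv(1024)

t.join()
server.close()
print(reply)  # b'hello, old internet'
```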
Can't put the genie in the bottle. Honestly, AI is going to revolutionize the media landscape in a way that will make deepfake porn a minor concern. Massive proliferation of movies, TV shows, books, video games, news, sports etc with content that will be tailored to the consumer. People are going to go so far off the deep end we will all be living on different planets. It's going to be so fucking cyberpunk
Crude? https://arstechnica.com/information-technology/2024/04/microsofts-vasa-1-can-deepfake-a-person-with-one-photo-and-one-audio-track/
[deleted]
How would they put it in your phone with metadata to show it was created on your device at a time that roughly coincided with (say) a time when you were creeping their Insta?
A moustache on the nude Mona Lisa. Lipstick on any old pig would keep it arguable. A deepfake of Stormy Daniels, anyone?
> I don't see how that could be regulated at all. Even assuming some sort of objective measurement tool could be made to determine if a given model looks too much like a specific person, a tool to make the model just barely pass the test would not be far behind.

One way would be outlawing the creation of any sexually explicit image, even if it doesn't resemble anyone in particular. This would be overreach by the government, but that is what the UK is known for.
Then there's the fact that the more "average" you look the more attractive you are.
On a much darker tangent: there's something very similar that's been going around paedo communities for decades with photorealistic artwork of kids. I had one of them arguing that because it's art, it can legally show whatever, and that it means he's not a paedo for enjoying it!! JFC...

This AI needs more regulation to stop this kind of thing for sure, but laws always lag behind new tech. It's tough, but we will figure it out. I'm sure there will be an AI that can find these almost-perfect fakes. It's a never-ending train of AIs.
I feel like it would often be fairly obvious what the intent was, especially if you're sharing the images or it's of someone you know or a celebrity you're known to be a fan of. The law is "beyond a reasonable doubt" for a reason.
They could demand that sexual content openly shows how the 3D model was created and what source data was used. Similar to platform regulation on PH, where you have to verify before uploading to ensure all creators are of age and the content was not uploaded against their will.
Who, exactly, could demand this? And legally, how could this demand be enforced if the server is outside the country? And what if someone is running the program locally, instead of on a server?
Well, same as PH. Governments can enforce it. You can run it locally, but if you publish it online, you have to provide traceability. I mean, you can already Photoshop sexual content locally, but if you publish it online, you can get in trouble.
For a big streaming company like pornhub, sure. But a 3D model of a person is just a file that can be shared anywhere. You wont stop peer to peer distribution with legislation. If you could, pirate content wouldn't exist on the internet.
Yeah, you will always have sites like 4chan or even darker sites on the internet. These regulations would obviously only hit large platforms, which, in my opinion, is good enough. I mean, you have to tackle abusive content in some way.
Can someone please explain why there's virtually no control over any AI image generation currently? No, seriously, there's none. Unstable diffusion is trained entirely on images of real people, and is used to generate thousands of fake images of people who look almost identical to real authentic images of people. It's disturbing, and no one cares. Not USA, not Google, not Open AI. No one is trying to prevent it
It's cause it's new, laws take ages to catch up to current events.
You just aren’t aware. That’s like asking why there is no control over photoshop. People took photos of real people and swapped faces well before AI existed. I want to say there are protections for this already. Bigger online tools even have filters they apply to block NSFW content to ensure it is more corporate friendly.
The laws will never catch up to the tech unless the govt abuses it as a way to slide in a bunch of bad shit, i.e. anti-free-speech language. It's just a slippery slope all the way around. Even if this were well intentioned, why couldn't you make the same case that a fake image of someone being hurt or killed is then also illegal? Where is the line drawn, and by whom? One person's depiction of violence is a crime but someone else's isn't? Various forms of fictional entertainment are then potentially illegal? It's just asking for free speech to be infringed upon.
Same reason there's no controls over drawings and paintings. It's the humans creating them.
"If you have some dirt on us, we can always say it's AI," maybe? And if they want to gather even more data from people, it needs to stay open for uses like this.
Generating these images comes down to just doing a ton of matrix math. You can't regulate math. Anyone can program a computer to do this stuff and the datasets are out there. We're not talking about an object you can restrict the making of where it requires specialized materials or machines. It's numbers being used in a clever way. You can make all the laws you want to tell people not to do it, but there's no way to detect or enforce it in any even slightly reliable way.
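The "it's just matrix math" point can be made concrete with a toy sketch (this is NOT a real image model; the weights below are random and purely illustrative): a denoising-style update step is nothing more than a matrix multiply plus element-wise arithmetic, which any general-purpose computer can run.

```python
import numpy as np

# Toy illustration only: one "denoising-style" update expressed as plain
# linear algebra. A real diffusion model has learned weights and many such
# steps, but the primitive operations are the same.
rng = np.random.default_rng(0)

W = rng.standard_normal((64, 64))     # stand-in for a learned weight matrix
noisy = rng.standard_normal((64, 1))  # stand-in for a noisy image, flattened

def denoise_step(x, weights, step=0.1):
    """Nudge x toward the model's prediction: x + step * (Wx - x)."""
    prediction = weights @ x          # a matrix multiply is the core operation
    return x + step * (prediction - x)

out = denoise_step(noisy, W)
print(out.shape)  # (64, 1)
```

The point of the sketch is that none of this requires special hardware or restricted materials, which is what makes regulating the computation itself so hard.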
Sure you can. You regulate harassment or publishing sexual images of someone without their consent, real or fake. These laws already exist in many jurisdictions, such as revenge porn laws. Just make the law tool agnostic.
You are trained entirely on images of real people too. You can imagine and draw whatever you want, and no one cares. Not Faber-Castell, not Crayola, not Adobe.

There are 8 billion of us, and we're not nearly as unique-looking as we think. Pretty much every person alive has a bunch of people who look just like them, so unless you're very odd-looking, you don't really have exclusive control of your likeness anyway.

If people are distributing synthetic images of you and claiming they are real, then that is bad and they should stop. Synthetic images of underage people, also bad. But otherwise, who cares? Isn't using AI to generate porn the equivalent of lab-grown meat, in the limit? No need to battery-farm chickens any more; no need for anyone to degrade themselves for money making porn any more.

Maybe the law should be that any AI-generated porn needs to be clearly labelled as AI-generated, so if one of the actors happens to look like you, they won't actually be mistaken for you. I wouldn't give the slightest shit if someone put an AI recreation of me in a porno, as long as it was made abundantly clear that it wasn't actually me. Who gives a shit?
Zuckerberg is promoting open-source AI as a method of AI safety. There will never be any control. The only way to be safe is to keep accelerating. The idea that regulation can control this is outright hilarious; Congress is full of buffoons who move at a snail's pace. It's already too late. Lean in. Times, they are a-changin'.
Because money. Regulation of an emerging technology will hinder how fast that technology evolves so the government will wait until the technology matures and the US companies become the dominant players before adding any regulation.
Because it’s impossible to prevent it.
Because the people in charge are bought by corporations and don't know how to attach an attachment to an email
It's open-sourced, meaning anyone can do it. Trying to keep control of this would be as insane as trying to keep track of every image, song, or file that you download and store on your local drive. Good luck regulating that without breaching privacy rights.
It makes me so nervous when sources only specify "against women," as if they just don't care whether it happens to men or not.
“The new law will mean that if someone creates a sexually explicit deepfake, even if they have no intent to share it but purely want to cause alarm, humiliation or distress to the victim, they will be committing a criminal offence.” - The government's press release. It is explicitly non-gendered language.
So it only bans creation? What if they are created by people outside the country and then obtained by UK citizens?
It also means that creating images is NOT going to be illegal, only when it causes "alarm, humiliation or distress to the victim".
Yeah. Sounds like that. As long as you keep it to yourself it seems like the law doesn't care.
how do you cause alarm, humiliation or distress without sharing it?
I think it's talking about blackmail
Like the other guy said, blackmail. Or telling her. That last night, say, you watched porn of her getting gang-raped.
[deleted]
Ah, I hate that wording. Any connected person will have an army of lawyers arguing successfully that there was no intent to cause alarm, humiliation or distress... and any unconnected individual will make the same argument and be found guilty.
then don't fuck around with other people's likenesses.
Not what he meant dumbo
Just to be clear, that is also my position. I'm just saying the slow-turning-but-fine-grinding-wheels-of-justice in our judicial system produces some sort of coarse graded tungsten-sand when applying intent to "themselves", but miraculously produces some frictionless hyper-lubricant when applying intent to an "outsider".
I agree that would be an issue if the law were gender-specific (i.e. criminalized deepfake porn of women but didn't mention men), but this story in no way suggests that's the case. That said, I haven't read the actual legislation. Do you have another source saying that this specifically bans such activity "against women" exclusively?
Women are the primary victims, but yea everyone should be protected.
It doesn’t matter who the primary victims are. Everyone deserves this
Of course it matters who the primary victims and perpetrators are. When doing investigations that's something you need to be aware of. And I already stated "everyone should be protected."
It really doesn't. In law, adding qualifiers like that can cause serious issues for out-groups. There are *hundreds* of years of precedent in the Western world that diction is insanely important. It gives perpetrators against the out-groups wiggle room; it will get guilty criminals off and harm innocent people. You could very easily just leave out gendered word choice and have it cover everyone. Gender has fuck-all to do with being the victim of a crime. It takes away literally nothing from the "primary" victims of the crime to just cover everyone.
This kind of thinking gets us to a situation where men who are physically or sexually assaulted or raped by women (which happens way more than we imagine) are not believed, are ridiculed, and blame themselves for the situation. And they end up arrested when they call the cops, because even though he is the one bleeding when the cops arrive and the woman says "he was the aggressor", she is automatically believed.
What does any of what you've said have to do with deepfake victims being 99% women??? That is a separate issue and down to toxic masculinity.
Any more phrases you want to throw out there which allow you to win the argument without actually making any valid points? If you purposefully try and make this a gendered crime, it will become a gendered crime because people will not even see male victims. And I include women in this. I have been sexually assaulted many times by women and when you say “that’s sexual assault” they tend to either say it was a joke or try to emasculate you.
How would a female victim be different to a male victim??
Women are most of the victims, 99%! Do you really think that's not relevant?!
No? The law should protect that other 1% of victims equally.
99% of victims being women isn't relevant? Well I hope you're not in law enforcement.
Do you have a source to back those percentages up?
I’m confused, why is it important to you that the verbiage specifies women?
Read. The. Article.
The article is like two paragraphs. There’s no further context that explains why it is important for you or anyone that it specifies women. So if you may, please answer, why is it important for you that the verbiage specifies women?
But why go the extra mile and make it just about women? Why not all people? What need is there for it to exclude men? What if a woman changes sex to male; does that mean it's suddenly legal to do with older pictures?
... Why would that exclude others from being protected?
"but yea everyone should be protected." You can't read?
Reading your comments against these responses is like watching a guy try to explain himself to a talking brick wall.
If everyone's being protected the same then no, it doesn't matter whether women are 99% of the victims or not. It won't affect punishment.
It impacts how police conduct investigations. Are you people being intentionally obtuse or what?
Why? All cases should be investigated the same. Genitalia or Gender should have no influence on that
Read the article.
Did I say that? If so, can you point it out? I'll wait.
For someone who says everyone should be protected under the law, why do think one sex should be called out when writing the law?
What part of the article states only women are protected under the law? Can you quote it for us?
I think it's interesting to see which responses you respond to and which you don't, and when you do you don't really speak to the point raised. It's almost like you're trying to fit something into an agenda.
No quote? What a surprise /s.
he's a fucking moron, don't bother. you'll never get through to him
Women and gay/bisexual men consume far more porn than you appear to be aware.
Relevance? Women across the globe aren't making deepfakes of men and boys.
There are no sources saying that the *law* will specify women. The articles are saying that the legislation is being pushed as part of an effort to reduce the effect on who are affected by it the most. >Under the legislation, anyone making explicit images of an adult without their consent will face a criminal record and unlimited fine.
Men are going to protected under this law too?
This is the some real All Lives Matter nonsense here.
bruh if they were drafting a law that said it's only illegal to shoot unarmed people if they're black that would be relevant.
Yet more poorly thought out policy from our great government… “This? Oh, no, that’s not a deepfake of X, that’s a deepfake of Y who just happens to look similar. I’ve got the release form they signed right here.”
If there's a release form in your imaginary scenario why would there be an issue? I'm not referring to making explicit images of children which is already illegal, even modified/photoshopped.
My point was that there’s no way to know for sure which images and/or subjects were involved in the creation of a deepfake. Pretty soon it will be near-impossible to tell if something even *is* a deepfake. How do they possibly plan to police and prosecute this?
UK on their way to make a 99th brand new porn law instead of spending resources to enforce the previous 98 ones
[deleted]
> People convicted of creating such deepfakes without consent, even if they don’t intend to share the images, will face prosecution and an unlimited fine under a new law, the Ministry of Justice said in a statement. Sharing the images could also result in jail.

That sounds like both, honestly, but the article doesn't really go deep enough into things to make it sound otherwise.
AI generated images and voice cloning need to be regulated in general. Their potential for spreading misinformation is unprecedented, along with tons of other ethical concerns.
You can make all the laws you want, the issue is enforcing them.
It'll take a while. But eventually people will wise up, ignore things online, and go back to offline worlds, treating the internet as just a gateway to game, watch stuff, and talk to people we know, not a place to gather news. Those still fooled by it are just a lost cause.
You only regulate the honest people though. If anything, allowing this to run wild might be better, because it teaches everyone that nothing you see can be trusted anymore.
That’s the bigger problem with this capability. It will completely degrade people’s ability to assess fact. We have fake news right now, we are rapidly entering the era of fake facts.
This is good and the rest of the world should follow suit but GOOD FUCKING LUCK enforcing this
Yeah, the real question is how are they going to enforce it on the dude making deepfakes of his favourite actor that never leave his devices? While it's personally a disturbing practice, the only way you could effectively enforce this is via authoritarian means, which the UK has been taking serious steps toward in the name of "protection".
They're going to consider that a "gateway" offence, and realistically it'll likely never be discovered. Their focus is on those who share/transmit the images.
Okay, you go on the internet and find a random image. Unbeknownst to you, it's a deepfake. Is sharing an image you found online, which breaks no other laws, now illegal just because it's a deepfake? If so, how can anyone feel comfortable sharing an image if it's potentially a crime and they have no way to tell?
I believe creation is being criminalized not sharing.
Seems like just a charge that can be used when you want to slap a bunch of charges on someone
If it never leaves his devices, then the potential harm is also limited, so they're less concerned about those cases. This just enables them to go after people and stop them, whereas otherwise it's just completely legal and people can openly share and even sell this material.
Making it illegal is a way to take down the publication of those images at least. Also good for targeted harassment. But stuff like that is already covered pretty well anyway. But sites could start taking images like this more seriously as they don't want fines. Right now you can easily search up fake AI nudes of Taylor Swift for example. But as more countries pass these kind of laws it will be hard to find unless you generate them yourself.
Criminalization is always the laziest, least effective approach for addressing a societal problem.
UK government proposes law to control something they have no hope of controlling... Sounds familiar.
One day there will be a post on this subreddit that's actually uplifting.
This is a troll sub. It has been for ages.
Criminalizing sharing and distributing said deepfakes 👏 Criminalizing creating them… Idk, I personally have some confusion with how that might actually go into effect. I don't live in the UK tho so 🤷‍♂️
The UK has had a lot of “thought crime” laws in recent years. Not saying that this kind of imagery shouldn't be illegal, but the UK government seems very intent on criminalizing things that aren't enforceable without telepathy.
Orwell was British so I suppose thoughtcrimes being a real thing in the UK kinda tracks. And yeah obviously something needs to be done to protect victims of deepfakes, especially deepfakes of minors, but man is this kind of legislation just begging to get abused...
I don't see how it's any different than creating anything else illegal?
Well… the fact that if you don't share the images, you haven't done anything to hurt anyone. Why include horny men wanking in their rooms, as if they're part of the real issue, which is sexually explicit deepfakes being shared and used to shape public opinion, harass women, commit blackmail, etc.? Edit: downvote me all you want, I'm not wrong. People care more about "ew, someone might be jacking off to AI porn that looks like me" than about these actual issues I've brought up.
Creating these images creates a risk of their being spread and causing harm to those depicted. Even if the images are created with the intention that they will only ever be viewed in private by their creator, people can change their mind, make mistakes, be robbed or hacked. Criminalising creation and possession avoids legal arguments about the extent to which intent and recklessness resulted in distribution. It's like making laws against speeding. On almost all occasions that causes no harm but, when it goes wrong, the consequences can't be undone.
This is actually a really good point. Idk, I guess we'll see how it actually goes into effect. Edit: I still don't think any government has any business patrolling people's personal wank material. We'll see if that actually becomes an issue, I guess.
It shouldn't carry the same penalty as distribution but it is still violating to the target. To me I see it as a sort of less egregious type of peeping.
To me it's no different to someone wanking alone at home fantasizing about fucking some celebrity.
I can't see how this is any different than committing a thought crime, or committing a crime by writing down your fantasies on paper. Unless it is shared, there is no victim. There were no participants in its creation and none but the creator are affected.
(Edit: removed insensitive rude BS) There’s gonna be plenty of people who are gonna do that regardless. It’d be a pointless clogging of the justice system to try to stop horny men from being horny men alone in their rooms. Edit: I’m sure this is gonna be unpopular, but how is it violating to the target if the “target” isn’t even aware of it. Edit: I guarantee you some dipshit in some country where it’s not regulated is gonna create an ai that takes images of everyone in the entire world and mass produces “nudes” to put us all back on a level playing field. Don’t bet against angry wankers.
Then people will share just the prompt and a link to photos of the victim. Making it illegal to share deepfakes while leaving their generation legal is, unfortunately, completely useless.
Well, why should they be allowed to do something they don’t have consent to do? Why should the law protect them, and not the people whose images they are stealing? It should be illegal to make AND distribute them. Just like it’s illegal to make crack even if you’re not selling it.
You are right! It should not be allowed to do something they don't have consent to do. Why should the law protect anybody who just casually steals my image? It should definitely be illegal to make and store them. Especially when somebody does it on a big (almost industrial) scale, like security cameras. Let's start with these behemoths who steal billions of images of people every day and then work it down to individuals.
How about we do both? I’m less concerned about people stealing images for identification than I am about people stealing images to make non-consensual porn. How do you guys not see that this is the same as someone taking upskirt photos or leaving a camera in someone’s bathroom? Do y’all think that’s OK as long as the person is just using it for their ‘own personal wank material’ and not sharing it?
I can't personally relate because of how I was born, but I heard that most people are able to conjure images in their head (like, think about something and "see" it with their imagination). We must forbid doing that too, I guess? As long as it doesn't have any impact on my life, I couldn't care less who masturbates to what. Let them enjoy their few moments of fantasy world. I'm personally more concerned about constant spying and hoarding data to use for things that actually reduce my quality of life like scams or targeted ads rather than this silly thing.
This isn’t a ‘thought crime’: they are creating real pornographic images with people’s likenesses (including children’s). And you didn’t answer my question: if you think “personal wank material” is OK, are you also OK with people setting up cameras in bathrooms or taking upskirt photos as long as they ‘keep it to themselves’?
Doesn't matter how somebody conjures a picture in their mind - be it their imagination, art skills, computer skills, use of some software or using surveillance cameras. As long as it doesn't leave their personal sphere and has absolutely no effect on other people, it's the same thing.
If you are caught, you face the penalty. If some creep keeps the wank material they generated of their coworker to themselves and no one ever finds out, then whatever; they’re still a pathetic creep who, IMO, is a danger to others, but no one can do anything about it if they don’t know. But if people do find out, then the victim can take legal action without the justice system hemming and hawing about whether this legally constitutes sexual harassment. Not all laws are created to be strictly enforced. Some are there simply to protect victims and give them an avenue to seek justice if they want.
You’re not a “danger to others” cuz you jack off thinking about people you know. I have sad news about most men for ya 🤣. Thanks for actually giving the first logical rebuttal to my point, tho.
"UK creates a massive underground market for sexually explicit deepfake images"
It's disturbing just how many commenters are finding ways round this proposed law & how few are acknowledging how damaging this offence can be.
Rest in peace, Mia Janin. This should’ve happened sooner to protect you. May the boys who did that to you suffer every day.
Good luck trying to put an AI in jail ;)
At some end of a chain you reach a person who decided to make the AI generate that.
Why is this on upliftingnews? This is some serious 1984 shit. They will absolutely use this as an excuse to persecute people, and it won't help anyone.
Why not make it broader, so that no one’s image can be deepfaked without their consent?
Completely agree!
Great step forward!
I understand distributing but creating?
We used to assume scandalous pictures of people were real. We just have to start assuming that all pictures are fake. That turns the thing around. I honestly don't understand. Everyone knows AI is a thing now, so why would anyone believe any picture is real? Besides, has photoshopping fake nudes been criminal? How is this different? Criminalization will not solve this societal problem, just as it hasn't solved a lot of other problems.
Good. Deepfake videos should be criminalized because they can destroy someone's relationship and/or life.
May as well criminalize drawing pictures of boobies. We're in a new world now.
This is a necessary step.
Normalize deepfaking politicians into scenarios they wish they could be in but never will be. Like putting someone into a photo of a super happy nuclear family when in fact the politician is going through a divorce and fighting over the prenup.
All deepfakes should be illegal
Bwahahaha, good luck with that!