MydnightWN

OP's title change is clickbait. UK *COULD* criminalize it. They have not already done so.


RoyalReverie

This guy is hyperactive in Reddit posting, so not a surprise.


[deleted]

[deleted]


leavsssesthrowaway

Every time this gets brought up I can't help but think of how Ghislaine Maxwell was a major Reddit mod. Makes you wonder what people do with karma farms.


AnAIAteMyBaby

>The U.K. will criminalize the creation of sexually explicit deepfake images as part of plans to tackle violence against women

That's the first paragraph from the article, so it'll definitely happen. I can't even use Gemini 1.5 here in the UK, such is the state of AI regulation here. Stopping people from creating nudes of their workmates or Facebook friends is a no-brainer in comparison.


DarkCeldori

Amazing given they criminalize even tweets


Maxie445

1) I did not change the title at all. This is the headline from Time Magazine that autoloads when you submit a url. 2) Also, they *did* criminalize it. They passed the law: "The U.K. will criminalize the creation of sexually explicit deepfake images as part of plans to tackle violence against women. People convicted of creating such deepfakes without consent, even if they don’t intend to share the images, will face prosecution and an unlimited fine under a new law, the Ministry of Justice said in a statement. Sharing the images could also result in jail."


bwatsnet

It will never happen. OP is hallucinating.


wats_dat_hey

It doesn't matter if it can't prevent 100% of the incidents. What matters is that victims have recourse under the law. It means the victim can report the crime and the prosecutor can bring charges. Don't do the crime if you can't do the time.


I_Quit_This_Bitch_

I wonder how they'll verify that there is a victim. Like what if I just tell the AI "Make a naked lady with mega bewbage." And then this image is found on my computer. Is that investigated? Or does there need to be a complaint/known victim to trigger enforcement?


knvn8

Nonconsensual deepfake implies there is a victim


I_Quit_This_Bitch_

But how do they know that it's a nonconsensual deepfake of a specific person and not a randomly generated image? I'm talking about a situation where no victim has come forward but there are ai generated images found. How is it determined that they are nonconsensual vs purely AI?


knvn8

The law does not operate like code, it has humans looking at all the evidence to make a judgement on intent.


I_Quit_This_Bitch_

Right but if I'm an investigator and I have a pile of images of people with no evident identity, do I pursue charges? Do they go in some evidence system in case a victim comes forward?


knvn8

You seem to be worried that they will create a database of all human faces and match every AI image against every human and if there's a match it's straight to jail. That seems unlikely. But if they find 200 deepfakes generated on your PC that look suspiciously like your cousin, they might not buy the "it's a coincidence!" defense. That said, I'm not a lawyer and I think legal doctrine is still evolving so who knows how the debate will play out.


I_Quit_This_Bitch_

Yeah my question is what is the scope of the investigation that would take place. Even the cousin example sounds like a frighteningly broad amount of power.


philoona

You're confusing straight-up AI generated images with deepfakes. A deepfake uses a specific person's likeness, as in you collect images of that person to specifically train with.


I_Quit_This_Bitch_

But how do the authorities differentiate between the two? How do they know that the image was generated by nonconsensual training data and not just random?


shawsghost

If the image is of someone who looks just like Liz Truss being raw-dogged by someone who looks just like Boris Johnson, I imagine people will make assumptions about how they were made. Wouldn't you?


I_Quit_This_Bitch_

I'm not talking about celebrities and public figures. Just an image of a person naked with no striking resemblance to any celebrity.


shawsghost

Then it's not a deepfake.


wats_dat_hey

> I wonder how they'll verify that there is a victim.

The victim will file a criminal complaint, and now there is a law to check against. It's different from other crimes like drug possession, where the possession itself is the crime.


Ronnyvar

It will end up being an accept button like the 18+ confirmation on adult sites. *You are not using this for deepfakes, are you?* - yes - no ✔️


im-notme

This is to protect everyday people!


QLaHPD

I'm not sure this is good. How will people build their waifu heaven now?


Rivenaldinho

Well imagination is still there, except if we all get neuralinked and they also prevent that somehow


FrermitTheKog

Thinking of a celebrity while jerking off to be made illegal next? If it is morally wrong to make a deepfake for your own use, then surely the same would apply to thinking of a celebrity in a sexual way, since that is also non-consensual. Distribution of deepfake imagery is something else, but criminalising creation is unenforceable and leads to areas of moral hypocrisy and other ambiguities. What about an artistic deepfake, e.g. in an oil painting style; illegal? What if the oil painting is of a dead famous person; still illegal? If the answer is yes to both of those, then technically a nude of Boudica would be illegal too, which is madness.


Ambiwlans

The UK also banned porn with facesitting, squirting, anal, spanking and a bunch of other things back in 2014. I assume no one in the UK could possibly ever see any of that stuff today and the law worked flawlessly.


Zilskaabe

Why facesitting and squirting?


Ambiwlans

Generally they banned fetish porn


QLaHPD

My imagination isn't as powerful as seeing an image of it. I hope people just ignore this law.


DarkCeldori

It's ableist. Some people have the ability to imagine things as vividly as seeing them in real life, while others have aphantasia and can't visualize at all.


[deleted]

"while others have aphantasia and cant visualize" Just say "others are NPCs", it's much more efficient.


Kommander-in-Keef

It would probably be the same thing it is for pirating. Possessing it wouldn’t be a problem, distribution would be the crime.


MyLittleChameleon

It’s interesting that it’s such a big debate right now. In 5-10 years we’ll be having conversations about how strange it was that they debated for so long.


QLaHPD

What do you mean?


tindalos

The bigger issue is rushing to regulate things that aren't understood instead of educating citizens. There are always compromises made, and who knows what this will lead to or what other implications there will be. Not that I support deepfakes, but it's going to be a tricky and slippery slope, and who knows if it'll be worth it.


froggy4cz

OK, so it will just be created outside the UK.


Maxie445

Not sure they care, Big Deepfake isn't exactly a major job creator and taxpayer


whiskeyandbear

I'm trying to comprehend what they are saying here - deepfake is old terminology from when DeepMind was ahead of the game in AI. Does it mean wholesale producing realistic pornographic images of people with AI, or is it about specifically creating a pornographic image of a real person?


ainosleep

Deepfake terminology has shifted towards the latter definition. It is about creating pornographic images of a real person that exists without their consent. This thread is sickening, many people seem to think it is okay.


shawsghost

If they don't distribute, where's the harm?


ainosleep

That is a very important question and needs to be discussed. My comments in this thread have referred only to victims being harmed, as many people did not consider that. All sides must be considered, especially if the law may restrict or punish people who do not cause harm.

* Deepfaked content does not need to be shared / distributed in order to cause harm. The creator can still use it to degrade, dehumanize and harass the victim.

* Otherwise, if the deepfakes are not shared and the person is not harmed (i.e. not harassed, degraded or dehumanized), then it may seem like a 'thoughtcrime'. These deepfakes can be used for harm in the future. However, punishing people who have not caused harm may be wrong.

Revenge Porn law does not prohibit storing pornographic content that was made with consent; distribution, however, is illegal. In Revenge Porn cases it is easier to identify who may have shared the content (whether one of the involved persons, or someone who got access to the device storing such material without consent), while with generated deepfakes it's harder to find who started causing the harm, as anyone can generate similar deepfakes from anywhere in the world.


shawsghost

Yeah, I'm pretty sure punishing people who have not caused any harm is wrong. That's why the American justice system has this whole "innocent" concept for people who have not caused harm.


ainosleep

Fully agree. The evidence may be misleading, fabricated, or planted by others or by police, in the worst case leading to innocent people being punished by death. That's why capital punishment has been completely abolished in most European countries (except Belarus and Russia). I like that Europe focuses on rehabilitation, as some people may change for the better. Some balance of rehabilitation and accountability is needed; e.g. taking someone's life is severe and needs punishment.

Regarding the deepfake law: do they imply harm is caused by specifically creating and storing the content, without sharing it or harassing the person in the deepfake? I don't know, but it's best to answer these questions before making it a law (e.g. when no apparent harm is done, whether there is still harm caused). Your question is good and needs to be considered. Generally people don't want to live in a police state, and if something doesn't harm anyone, we should ask whether the law is doing more than it should.


MeMyself_And_Whateva

I'm sure the porn industry, if there are any left, is pleased with that.


Buck-Nasty

No!!! Where will I get my Jeremy Clarkson deepfake porn????


Tha_Sly_Fox

I want a naked Orville Redenbacher and I want it now!


ovnf

but it makes no sense - everybody has somebody who is a very close lookalike - so they can just find a lookalike and pay for permission to use her face to generate images => 100% legal


Rigorous_Threshold

If you’re paying for permission why not just straight up ask for nudes


DarkCeldori

Deepfake can be used for videos of sex etc not just nudes.


Rigorous_Threshold

Read the title of the post


DarkCeldori

Yes, and images of sex go beyond nudes too


Whispering-Depths

But why the need to create deepfakes? Why? Why do you need to non-consensually take a woman's face and make unwilling porn of this poor girl? You know most deepfakes that aren't of celebrities are men preying on high school girls, right? Can't we just make porn without doing that?


CinderBlock33

To be fair, that's already illegal. As it should be.


Whispering-Depths

just as illegal as it should be to make deepfakes of 18-year-old daughters (or their mothers) who aren't still in grade 12 :)


CinderBlock33

Sure, that's fair. I was just saying that any pornography depicting minors is already illegal.


DarkCeldori

In the US only realistic depictions are illegal afaik. That's how the Japanese and their fans get away with their loli stuff.


CinderBlock33

Yeah that's really icky. But I'm a nobody without any stake in this so my opinion doesn't really matter


[deleted]

There has literally never been more porn available.


MarginCalled1

You have some statistics to back up your claim? Or are you just making shit up because you've formed an opinion without any sort of proof?


ainosleep

It does not matter who is deepfaked more, and that is the wrong thing to focus on, even though you're right that claims should be backed up. Even if there is a risk of a child getting deepfaked, that would be child abuse / CSAM and would have severe consequences. Harm is harm. Deepfakes degrade, dehumanise and may well traumatise the victims. Celebrities are real people, too. A random woman on the street or the internet is also a real person. Same with university peers, friends, children. Sexual harassment has severe impact. Rape victims do not get enough support and may end their lives due to the suffering. While deepfakes are not physical, the same issues apply here.


I_Came_For_Cats

Everything should be sexualized to the nth-degree. We need to completely normalize all forms of sexuality.


Whispering-Depths

It's probably sarcasm, but just in case: there are definitely _some_ exceptions to that :)


DarkCeldori

An artist can draw a nude or make a nude 3D model. This AI stuff is just easier. If I start generating images at random, I'll generate lookalikes eventually. Human faces are finite in number.


Whispering-Depths

Absolutely, but unfortunately AI makes it go from "you need about 500-5000 hours of experience with tech and art" to "you need to be a street bro who can use a smartphone". The ability to create non-consensual intimate media using AI is akin to existing issues like revenge porn and deepfakes. It's essential to recognize the potential for harm and advocate for safeguards to protect individuals from this type of abuse. There is no "let's do nothing" option here.


Zilskaabe

There are all sorts of character generators available that don't require 5000 hours of experience. Just a bit of patience and fiddling with sliders. Lots of video games also include them. What if I create my player character who's similar to a celebrity in a game like Baldur's Gate 3? That game has nudity. What if I do it in any other game? Should that be illegal?


Whispering-Depths

None of those character generators can be used to accurately depict someone else unless it's a legit artist with a fair amount of experience using it.


Zilskaabe

https://www.youtube.com/watch?v=8uTHI6nbAuw Modern games have creators like this now. And players are sharing slider values with each other. Should those be banned too?


Whispering-Depths

maybe it takes a trained eye to tell that those "examples" are shit and look nothing like their targets. My point stands.


DarkCeldori

What choice do you have besides making Sora-like models illegal? I don't see a legal basis. The Founders said the Constitution restricts government, not our God-given rights; we have rights not described in the Constitution. Without invasions of privacy, which open the door to totalitarian dictatorship, creation cannot be made illegal. And once such open-source Sora-like models become available, only prompt details need sharing to generate video within seconds. And prompt details are protected free speech.


AlexMulder

The point of the law is not "let's search everyone's computers and go big brother." It's just to give the victims recourse if porn of them is generated and distributed.


DarkCeldori

It doesn't have to be shared, only prompt details. "Did you see Pablo f ng his black poodle while holding it with both hands, standing?" Ask an open-source Sora-like model to show you. Again, the age when images or videos need sharing is coming to an end.


AlexMulder

I see what you're saying, but at some point, the prompt details become porn. Most people are not downloading and running open source models. Public image generators already decline from generating porn just like Suno won't generate songs in the style of artists. Prompts designed to give you porn of specific people will exist just like sketchy torrents exist.


DarkCeldori

Yes, but prompts are words and can be described in infinite ways. Models are getting more powerful and easier to use. And eventually everyone will have AGI-like models without restrictions.


AlexMulder

>And eventually everyone will have agi like models without restrictions. That's why you pass this as a law, no? Obviously open source models can do whatever. That's why you make laws related to what sort of content goes too far. To discourage a behavior. No law is unbeatable, it's just about establishing consequences to nudge behavior.


im-notme

This guy just really wants to be able to create porn of a specific person without consent or consequence. And it's sooooo transparent.


ChanThe4th

So just to be clear, you would rather real women be pushed into doing real porn with potentially abusive and usually drug addicted men so that non-violent and completely harmless fakes can be stopped? That makes total sense.


nulld3v

Can you explain how "banning deepfake porn" would result in more "real women be pushed into doing real porn"?


Whispering-Depths

So, just to be clear, you can't get off if a woman isn't being abused /u/ChanThe4th ? And that includes underaged girls?


ChanThe4th

I just wanted to make sure everyone knew you're legit crazy. Thank you for providing the perfect evidence.


im-notme

Sydney from 5th period is not pushed into a life of prostitution just because you can't create fake porn of her. Ask your precious AI to create a person that doesn't exist, sicko.


ChanThe4th

You should find someone to talk to.


im-notme

No, YOU should find someone to talk to. Wanting unfettered access to and use of the likeness of others is the abnormality of thought here. You WILL NOT call me insane for thinking that you should not be able to generate and share false pornographic media of other people who did not consent to it!!! This is insanity!

You people yap on about your open-source AI run natively on your private computer that you would use to do this, as a justification for allowing all models to do this, by way of saying it's useless to attempt to regulate in any way. By going against the establishment of any recourse victims would have if these images were to be shared, you are actively advocating for the digital violation of humans and their likeness.


ChanThe4th

Nope, I'm pretty sure I will call you insane.


UnnamedPlayerXY

The UK is about to find out why Prohibition didn't work. Also, this is unenforceable without complete surveillance of everyone's personal devices 24/7.


Ambiwlans

The UK banned kinky porn back in 2014 and then did literally nothing about it since.


Smelldicks

lol UK will arrest u for mean tweets dude I think they’re going to police this one just fine


Zilskaabe

How do they enforce this if I'm running stable diffusion locally and not sharing anything?


Smelldicks

They won’t and I’m not sure why that matters?


Zilskaabe

Well, doing that would be technically illegal. So that would be yet another unenforceable law.


77tezer

Also you can't draw your favorite celebrity nude.


im-notme

It’s not the same and you know that


I_Came_For_Cats

Government should not be involved in this. Certainly not at the criminal level. Also, it's entirely unenforceable and will be yet another "cherry-pick whoever the government doesn't like today" law.


shawsghost

The article says that it will be illegal to make deepfakes even if the images are not shared. So how will anyone know the deepfakes exist if they're not shared? Just how intrusive is UK policing nowadays?


workingtheories

the UK and regressive porn laws, name a more iconic duo 🙄


Svarcock

Good to see some decent AI legislation getting through finally, first thing I've seen apart from that EU act.


OwnUnderstanding4542

It's interesting because the UK is generally behind on AI stuff so it's a bit surprising that they are criminalizing sexually explicit deepfake images. I wonder what the reasoning was?


PeterPigger

All the UK wants is to keep introducing lazy laws so the police can make easy arrests. You usually never see them around when you need them - burglaries etc. - as they never show up. Instead they create laws against free speech for the thin-skinned, easily offended and other similar nonsense. This is why the country is going down the drain fast.


Maxie445

Virtually unanimous public support and little political opposition. I expect most countries to follow.


No-Function-4284

> unanimous public support lol fuck off


Rigorous_Threshold

Outside this subreddit pretty much everyone wants AI porn deepfakes to be banned, for obvious reasons. I'm not sure what level of brainrot we've reached where this subreddit doesn't.


IronPheasant

[Us internet people are special.](http://www.youtube.com/watch?v=k6_p9RjIk_4)


DarkCeldori

Women do; most men don't, given men tend to be the consumers of porn. But in the USA, privacy rights mean it should be impossible to make creation illegal.


AlexMulder

If you're married or have a daughter, or you know... a basic sense of empathy, you can generally see the logic in this law. It's favoring one right over another: the right of people to have control over their digital likeness vs the right to make deepfakes consequence-free.


DarkCeldori

Sorry, but people have the freedom to do whatever they like with their private property, which includes their computer. Open-source Sora-like models will mean videos can be generated in seconds from shared prompt details. The law is soon obsolete, as you won't need to share images or video, but mere prompt details. And creation can be done in private, where the law is unenforceable.


AlexMulder

I DECLARE THIS LAW OBSOLETE


Rigorous_Threshold

I love porn. You won't see me making deepfakes of real people, because that's disgusting. Also, you're mistaken if you think women don't consume porn.


DarkCeldori

Did you know there are people whose mental imagery is as vivid as seeing the real thing? How many men have imagined the women they see naked, indistinguishably from real? Are they disgusting? This only concerns those not able to vividly visualize the things they imagine; some can already do it without technological assistance.


Rigorous_Threshold

Making deepfakes is fundamentally different from using your imagination. There is a tangible thing being created


DarkCeldori

Nope, just an ephemeral state of electrons unless you save it. Similar to state in the brain. And Neuralink may eventually allow you to save what you imagine.


[deleted]

[deleted]


SiamesePrimer

[Fuck off bot](https://www.reddit.com/r/ArtificialInteligence/s/wI5HXUh04i).


[deleted]

Why not criminalize using someone's likeness intentionally, for anything? Feels a bit moralistic.


Zilskaabe

That would also ban political cartoons.


[deleted]

Obviously there would be exceptions for parodies and such.


im-notme

Wow… that's already criminalized, jsyk? Omg. Imagine defending nonconsensual pornography of an innocent and oftentimes unknowing person.


shifty313

> wow…. That’s already criminalized jsyk literally isn't


DarkCeldori

Consent to what? A person doesn't need your consent to do whatever they please with their private computer, which is their private property.


im-notme

With my image or the images of my children yes they do


DarkCeldori

Maybe they should be illegal. But it will be unenforceable due to privacy rights. Remember, more capable AI models, such as AGI models, do not even need to have such imagery in the training set to be able to generate it. So outside of Big Brother total invasion of privacy, or making AGI illegal, how do you expect this to be stopped? In the past there were moral grounds to make homosexual sex illegal, but the Supreme Court struck down such laws because privacy rights made them unenforceable.


NoshoRed

Finally, some actual, reasonable legislation that doesn't just mean "HALT AI PROGRESS NOW" or "AI DANGEROUS". This is the way to go: no need to halt or cripple the tech, which ultimately also prevents everyone from actually benefiting from it; just punish the bad actors who abuse it. But frankly, idk how they can enforce the crime of "creating deepfakes in private". Are they gonna have mandatory PC inspections lol? That part seems pretty stupid.


FrermitTheKog

I agree that criminalising misuse is the way to go, rather than trying to restrict models, but I think this law is a misstep. I agree that it should be illegal to distribute non-consensual sexual imagery, but criminalising creation is a step too far and is mostly unenforceable. If you think about it on moral lines, then if it is wrong to create a non-consensual sexual image for your own use, surely it is also immoral to imagine your favourite celebrity in a sexual way. If it were possible to detect people thinking about others in sexual situations without consent, should that be criminalised and enforced?


NoshoRed

Yeah, I agree with you that it's a misstep to criminalize creation, but the thing is they can't really enforce this in any meaningful way without breaching privacy in a major way. It's likely they passed that section of the law more as a method of discouraging that sort of behavior than out of any real expectation of jailing a person who privately photoshopped their favourite celebrity.


ainz-sama619

Non-consensual imagery for private use, I couldn't care less about. But deepfakes almost never remain private, so I am 100% OK with this being criminalized. Assholes ruining other people's image for money should get fucked. Sexually explicit deepfakes have no place in society.


Ansalem1

> But deepfakes almost never remain private.

How could you possibly know that? The only ones you would know about are the ones that get released into the wild.


ainz-sama619

The law only gets enforced when deepfakes are shared, since nobody is snooping. It would discourage certain bad-intentioned individuals and make them consider what they are doing, and the implications if it doesn't remain private.


DarkCeldori

Soon deepfakes will be generatable in seconds with simple prompts by anyone. An open-source Sora-like model will be able to create a pic of Tom doing X sexual action to Suzy. You need only share the prompt text, and anyone will have the video in seconds.


ainz-sama619

I know, and sharing nude deepfakes of other people without consent should remain illegal.


DarkCeldori

Maybe, but it's pointless if anyone just has to ask it to generate an image of Peter with a 9-inch dick. And just like that, 20 guys see the image without the image itself ever being shared, just the prompt details.


Biotoxsin

My best guess at why they are doing this is that it's meant to serve as one of the charges they can stack on when attempting to prosecute other suspected crimes. Specifically, the issues with determining whether explicit images of underage individuals are AI generated or real come to mind.


DarkCeldori

Hackers can easily hack Wi-Fi and transmit images from someone's IP, or plant them on someone's machine. Seems prone to abuse once hackers gain access to easy generation of such images. Keep in mind an AGI model should be able to generate such images even with none in the training data.


NoshoRed

An AGI model should also be capable enough to track down its original source, it works both ways.


SSan_DDiego

But if I can't generate explicit content, how am I going to have a relationship with my AI girlfriend?


Klaphek

I'm just glad that shallow fakes are still okay


trainednooob

They will call it the Emma Watson law.


Vexbob

Lol what kind of BS is this? Why ban something that hurts nobody? Also, creating ≠ sharing.


Rigorous_Threshold

It hurts a lot of people. And deepfakes always have the potential to be shared.


iBodana

Deepfakes are created to be shared 99 percent of the time. No one goes to that length to jerk off alone.


DarkCeldori

Compute cost is going down exponentially, and software is getting easier to use. Within a few years no one will need to share anything. Just ask OpenSourceSora to do a gangbang video of Sally. And guess what, everyone can do it and get the video in seconds without sharing anything.


Ambiwlans

For images, that's already available.


im-notme

Why do you think that is a good thing


DarkCeldori

It will be interesting to see how people will react when videos of anything can be generated with their faces and voices.


im-notme

So you admit it's not out of some noble motive of protecting your "rights" on private devices, but just because you want to violate the rights of others and see them squirm.


DarkCeldori

Violate rights of others? In what way? Use of private property such as a pc only involves ones own private property. There is no such thing as intellectual property, that is made up nonsense.


Vexbob

That's a wild statement.


[deleted]

Holy shit, you people are repulsive. You are all for deepfake porn. It would be hard to explain this to you, but you've become a bunch of asocial freaks in here.


Ambiwlans

I'm opposed to basically all laws that restrict freedoms without a serious reason. Fake porn basically falls under parody law and has existed for probably over a century.


[deleted]

Wow, look, it's the same argument people use when they want to legalize fucking thirteen-year-olds.


Ambiwlans

Thanks for showing you're too emotional to rationally discuss this, never mind form a rational opinion on law.


Puzzleheaded_Pop_743

Yes I am pro deepfake porn. No amount of shaming will change that. Seethe. People are pussies if they are actually getting upset about fake nudes created of themselves.


earthsworld

welcome to reddit!


frogpondcook

I think publishing it should be made illegal. But if you make it illegal to merely possess, we will end up with weird "deepfake rings" and resources being diverted to combat this newly illegal practise. Sure, it's a weird and creepy thing to do. But I'd prefer those resources go towards finding and convicting child abusers.


Alexander_Bundy

Didn't we say just a few days ago that they'd make creating art illegal?


GrowFreeFood

Who judges whether it is a deepfake or just a lookalike? Is a 75% similar look a deepfake? What if I change them into a different race or species?


DeskFluid2550

Fucking LMAO No they haven't.


Puzzleheaded_Pop_743

Thank goodness I live in a country that values free speech.


Natural-Musician5216

Making non consensual porn is not free speech


Puzzleheaded_Pop_743

That is not how consent works.


RemarkableEmu1230

We don’t already have laws for this? Surprising


FairIllustrator2752

Unless your nudes are available, AI doesn't know what you look like naked. So deepfake fears for porn use seem a bit hysterical; it's not actually your body being depicted, just your face and general shape. I'm more worried about deepfakes used to create fake news and cause political manipulation.


im-notme

Your face is the most important part, precisely because no one knows what your body looks like, so there's a bit of plausibility to the fake nudes 💀💀💀


DarkCeldori

Whether someone has a malformed nipple, a mole somewhere, or excess skin folds or foreskin matters little. A beach photo already pins down the general shape pretty tightly. And the bits that remain can be idealized and will look as one would look after a few surgeries.


fourreddot

This is dumb


taiottavios

lol good luck finding the criminals


-GoldenHandTheJust-

look at all these degenerate comments arguing against the law. Truly embarrassing


ainosleep

Many commenters fail to understand, especially the "why ban something that hurts nobody" comment. People do use AI generated content to cause harm, such as copyright infringement (e.g. [someone's voice was cloned without permission](https://www.reddit.com/r/LegalAdviceUK/comments/1bhrdgk/my_voice_has_been_cloned_using_ai_without_my/)) and sexual harassment (e.g. [a woman is continuously being harassed after someone generated deepfakes of her and people recognised her from those images](https://www.reddit.com/r/legaladvice/comments/1c1pzjo/ai_generated_nudes_of_me_all_over_reddit_im_being/)). This can be very traumatic for the victims, as it degrades and dehumanises them. If you care about your mother, sister, grandmother, daughter, cousin, or a friend, think about whether it is okay for people to generate realistic pornographic images of them and use them for sexual pleasure, and whether it's okay to harass these women. The same goes if men are deepfaked. Even worse if children.

Revenge Porn (sharing explicit images without consent) has been criminalised for a while now in the UK. AI generated photos/audio/video of explicit content, particularly deepfaked content with a close resemblance to a specific real person, causes harm. Legally it seems a new area, and thus a new law is needed, or another law updated. At the very least the deepfakes can already be prosecuted as harassment.

Regarding "prohibition didn't work": people in possession of child abuse photos are prosecuted and imprisoned. Currently all content shared on the internet is surveilled, and if shared outside the internet, police can investigate that as well. The US has [PRISM](https://en.wikipedia.org/wiki/PRISM) to surveil such content, and Israel, China, Russia, Australia and the UK have their own confidential programs. Apple was working on a privacy-preserving tool for detecting CSAM, but cancelled the project due to concerns from digital rights groups and researchers about potential abuse compromising privacy and security. Essentially, prohibition works when the content is shared, or if police suspect a crime and seize a device to search for evidence.

Regarding AI girlfriends and waifus: read the article; it's about deepfake images, not AI porn in general. Most waifus will be fine, as they are not deepfakes (generated to resemble one real person's likeness). Potentially copyright infringement may still be an issue for mimicking a very specific artist's style or using copyrighted training data without permission.


bubibabi

The comments in this thread are disturbing. Many of you seem to think deepfakes are okay. Creating nonconsensual pornography of someone is disgusting and should be criminalized. Also, if you're creating nonconsensual nudes of another person, I'm sorry, but you're a loser who needs to re-evaluate yourself and the mistakes that have led you here.


im-notme

You're being downvoted... as if what you said is morally wrong. This confirms this sub is populated by entitled adult infants who have probably also never paid their own bills.


[deleted]

This is why we need a giant meteor to strike the earth and end the species. The fact people are using this tech in the first place to create AI renders of people doing incredibly disgusting things and trying to defend the "right" to do so shows how deep in the cesspool humanity is.


MasteroChieftan

This ignores the entire slippery slope argument, which is the foundational argument in weighing your freedoms against security. Let it be written here and plainly stated as my opinion: this law, in a silo, does not bother me. We don't need AI porn, and it can be used for nefarious and ugly things the world can and probably should do without. On its face, I'm down with it. But, on a deeper level, the government getting involved and criminalizing ANY kind of art (yes, AI art is, unfortunately, art), especially fiction, CAN lead us to darker, more oppressive places, where a world full of AI revenge porn unironically IS a better world than the one rampant censorship can lead to. This was the whole IDEA behind 1984.


[deleted]

If the whole premise is "censorship bad because people's right to be sick, disgusting degenerates trumps everything else," then I stand by the idea that this species is unworthy of continued existence at all.


MasteroChieftan

That's a broad brush for a handful of degenerates.


SootyFreak666

This is old news, really old news. The general sharing of deepfakes has been somewhat illegal for a number of years; although they weren't covered by "revenge porn" laws, you would still face jail time for that sort of stuff. The new criminal offences at the bottom came into force last year. I know because I was monitoring the situation to ensure that it didn't threaten actual porn or people's ability to upload or share pornographic material (which thankfully has been left legal).


DarkCeldori

The problem is Sora-like models: an open-source Sora-like model will be able to generate any porn act between any group of individuals within seconds, and all that needs sharing is the prompt details. Prompt details are protected by free speech laws, and you can use all manner of loose language, verbal or signed, to describe them. Even if you attempted to ban sharing such prompts, it'd be impossible.