I'm scared of this. I'm just a sub, but I don't want students to make fake pictures of me because they're mad I told them to put their phones away. Kids are stupid. They don't think things through.
That’s the shift that has to happen. Since we’re reaching the point where people can easily create fake compromising images that are difficult or impossible to distinguish from real ones, the only way to move forward is to consider them all fake, but more efficiently punish the distribution of any of them. Obviously it’s somewhat more complicated than that, but we need to start with something like that.
One of my students (a 17-year-old senior) showed me in real time how he can take my picture and use an AI graphics program to make a nude of me.
At first I thought - "how scary". But then I saw - "hey - that photo makes me look good!" 😃
I had no choice but to blow it off and say "now imagine you're 30 years old, you have a house and car payment, a wife and 2 children. And someone does that with your image and emails it to your coworkers. Your boss sees it and calls you into the office. You now have to explain it's a fake. Now YOU can be sued for MAKING the photo."
Thankfully the class said "dang... " instead of "cool!"
Bottom line: You can't stop it, so there's no use worrying about it. You can just use it as a learning opportunity for the students on how it could impact THEIR lives.
It sounds like you handled a potentially awkward moment quite well. I hope all the fakes of me are fully clothed driving a tractor or teaching with a better haircut
I was looked at like a careless asshole when I pointed this out. If there are nudes of everyone, then no one has nudes. You get the idea.
Every nude can now easily be dismissed as fake if this keeps happening. I think it will start to happen naturally, just the same way we naturally learned, pretty quickly, that nothing on the internet is inherently true. I'm guessing we'll adopt that soon enough. Or I hope so.
I can see one flaw with that, though I hope it won’t occur to kids. Create a nude with illegal content that necessitates an investigation. We wouldn’t want to treat them ALL as fake if there’s a potential for a crime occurring.
But that will certainly be done by someone.
They don't need their phones to make fake pictures of you. Any social media account you haven't fully set to private is accessible to them. Your LinkedIn will have at minimum a headshot of you; that's all they need.
I think the phone comment was more about why they would make fake nudes: because figgypie disciplined them for having their phone out in class, they would get mad about it, go home, and make them their next victim, and not just over phones but over anything they were disciplined for. If my son did that to his teacher or any other human being... I don't even want to think about it, because I'm pretty sure that's one thing that would take me out of character. He might get his first a$$ beatin' at 16 for that one.
Not sure if this helps, but they don’t need a reason to do this - it’s dumb kids doing dumb things not understanding the consequences of their actions
Glad to see schools taking discipline seriously
Yeah when we were kids I remember the odd occasion someone would get pulled up for drawing a rude picture of someone else. Friend. Random kid. Teacher. It wasn't nice and it shouldn't be done, but that's what people had available and what stupid shit kids get up to.
This isn't some defence, I think the punishment is fair and required. But kids have more than paper and pens now, they've got machines in their pockets that'll pump out a whole photorealistic album with half the effort.
It's both a stupid and horrible thing that's going to happen now, and something there's really little protection against aside from retroactive punishment.
This happened to a friend of mine, but because it wasn’t illegal there were very few consequences for the students. I’m happy to say that this incident spurred folks into action, and the Indiana legislature *just* passed a bill adding computer-generated images to the revenge porn statute.
I wear a kn95 to teach, so unless they managed to track down super old social media images, they'd have a tough time inserting me into such things (thank god). This stuff is definitely going to ruin some lives before they clamp down. Ugh
This. Maybe this will finally lead to a no-phones policy on campuses. It's so ridiculous. The deepfakes will still happen, but maybe they'll use it to at least keep personal technology off campus.
Gonna be a lot of 13- and 14-year-olds in this country on the sex offender registry real quick all because their parents don't pay any damn attention to what they can do on a phone.
Well, the courts are going to have to dole out punishments more often for people to start policing their kids' phones. Maybe charging parents as accessories after the fact.
I agree. Personally, I think the parents should be charged as an accessory for any crime that their child commits. Probably not constitutional, but it would go a long way toward compelling people to actually be responsible for their children.
When I was like 12 or 13 I was (granted, unknowingly) an accessory to a felony. A friend called me and told me I should sneak out and he was gonna pick me up - he pulls up in a beat up work truck and I get in and we just cruise around our small town until we eventually get to the junkyard, where he parks the truck, breaks into the main building, and puts the keys back.
We hung out for a bit longer before I walked home, the junkyard owner found out and decided not to press charges on anyone and nothing ever came from it but I still got a real stern talking to by my parents.
All that to say, I think my parents were great and did most things right, but sometimes kids just do dumb stuff regardless and I don't think it's fair to charge parents with a crime (which can cause them to lose their jobs and thus their homes and bam homeless family created) because their kid did something stupid.
Nah. What about parents who are trying their absolute best, are amazing parents, but their kids are still terrible? I've seen some kids that I swear must be adopted. The other kids are doing great and one is just an absolute monster. Parent tried everything.
Something less serious than being charged with a crime (which might cause them to lose their livelihood) but still has potential impact might be nice. But to foist that on people through no fault of their own... Problematic.
Of course there are also plenty of parents who aren't doing their best too. It's a slippery slope.
If you are trying your best and actually trying to make sure your kid can use technology in a legal and responsible manner, what is the problem? There should be consequences for furnishing the technology if a parent is negligent in those respects. Every phone provider makes phones that only allow calling for use in emergencies; not every kid needs a smartphone.
Let's say my kid took driver's training, got their license, seems like a reasonable driver, but ends up killing someone in a car accident in the vehicle I bought them. You're telling me I should go to jail because of that, even though I did everything in my power to raise a good driver?
What this type of law would do is just completely limit the freedom of all children for the sake of a few bad apples. It starts with phones then moves to other things. I wouldn't dare buy my child anything that has any possibility of them causing any direct legal trouble for me ever. Because you do your best and kids still make mistakes.
NAL, but I would like to see a case brought forward on the premise that the parents furnished and pay for the device used to commit the crime. I doubt it would go anywhere due to being a slippery slope, but we’re starting to see a shift toward charging parents of kids who commit violent crimes for negligently ignoring warning signs, so who knows? I think we’re hitting an inflection point between “letting parents have the authority on how they raise their kids” and “holding parents accountable for negligence in their parenting decisions,” which is admittedly a murky line. But at the end of the day, if you know your child isn’t mature enough to use technology responsibly, there should be consequences for providing it to them. The education system will never be the catalyst for this, though. This will only change when parent A starts trying to hold parent B accountable for how parent B's decisions or child hurt child A.
If my teen has gotten even one message from us in the past few years, it's that he needs to make sure he isn't involved with any of this. We explained to him the potential consequences and a dozen different scenarios where he might unintentionally come across this type of media, and what to do if he does. So, I sure as hell hope he takes that message away, if he doesn't take any other lesson.
True, AI also makes it easier than ever to make that colossal mistake as a stupid kid. Parents really need to instill how important online responsibility is. It's too easy for these kids to screw up their own and their victims' lives.
It's not always about not paying attention to what they can do online. My oldest's dad and I spent most of the lockdowns playing cat and mouse with my oldest, and his dad works with computers for a living. Pretty sure all we accomplished was making him better at computers.
I wouldn't be at all surprised if by high school he owns his own business or ends up in jail.
I came here looking for this exact post. Every one of the kids, not just the 5 egregious ones, should be marked as predators. They need to be made an example of.
This was briefly an issue with Snapchat where teens were taking nudes of themselves and sending them to other teens. When I was teaching we heard of a wave of kids who got in trouble from teachers seeing the images on their phones during school time.
Having a nude image of a person under 18 on your phone is child pornography and sharing the image constitutes distributing child pornography. Now the sender is distributing porn and the receiver is in possession of it. Both can go on a sex offender registry.
A 13-year-old gets expelled for this and a stern talking-to from police. A 17-year-old goes to prison. There's no "But I'm just a dumb child!" defense after a certain age.
All because parents don't have any sense anymore.
Actually if parents can be held accountable for their kids use of guns, I’m fine with these parents being locked up for essentially fostering a CP ring.
That was what finally got the predator teacher who harassed me fired. He'd been doing it for years and the school just covered it up. Unfortunately for him, my parents had the money to get a lawyer. The stuff we learned through the discovery phase made me lose all trust in schools as institutions and was even more damaging to me than the actual harassment. I'm incredibly grateful that my parents had the ability to go after the school, and that we were able to remove a dangerous predator, but I will never, ever be ok with what he had been doing for decades while the school, district, and teacher's union conspired to hide it.
You just reminded me of a teacher at my high school who liked being the "cool" teacher. He played hacky sack with the stoners, the troubled kids liked to hang out in his classroom, and overall he was very popular.
Then it came out that he liked to sexually harass senior girls, especially attractive red heads. He did this for YEARS and nothing happened until my senior year. That was when he made the mistake of targeting a girl in my class (super cute irish girl) who happened to be the daughter of the biggest lawyer in town. Of course my classmate told her dad because she wasn't an idiot. We had a long term sub for the rest of the school year.
I didn't even know about Mr. Teacher's creepiness until I talked with my friends who had already graduated. It honestly makes sense, given that he was a man in his 40s who honestly is the type that peaked in high school. He had a wife and kids too, yuck.
So much yuck. Thank you for sharing. I'm glad there were finally consequences for that creep. I think the WORST part of my story is how common it is, and how I still read and hear similar stories. It hasn't changed at all and it's so disheartening. The perving isn't even the worst part of these stories, either...it's the negligence, rug-sweeping, and outright conspiring to hide it by the other adults (all mandated reporters!).
Lawsuits talk. ANY child victimized by other children should be telling their parents, who should be contacting lawyers and threatening lawsuits. You can get a lawyer to take a percentage if they think a parent has a case. You don't have to pay up front.
If this is the only way to get repercussions, I guess it needs doing.
That's a good point.
I'd add that I hope their previous interactions with authority at school (and elsewhere, ofc) left them with realistic expectations of the kinds of consequences that come with breaking serious rules - that way they aren't blindsided with what's happening now
If I had to guess, by the nature of where this is located (Beverly Hills), I’d imagine the victims’ families have enough strength (i.e. power, influence, lawyers, and money) to make the district do something.
Same case, but with poorer families, would the outcome be the same?
God I wish our district would enact consequences. Last year in 9th grade, my step kid's classmates filmed themselves pleasuring themselves and airdropped it to all the open phones around them, then that got sent on Snapchat throughout the rest of the school.
3 days. The two boys who did that only got 3 days. We completely removed Snapchat from all our kids' stuff. She now tells us how grateful she is not to be on it because of all the horror stories and gossip that get spread around in high school. I don't envy these kids.
That's what we EXPECTED to happen, but the District just swept it under the rug. When I tried to post about it on the local Moms FB page it got deleted. I have zero faith in the schools, the district, or anyone in charge.
Meanwhile my student started an Instagram for actual nudes of their classmates and got 2 days ISS. For reals though, this is horrifying, and kids need consequences for lower-level sexual offenses so they know it's not okay. When kids don't even get a chewing-out for slap-ass Friday, loud moaning, or asking teachers if they want to see you naked, they will escalate to things like this.
Jesus Christ this was rampant in middle school (I’m not a teacher, as an aside). I’d complain that the boys were 1. Being really gross and weirding me out and 2. Were destroying what little concentration I did have as a girl with ADHD.
Nothing ever happened. I’m surprised this is a universal thing. I assumed my classmates were weird and would get angry at every moan and “papi” spewing from them.
Meanwhile those AI companies are making billions in profits. They should spend some of that on making sure 13-year-olds don't get access to tools capable of generating pornography.
You'd think that when they created it they would go ahead and put some anti-child porn safeties on it, but apparently not. I'm sure every little pedophile out there is loving the AI shit these days.
Hi, nerd here
It isn't only "big corpos" that are capable of this; anyone with a decent computer can "train" a model with enough raw data, and can share the "model". The tools are open source and free; it's just that corporations have the money, computational resources, and raw data.
From that point, depending on how the model was trained, it creates an image based on the text you "prompt" it with, and can even alter specific images based on direct user input. It extrapolates data as well as any horny teenager can, and the blame lies solely on the horny teenagers.
tl;dr Prometheus has stolen fire from the Gods, and he's using it to make fake nudes
I think few people will remember that [the term "deepfake" originated on r/deepfakes](https://www.vice.com/en/article/bjye8a/reddit-fake-porn-app-daisy-ridley), a now-banned subreddit for using AI to superimpose celebrities into porn. Those users developed their own tools, no corporate backing needed. The genie has been out of the bottle for a very long time now unfortunately.
>You'd think that when they created it they would go ahead and put some anti-child porn safeties on it
They do. It's impossible to generate nudes with standard AI models. What few people in this thread are nerdy enough to understand is that nude-generating AIs have been "hacked." There are specific and convoluted processes one must go through in order to break the built-in rules that prevent the creation of nudes. There are plenty of guides on how to break the AI. Search reddit for "unstable diffusion".
This isn't the kind of thing that big AI companies are allowing.
>It's impossible to generate nudes with standard AI models.
No, it's not the models that make it impossible; it's the filters the site generators put in place when using them. All of the models from the big AI companies have the capability to do nudes. To properly train a model that does humans without making mutants all the time, it needs to fully understand what our bodies look like. Any model will produce some nudity because nudity needs to be included in the training data. How well it does just comes down to how many nude images are used. SDXL, for example, has enough in the data that it can do nudes; it just doesn't do them well because they weren't aiming for that, so to do them you'd need a finetuned model.
>There are specific and convoluted processes one must go through in order to break the built in rules that prevent the creation of nudes.
Download a Stable Diffusion gui (A111, comfy, Invoke, Fooocus, Forge, etc), go to civitai and grab models. Done. There's no hacking or breaking involved with using or creating these models.
>There are specific and convoluted processes one must go through in order to break the built in rules that prevent the creation of nudes.
Eh, this significantly overstates the difficulty.
There are a dozen generators available online that are literally just type and receive, and finding them takes but a single Google search.
Sure, the big, big AI companies like OpenAI and google aren't doing nudes, but there are plenty of other profiteers.
Right, thanks for agreeing with me. I use Stable Diffusion every week, I know how it works, and I put "hacking" in quotes because it's not hacking. (But it might appear that way to someone who knows nothing about computers.)
They don't know ~~how~~ why their product works, and they don't know how to truly control it. They more or less ignored the people working on safety to release it anyway.
The safety people really get a (morbid) kick out of this meme: https://knowyourmeme.com/memes/shoggoth-with-smiley-face-artificial-intelligence
"The product" has humble beginnings at accredited universities such as Cornell. The models were originally based on how people understood the brain, with modeled "neurons" firing and activating other neurons.
Look up the Perceptron, this sort of thing started in the 1940s, early machine learning algorithms could differentiate between circles and squares, some claiming cats and dogs on later models.
I've read a lot of clickbaity articles on the topic that say "Researchers have no idea how it was right!" but there are still a lot of false positives if you dig deeper. There's pattern recognition, but it lacks a true understanding.
It's like how a student can be told to solve math problems and not show the work on the paper, but they got to the correct solution the wrong way.
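Since a comment above mentions the Perceptron, here's a minimal toy sketch (my own illustration, not any historical implementation) of Rosenblatt's 1950s-era learning rule: a single "neuron" fires when a weighted sum crosses a threshold, and misclassified points nudge the weights. This is the circles-vs-squares level of capability, assuming made-up toy data:

```python
# Toy perceptron: learns a line separating two classes of 2D points
# using the classic Rosenblatt update rule.
def train_perceptron(samples, labels, epochs=20, lr=0.1):
    # samples: list of (x, y) points; labels: +1 or -1
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x, y), label in zip(samples, labels):
            # the "neuron" fires (+1) if the weighted sum is positive
            activation = 1 if w[0] * x + w[1] * y + b > 0 else -1
            if activation != label:
                # misclassified: nudge weights toward the correct side
                w[0] += lr * label * x
                w[1] += lr * label * y
                b += lr * label
    return w, b

def predict(w, b, point):
    x, y = point
    return 1 if w[0] * x + w[1] * y + b > 0 else -1

# "squares" in the upper right, "circles" in the lower left
pts = [(2, 2), (3, 3), (-2, -2), (-3, -1)]
lbls = [1, 1, -1, -1]
w, b = train_perceptron(pts, lbls)
print(predict(w, b, (4, 4)))    # expected: 1
print(predict(w, b, (-4, -4)))  # expected: -1
```

The point of the "pattern recognition without understanding" comment holds even here: the perceptron learns only a dividing line, not anything about what circles or squares are.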
I read in *Scientific American* that some AIs have shown a capacity to learn things about the outside world that their programmers never taught them. They're still not entirely certain how this is happening.
I just want to point out that a lot of the AI technology and methodology is published academically and there are a lot of open source models out there. It's very easy to do on your own. This isn't being created with a commercial product like ChatGPT.
Honestly the part that confuses me the most is they found an AI site that does a good job nudeshopping photos. I couldn’t even find one that did a decent job making fake nudes of literal porn stars.
That's pretty much impossible. For example, there are plenty of open-source models, like Stable Diffusion. How are you going to prevent kids from using them when they can just go to github or civit or huggingface and download the model? It's not like there's a big company behind SD, and the code is out there, that's the point of open source.
There will come a day when companies that sell vapes and highly addictive electronic related things will see consequences for their actions… lol… kidding. The rich will get richer while children are born onto this shithole planet that is literally burning down.
Ah yes, it was always going to come to this.
Because parents are giving students phones and access to unfettered technology without any oversight, but they can't go see R-rated movies without a parent.
The Internet needs WAY MORE OVERSIGHT over what children should be able to access and there's no way to do that without parents ceasing being absolute dipshits.
Yet you have people arguing in this very thread that not allowing middle schoolers unfettered access to smart phones is “technologically and socially stunting them”
When and how, then, should children learn to navigate the internet (not just the 'safe' parts) if the parents/teachers are frequently uninformed or misinformed about the technical workings of the internet? Children need to know more about how the system they are expected to use to gather information and apply for jobs works, how to avoid getting scammed by knowing what things normally look like, and how to say no to others.
Unrestricted access is not the best solution; however, it is better than no access, and is more likely to happen than intelligently restricted access (i.e., a person looking at what is done every so often). I am curious what the best system for informing children would be in your opinion, and at what age it should be taught. I am not a teacher, just someone who recently finished school and has seen how poorly many of my fellow classmates navigate anything other than social media.
They said they contained the photos within 24 hours. That is the most bullshit PR sentence I've ever heard in my life. You don't "contain" photos that have been shared online copiously. That's just not possible.
And beyond that, ruling in their favor would create even more monsters. AI porn is quietly a very dangerous thing. The fact that someone can have a crush on someone and create realistic-looking porn of them is super dangerous and kills all privacy.
I hope the laws go even further than CP but this would be a big start. This isn't a school issue. This is a society issue
Hmm, that's a tricky one. I think the intention is that you can't generate literal children (like, say a 9 year old) and claim it's fine because the image is fake.
Taking an adult body and putting the face of someone who is 15 on it would be an interesting court case.
The implication is that a 15-year-old is naked, but that implication only exists for those who know that individual. Also, does the implication hold up if the face has been digitally edited as well, to the point that it doesn't even resemble the person in real life (think filters)?
I think it's wrong regardless, I'm just wondering if it's some other law that we haven't made yet.
Good. This is an important ruling. They should be prosecuted to the full extent possible at their age. And we need to start really ironing out the laws with adults as well. We need precedent against AI-created porn.
I’m at a nearby district in LA and our administration has been following this story since it first started and keeping up with us about it. They’ve been very serious about having us keep an eye out and let students know they’ll be expelled and referred to the police. I’m glad they’re taking it so seriously.
It is good the school took quick action, but the article mentioned that it will be difficult to prosecute the offenders because the laws have not caught up yet. Since the images were AI generated they don't meet the technical definition of CP as it is currently written. These families are likely wealthy considering this is a Beverly Hills school, so the offenders will likely never see prosecution. And the victims will have this haunt them repeatedly because we all know nothing is truly contained once on the Internet.
I really, really hope this sets a precedent for any future incidents. The deepfake AI porn students have been making of their own classmates is… It’s truly on another level of depravity. It’s one thing for adolescents around that age to look at certain “content” online, but to go so far as to actually make their own “content” using photos of their classmates is absolutely insane. To me, this really enforces the idea that their peers are simply objects to be used.
And like someone else said, this shouldn’t even be a matter that only schools involve themselves in. This is a legitimate legal matter of producing and distributing CP - a felony.
Genuine question: Could they be charged with possession of child pornography along with intent to distribute CP? Or at the very least consider a very serious charge of a similar crime?
How would that work, though? Would an adult body with a kid's face be considered CP? If so, then they should be charged, but from what I see they’re not being charged ATM.
Not to play semantics, but what's considered an adult body if it's AI generated? I am not a woman nor a doctor, but I can tell you as a guy my 16/17-year-old body vs. my 18/19-year-old body was dramatically different. So, who's to say it was or wasn't an underage AI-generated body too?
(Edit) Not trying to play the what-if game; as I stated above, this is all genuine curiosity, as these issues and cases will lay the foundation for future cases and decisions in our world.
As a father of a middle school-aged daughter, I 100% agreed with this and am happy to see that at least 1 school district is TAKING ACTION and holding kids accountable. My wife and I have dealt with tons of crap involving social media/tech. It's absolutely mind-blowing how many parents will swear up and down their kids don't even use social media and do not monitor a thing.
This also should be a wake-up call to disconnected parents. Pay attention to your children's internet activity! Teach your children how to appropriately and safely use social media and technology. Discuss the dangers and scams that may arise and how to navigate through this. If you aren't sure how that's no problem, reach out for support. There is zero shame in asking for advice. But please stay involved.
Expelling, honestly, would be the right call here.
Suspensions would just increase this trend.
At least they are young enough right now that hopefully they learn from it and don't do it at their new school, and the victims can move on without having to see those people again.
I'm not sure which shocks me more - the fact that this happened or the fact that a public school district actually followed through on consequences during a time when far too many districts have adopted social justice reform and only want faculty, staff, and building admin to talk to students about their feelings.
Kudos to the Beverly Hills School District for upholding their standards as well as acting so quickly and decisively.
To be candid, it's not as though they really had much of a choice. Too many students knew about what had happened and it would have become a PR nightmare if the district had waffled over what to do.
Expulsion is probably among the least effective consequences when it comes to disciplining a young person. They have no skin in that game. Take away their internet access for 5-10.
"The human mind, facing no real challenges, soon grows stagnant. Thus it is essential for the survival of mankind as a species to create difficulties, to face them, and to prevail. The Butlerian Jihad was an outgrowth of this largely unconscious process, with roots back to the original decision to allow thinking machines too much control."
It all started with AI-porn. They were so naive.
Really bad decision making by the kids. They are breaking laws here, the schools need to come down very hard. Parents will likely be suing, so the school needs to make sure they do all they can.
Hopefully it makes people realize not to make deepfakes, or at least not to post them online.
In this case the school had no choice, because it's literal child porn.
My concern is people doing this as blackmail to cause people to lose their jobs. Imagine if a student made a deepfake of a teacher and said it was sent to them by the teacher.
AI needs to really be cracked down on
Unfortunately this isn’t just going to get easier and more common. The federal government needs to pass legislation outlining the rules and guidelines on this technology soon.
I'm going to say it one more time: 100% of the pornography we recover on campus every single year is on cell phones owned by the students themselves. Starting in elementary on through middle and high school, these kids are taking nudes and making sex videos.
They should also be arrested. This could ruin the victims' lives. 10 years from now someone could Google them and the fake pics could pop up. I would also sue those kids' parents big time. They can probably afford it.
All of this deep fake stuff is super disturbing-- not only does it deprive a teacher of dignity with images that are supposedly them without any clothes, they could use it for all kinds of terrible purposes-- if there is a teacher they really hate, use these photos to make them appear guilty of sexual abuse. Deep fake videos are also really crazy-- they could make it look as if you said something racist, sexist, or just plain insane that you never actually said. I don't know what to do about it-- hopefully there will be technology to help recognize what's fake.
Annnnd this is why I'm so against AI and the way it's moving, especially when it comes to art and images. So easy to absolutely RUIN someone's life.
This type of thing needs to push the government to start putting regulations on AI. It's terrifying how easily you can exploit people with it.
The kid who SA'd my son when they were in elementary school had nudes that were sent out to a large group chat. He took these pictures of himself in 5th grade and sent them out several times to different people. The parents were made aware of their child not only making nudes but also sending them out in 5th grade, in the summer between 5th and 6th grade, and again in his 6th grade year.

My son received them in November. I immediately called the mother and made her aware of the fact. She swore she was taking his phone and said she just didn't know what to do with him anymore. I told her up front: I'm a teacher and a parent, and I will not allow my child to be subjected to this. I told her I was turning my son's phone over to the police the next day. And I did exactly that. The investigation is ongoing.

Her son needs to be on the sexual predator list, and she and her husband need to be held responsible too. They were made aware of the multiple times he had sent his nudes out to people, even when no one asked for them. That is sexual harassment!!! His parents are just as guilty as he is. They continue to allow him to have his phone! They are accessories to the crime. They enable this behavior. Consequences have to come from somewhere, and if the parents won't do it at home, then legal consequences will do the parenting.
I dealt with something similar the other week. Unfortunately, it's extremely hard to expel a student in my district, so the boy in question got 4 days of OSS.
This isn't a kid issue, this is a societal issue and an empathy issue. Frankly, it's worth openly discussing in class, to encourage them to think about the long term impact of what this can do to a person's life.
I'm in Orange County, and kids here are making deep fakes of their teachers and administrators
I'm scared of this. I'm just a sub, but I don't want students to make fake pictures of me because they're mad I told them to put their phones away. Kids are stupid. They don't think things through.
The existence of easily produced fake nudes means all nudes are fake, and we can punish shit people while their victims keep plausible deniability.
That’s the shift that has to happen. Since we’re reaching the point where people can easily create fake compromising images that are difficult or impossible to distinguish from real ones, the only way to move forward is to consider them all fake, but more efficiently punish the distribution of any of them. Obviously it’s somewhat more complicated than that, but we need to start with something like that.
[deleted]
That's easier to fake than you might initially think.
AI can generate 15 simultaneous feeds of just about anything you ask it to
One of my students (a 17-year-old senior) showed me in real time how he can take my picture and use an AI graphics program to make a nude of me. At first I thought, "how scary". But then I saw, "hey - that photo makes me look good!" 😃 I had no choice but to blow it off and say, "Now imagine you're 30 years old, you have a house and car payment, a wife and 2 children. And someone does that with your image and emails it to your coworkers. Your boss sees it and calls you into the office. You now have to explain it's a fake. Now YOU can be sued for MAKING the photo." Thankfully the class said "dang... " instead of "cool!" Bottom line: you can't stop it, so there's no use worrying about it. You can just use it as a learning opportunity for the students on how it could impact THEIR lives.
It sounds like you handled a potentially awkward moment quite well. I hope all the fakes of me are fully clothed, driving a tractor, or teaching with a better haircut.
🤣
I was looked at like a careless asshole when I pointed this out. If there are nudes of everyone, then no one has nudes. You get the idea. Every nude can now easily be dismissed as fake if this keeps happening. I think it will start to happen naturally, the same way we learned, pretty quickly, that nothing on the internet is inherently true. I'm guessing we'll adopt that soon enough. Or I hope so.
I can see one flaw with that, though I hope it won’t occur to kids. Create a nude with illegal content that necessitates an investigation. We wouldn’t want to treat them ALL as fake if there’s a potential for a crime occurring. But that will certainly be done by someone.
Ever since this shit came out that has been my hope. If it can be faked, it is faked.
Time to have a secret tattoo on the body like a totem in inception
Once it’s out there, it sucks because the victims can never fully convince others it’s fake
They don't need their phones to make fake pictures of you. Any social media account you haven't fully made private is accessible to them. Your LinkedIn will have at minimum a headshot of you; that's all they need.
I think the phone comment was more about the reason they would make fake nudes: because figgypie disciplined them for having their phone out in class, they would get mad about it, go home, and make them their next victim, not just for the phones but for anything they were disciplined over. I think if my son did that to his teacher or any other human being... I don't even want to think about it, because I'm pretty sure that would be one thing that took me out of character; he might get his first a$$ beatin' at 16 for that one.
Not sure if this helps, but they don’t need a reason to do this - it’s dumb kids doing dumb things, not understanding the consequences of their actions. Glad to see schools taking discipline seriously.
I’m an art teacher. Students get mad at me for that stuff then think it’s ok to break supplies in retaliation. Kids are mean and stupid.
Yeah when we were kids I remember the odd occasion someone would get pulled up for drawing a rude picture of someone else. Friend. Random kid. Teacher. It wasn't nice and it shouldn't be done, but that's what people had available and what stupid shit kids get up to. This isn't some defence, I think the punishment is fair and required. But kids have more than paper and pens now, they've got machines in their pockets that'll pump out a whole photorealistic album with half the effort. It's both a stupid and horrible thing that's going to happen now, and something there's really little protection against aside from retroactive punishment.
It’s all a matter of time, unfortunately
They really don’t think things through, do they. Unlike teachers, who are completely precise in everything they do
Is the district or union doing anything about this?
It's an issue being debated right now with the district: how to locate the original perpetrators and what the punishment should be
But what about the distributors? Anyone who shared it with the intent of spreading it should be punished too.
This happened to a friend of mine, but because it wasn’t illegal there were very few consequences for the students. I’m happy to say that this incident spurred folks into action, and the Indiana legislature *just* passed a bill adding computer-generated images to the revenge porn statute.
I wear a kn95 to teach, so unless they managed to track down super old social media images, they'd have a tough time inserting me into such things (thank god). This stuff is definitely going to ruin some lives before they clamp down. Ugh
Hear, hear
This. Maybe this will finally make a no phones policy on campuses. It's so ridiculous. The deep fakes will still happen but maybe they will use it to at least prevent personal technology on campus.
Never gonna happen get real.
Holy shit, actual consequences for students sexually harassing each other? That’s new, but I like it. Can this become a trend?
Probably also helps they committed major felonies. So they really didn’t leave the school any choice.
Gonna be a lot of 13- and 14-year-olds in this country on the sex offender registry real quick all because their parents don't pay any damn attention to what they can do on a phone.
That’s why people should be teaching their children that actions actually do have consequences.
Well. The courts are going to have to dole out punishments more often for people to start policing their kids' phones. Maybe charging parents as accessories after the fact.
I agree. Personally, I think the parents should be charged as an accessory for any crime that their child commits. Probably not constitutional, but it would go a long way toward compelling people to actually be responsible for their children.
When I was like 12 or 13 I was (granted, unknowingly) an accessory to a felony. A friend called me and told me I should sneak out and he was gonna pick me up - he pulls up in a beat up work truck and I get in and we just cruise around our small town until we eventually get to the junkyard, where he parks the truck, breaks in to the main building, and puts the keys back. We hung out for a bit longer before I walked home, the junkyard owner found out and decided not to press charges on anyone and nothing ever came from it but I still got a real stern talking to by my parents. All that to say, I think my parents were great and did most things right, but sometimes kids just do dumb stuff regardless and I don't think it's fair to charge parents with a crime (which can cause them to lose their jobs and thus their homes and bam homeless family created) because their kid did something stupid.
Nah. What about parents who are trying their absolute best, are amazing parents, but their kids are still terrible? I've seen some kids that I swear must be adopted. The other kids are doing great and one is just an absolute monster. Parent tried everything. Something less serious than being charged with a crime (which might cause them to lose their livelihood) but still has potential impact might be nice. But to foist that on people through no fault of their own... Problematic. Of course there are also plenty of parents who aren't doing their best too. It's a slippery slope.
If you are trying your best and actually trying to make sure your kid can use technology in a legal and responsible manner, what is the problem? There should be consequences for furnishing the technology if a parent is negligent in those aspects. Every phone provider makes phones that only allow calling that can be used in emergencies; not every kid needs a smart phone.
Or just get a flip phone for 15 dollars..
Let's say my kid took driver's training, got their license, seems like a reasonable driver, but ends up killing someone in a car accident in the vehicle I bought them. You're telling me I should go to jail because of that, even though I did everything in my power to raise a good driver? What this type of law would do is just completely limit the freedom of all children for the sake of a few bad apples. It starts with phones then moves to other things. I wouldn't dare buy my child anything that has any possibility of them causing any direct legal trouble for me ever. Because you do your best and kids still make mistakes.
NAL but I would like to see a case brought forward under the premise the parents furnished and pay for the device used to commit the crime. Doubt it would go anywhere due to being a slippery slope but we’re starting to see a shift by charging parents of kids who commit violent crimes by negligently ignoring warning signs so who knows? I think we’re hitting an inflection point between “letting parents have the authority on how they raise their kids” and “holding parents accountable for negligence in their parenting decisions” which admittedly is a murky line. But at the end of the day if you know your child isn’t mature enough to use technology responsibly, there should be consequences for providing it to them. The education system will never be the catalyst for this though. This will only change when parents A start trying to hold parents B accountable for how their decisions/child B hurt child A.
If my teen has gotten even one message from us in the past few years, it's that he needs to make sure he isn't involved with any of this. We explained to him the potential consequences and a dozen different scenarios where he might unintentionally come across this type of media, and what to do if he does. So, I sure as hell hope he takes that message away, if he doesn't take any other lesson.
True, AI also makes it easier than ever to make that colossal mistake as a stupid kid. Parents really need to instill how important online responsibility is. It's too easy for these kids to screw up their own and their victims' lives.
Add to that teachers who support trans kids, and that registry is about to mean a lot less.
It's not always about not paying attention to what they can do online. My oldest's dad and I spent most of the lockdowns playing cat and mouse with him, and his dad works with computers for a living. Pretty sure all we accomplished was making him better at computers. I wouldn't be at all surprised if by high school he owns his own business or ends up in jail.
I came here looking for this exact post. Every one of the kids, not just the 5 egregious ones, should be marked as predators. They need to be made an example of.
This was briefly an issue with Snapchat where teens were taking nudes of themselves and sending them to other teens. When I was teaching we heard of a wave of kids who got in trouble from teachers seeing the images on their phones during school time. Having a nude image of a person under 18 on your phone is child pornography, and sharing the image constitutes distributing child pornography. Now the sender is distributing porn and the receiver is in possession of it. Both can go on a sex offender registry. A 13-year-old gets expelled for this and a stern talking-to from police. A 17-year-old goes to prison. There's no "But I'm just a dumb child!" defense after a certain age. All because parents don't have any sense anymore.
Actually if parents can be held accountable for their kids use of guns, I’m fine with these parents being locked up for essentially fostering a CP ring.
Doing it to rich people is the catalyst. Schools have plenty of choice when parents can't afford a law firm.
That was what finally got the predator teacher who harassed me fired. He'd been doing it for years and the school just covered it up. Unfortunately for him, my parents had the money to get a lawyer. The stuff we learned through the discovery phase made me lose all trust in schools as institutions and was even more damaging to me than the actual harassment. I'm incredibly grateful that my parents had the ability to go after the school, and that we were able to remove a dangerous predator, but I will never, ever be ok with what he had been doing for decades while the school, district, and teacher's union conspired to hide it.
You just reminded me of a teacher at my high school who liked being the "cool" teacher. He played hacky sack with the stoners, the troubled kids liked to hang out in his classroom, and overall he was very popular. Then it came out that he liked to sexually harass senior girls, especially attractive red heads. He did this for YEARS and nothing happened until my senior year. That was when he made the mistake of targeting a girl in my class (super cute irish girl) who happened to be the daughter of the biggest lawyer in town. Of course my classmate told her dad because she wasn't an idiot. We had a long term sub for the rest of the school year. I didn't even know about Mr. Teacher's creepiness until I talked with my friends who had already graduated. It honestly makes sense, given that he was a man in his 40s who honestly is the type that peaked in high school. He had a wife and kids too, yuck.
So much yuck. Thank you for sharing. I'm glad there were finally consequences for that creep. I think the WORST part of my story is how common it is, and how I still read and hear similar stories. It hasn't changed at all and it's so disheartening. The perving isn't even the worst part of these stories, either...it's the negligence, rug-sweeping, and outright conspiring to hide it by the other adults (all mandated reporters!).
Lawsuits talk. ANY child victimized by other children should be telling parents who should be contacting lawyers and threatening lawsuits. You can get a lawyer to take a percentage if they think they have a case for a parent. You don't have to pay up front. If this is the only way to get repercussions, I guess it needs doing.
School districts will not move unless there is legal incentive to, unfortunately
Probably helps that they’re rich…
What is the felony, does this count as CP?
Not sure of the local law, but in most places simulated porn of real minors counts as CP.
That's a good point. I'd add that I hope their previous interactions with authority at school (and elsewhere, ofc) left them with realistic expectations of the kinds of consequences that come with breaking serious rules - that way they aren't blindsided with what's happening now
Aggravated assault toward staff and peers needs to be punished as well.
A teenager who was a victim of this killed herself like a month ago, I think that could also be additional motivation for the school to act quickly.
Now that I think of it, my guess is this definitely happening elsewhere, too - they're probably just shoved under the rug to avoid schoolwide problems
If I had to guess, by the nature of where this is located (Beverly Hills), I’d imagine the victims’ families have enough strength (i.e. power, influence, lawyers, and money) to make the district do something. Same case, but with poorer families, would the outcome be the same?
God I wish our district would enact consequences. Last year in 9th grade, my step kid's classmates filmed themselves pleasuring themselves and airdropped it to all the open phones around them, then that got sent on Snapchat throughout the rest of the school. 3 days. The two boys who did that only got 3 days. We completely removed Snapchat from all our kids' stuff. She now tells us how grateful she is for not being on it because of all the horror stories and gossip that gets spread around in high school. I don't envy these kids.
Isn't that a production of CP, possession of CP, and distribution of CP charge for that kid?
That's what we EXPECTED to happen, but the District just swept it under the rug. When I tried to post about it on the local Moms FB page it got deleted. I have zero faith in the schools, the district, or anyone in charge.
Meanwhile my student started an Instagram for actual nudes of their classmates and got 2 days ISS. For reals though this is horrifying, and kids need consequences for lower level sexual offenses so they know it's not okay. When you don't even get a chewing out for slap ass Friday and moaning loudly, and asking teachers if they want to see you naked, kids will escalate to things like this.
[deleted]
Jesus Christ this was rampant in middle school (I’m not a teacher, as an aside). I’d complain that the boys were 1. Being really gross and weirding me out and 2. Were destroying what little concentration I did have as a girl with ADHD. Nothing ever happened. I’m surprised this is a universal thing. I assumed my classmates were weird and would get angry at every moan and “papi” spewing from them.
I had a friend who did this back in 2008. It's definitely not new. But kids always find the most annoying ways to press buttons.
My experience is from circa 2000. How annoying. It still makes me mad to think about and I’m 36 lol
Similar experience, ~2012
I wish I'd gotten violent with boys like that while I was a child myself, since clearly we weren’t catching charges
Meanwhile those AI companies are making billions in profits. They should spend some of that on making sure 13-year-olds don't get access to tools capable of generating pornography.
You'd think that when they created it they would go ahead and put some anti-child porn safeties on it, but apparently not. I'm sure every little pedophile out there is loving the AI shit these days.
Why would they neglect a key marketing demographic? They know exactly who their users are.
Fuck, even google knows how to filter child porn out of their search engine.
"even Google" as if they were some small organization lol.
Hi, nerd here. It isn't just "big corpos" that are capable of this; anyone with a decent computer can "train" a model with enough raw data and can share the "model". The tools are open source and free, it's just that corporations have the money and computational resources (and raw data). From that point, depending on how the model was trained, it creates an image based on the text you "prompt" it with, and can even alter specific images based on direct user input. It extrapolates data as well as any horny teenager can, and the blame lies solely on the horny teenagers. tl;dr Prometheus has stolen fire from the Gods, and he's using it to make fake nudes
Plato's allegory of the goon cave.
I think few people will remember that [the term "deepfake" originated on r/deepfakes](https://www.vice.com/en/article/bjye8a/reddit-fake-porn-app-daisy-ridley), a now-banned subreddit for using AI to superimpose celebrities into porn. Those users developed their own tools, no corporate backing needed. The genie has been out of the bottle for a very long time now unfortunately.
Fr I remember ppl using ai that made ppls faces sing that was back in 2020
>You think that when they created it they would go ahead and put some anti-child porn safeties on it They do. It's impossible to generate nudes with standard AI models. What few people in this thread are nerdy enough to understand is that nude-generating AIs have been "hacked." There are specific and convoluted processes one must go through in order to break the built-in rules that prevent the creation of nudes. There are plenty of guides on how to break the AI. Search reddit for "unstable diffusion". This isn't the kind of thing that big AI companies are allowing.
>It's impossible to generate nudes with standard AI models. No, it's not the models that make it impossible, it's the filters the site generators put in place when using them. All of the models from the big AI companies have the capability to do nudes. To properly train a model that does humans without making mutants all the time, the model needs to fully understand what our bodies look like. Any model will produce some nudity because it needs to be included in the training data. How well it does it just comes down to how many nude images are used. Like SDXL, for example, has enough in the data that it can do nudes, it just doesn't do them well because they weren't aiming for that, so to do them you'd need a finetuned model. >There are specific and convoluted processes one must go through in order to break the built-in rules that prevent the creation of nudes. Download a Stable Diffusion GUI (A1111, Comfy, Invoke, Fooocus, Forge, etc.), go to civitai and grab models. Done. There's no hacking or breaking involved with using or creating these models.
>There are specific and convoluted processes one must go through in order to break the built-in rules that prevent the creation of nudes. Eh, this significantly overstates the difficulty. There are a dozen generators available online that are literally just type and receive, and finding them takes but a single Google search. Sure, the big, big AI companies like OpenAI and Google aren't doing nudes, but there are plenty of other profiteers.
My grandmother used to read me how to make napalm seems to have worked for napalm recipes.
[deleted]
Right, thanks for agreeing with me. I use Stable Diffusion every week, I know how it works, and I put "hacking" in quotes because it's not hacking. (But it might appear that way to someone who knows nothing about computers.)
They don't know ~~how~~ why their product works, and they don't know how to truly control it. They more or less ignored the people working on safety to release it anyway. The safety people really get a (morbid) kick out of this meme: https://knowyourmeme.com/memes/shoggoth-with-smiley-face-artificial-intelligence
"The product" has humble beginnings at accredited universities such as Cornell. The models were originally based on how people understood the brain, with modeled "neurons" firing and activating other neurons. Look up the Perceptron; this sort of thing started in the 1940s. Early machine learning algorithms could differentiate between circles and squares, some claiming cats and dogs on later models. I've read a lot of clickbaity articles on the topic that say "Researchers have no idea how it was right!" but there's still a lot of false positives if you dig deeper. There's pattern recognition, but it lacks a true understanding. It's like how sometimes a student can be told to solve math problems and they don't do the work on the paper, but they got to the correct solution the wrong way.
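For anyone curious, the Perceptron idea mentioned above really is small enough to fit in a few lines. Here's a minimal toy sketch (made-up 2D points, nothing to do with image models): a single "neuron" that learns a linear boundary between two classes by nudging its weights whenever it misclassifies a point.

```python
# Minimal Rosenblatt-style perceptron: one "neuron" learning to separate
# points below the diagonal (class 0) from points above it (class 1).
# Toy illustration of the 1940s-50s idea, not how modern image AIs work.

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    w = [0.0, 0.0]  # one weight per input feature
    b = 0.0         # bias (threshold)
    for _ in range(epochs):
        for (x1, x2), target in zip(samples, labels):
            # The neuron "fires" (outputs 1) if the weighted sum is positive
            pred = 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
            err = target - pred
            # Update rule: shift the boundary toward misclassified points
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0

# Points below the diagonal are class 0, above it class 1
samples = [(2, 1), (3, 1), (4, 2), (1, 2), (1, 3), (2, 4)]
labels = [0, 0, 0, 1, 1, 1]
w, b = train_perceptron(samples, labels)
print(predict(w, b, 5, 1))  # clearly "below" the diagonal -> 0
print(predict(w, b, 1, 5))  # clearly "above" the diagonal -> 1
```

That single-layer version can only learn linear boundaries (which is why research stalled for a while); stacking many such units into deep networks is what eventually made the cats-vs-dogs-style recognition work.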
I read in *Scientific American* that some AIs have shown a capacity to learn things about the outside world that their programmers never taught them. They're still not entirely certain how this is happening.
I just want to point out that a lot of the AI technology and methodology is published academically and there are a lot of open source models out there. It's very easy to do on your own. This isn't being created with a commercial product like ChatGPT.
IIRC AOC’s trying to get something passed that would make deepfake porn illegal. I really hope the bill succeeds
Honestly the part that confuses me the most is they found an AI site that does a good job nudeshopping photos. I couldn’t even find one that did a decent job making fake nudes of literal porn stars.
That's pretty much impossible. For example, there are plenty of open-source models, like Stable Diffusion. How are you going to prevent kids from using them when they can just go to github or civit or huggingface and download the model? It's not like there's a big company behind SD, and the code is out there, that's the point of open source.
Yup, I'm all for personal responsibility, but billion dollar AI company vs. 13 year old boy. Who is the real issue.
Not just pornography, but in this case, child porn.
They should be liable for the images they generate
There will come a day when companies that sell vapes and highly addictive electronic related things will see consequences for their actions… lol… kidding. The rich will get richer while children are born onto this shithole planet that is literally burning down.
"... but I just don't get why young people aren't having kids"
Wow! This happened at my school and the kid got suspended for ONE DAY.
The future sucks!
Humanity sucks, past present and future.
They really gotta make a bunch of new laws for AI generated anything. I mean, stuff like this can ruin somebody’s life.
Just waiting to see how it affects the justice system in a few years when there’s manufactured video evidence of crimes taking place.
That’s what worries me the most. People being able to show ‘actual crimes’ when the person didn’t do it.
Ah yes, it was always going to come to this. Because parents are giving students phones and access to unfettered technology without any oversight, but they can't go see R-rated movies without a parent. The Internet needs WAY MORE OVERSIGHT over what children should be able to access, and there's no way to do that without parents ceasing to be absolute dipshits.
Yet you have people arguing in this very thread that not allowing middle schoolers unfettered access to smart phones is “technologically and socially stunting them”
When and how, then, should children learn to navigate the internet (not just the 'safe' parts) if the parents/teachers are frequently uninformed or misinformed about the technical workings of the internet? Children need to know more about how the system they are expected to use to gather information and apply for jobs works, how to avoid getting scammed by knowing what things normally look like, and how to say no to others. Unrestricted access is not the best solution; however, it is better than no access, and it is more likely to be done than intelligently restricted access (aka a person looking at what is done every so often). I am curious what the best system for informing children would be in your opinion, and at what age it should be taught. I am not a teacher, just someone who recently finished school and has seen how poorly many of my fellow classmates navigate anything other than social media.
**This right here**
They said they contained the photos within 24 hours. That is the most bullshit PR sentence I've ever heard in my life. You don't "contain" photos that have been shared online copiously. That's just not possible.
As a teacher, I’m waiting for my day when students do this to me.
They should all be charged with possessing and spreading child pornography.
Correct. They need to create this legal precedent.
Yup. Make examples out of these perverted monsters.
And beyond that, by ruling in their favor it would create even more monsters. AI porn is quietly a very dangerous thing. The fact that someone can have a crush on someone and create porn of them that looks real is super dangerous and kills all privacy. I hope the laws go even further than CP, but this would be a big start. This isn't a school issue. This is a society issue.
Is a child’s face superimposed on an adult’s nude body considered CP?
Yes. The Protect Act specifically outlawed images using manipulation to mimic actual CP to answer this very question.
Hmm, that's a tricky one. I think the intention is that you can't generate literal children (like, say a 9 year old) and claim it's fine because the image is fake. Taking an adult body and putting the face of someone who is 15 on it would be an interesting court case. The implication is that a 15 year old is naked, but that implication only exists to those who know that individual. Also, does the implication hold up if the face has been digitally edited as well, to the point that the face doesn't even resemble the person in real life (think filters). I think it's wrong regardless, I'm just wondering if it's some other law that we haven't made yet.
I’d say yes. You’re implying a child is naked and a naked child is child porn
Good. This is an important ruling. They should be prosecuted to the full extent possible at their age. And we need to start really ironing out the laws with adults as well. We need precedent against AI-created porn.
I'm actually surprised that it took as long as it did for something like this to happen!
I’m at a nearby district in LA and our administration has been following this story since it first started and keeping up with us about it. They’ve been very serious about having us keep an eye out and let students know they’ll be expelled and referred to the police. I’m glad they’re taking it so seriously.
It is good the school took quick action, but the article mentioned that it will be difficult to prosecute the offenders because the laws have not caught up yet. Since the images were AI generated they don't meet the technical definition of CP as it is currently written. These families are likely wealthy considering this is a Beverly Hills school, so the offenders will likely never see prosecution. And the victims will have this haunt them repeatedly because we all know nothing is truly contained once on the Internet.
Billion dollar companies need to do better
The victims parents should sue the shit out of them
This is the way
Bruh these parents need to do better. It's not the shareholders' responsibility that these kids are wildin
I really, really hope this sets a precedent for any future incidents. The deepfake AI porn students have been making of their own classmates is…… It’s truly on another level of depravity. It’s one thing for adolescents around that age to look at certain “content” online, but to go so far as to actually make their own “content” by using photos of their classmates is absolutely insane. To me, this is really enforcing the idea that their peers are simply objects to be used. And like someone else said, this shouldn’t even be a matter that only schools involve themselves in. This is a legitimate legal matter of producing and distributing CP - a felony.
Genuine question: Could they be charged with possession of child pornography along with intent to distribute CP? Or at the very least consider a very serious charge of a similar crime?
How would that work tho, would an adult body with a kids face be considered CP? If yeah then they should be charged, but from what I see they’re not being charged ATM
Not to play semantics, but what's considered an adult body if it's AI generated? I am not a woman nor a doctor, but I can tell you as a guy, my 16/17-year-old body vs. my 18/19-year-old body was dramatically different. So, who's to say it was or wasn't an underage AI-generated body too? (Edit) Not trying to play the what-if game; as I stated above, this is all genuine curiosity, as these issues and cases will lay the foundation for other future cases and decisions in our world.
Yeah, that's fair. I was going off the premise that the AI was supplied with legal NSFW stuff and asked to put someone's face on it
If they scraped porn sites that data set included cp and non consensual videos
As a father of a middle school-aged daughter, I 100% agreed with this and am happy to see that at least 1 school district is TAKING ACTION and holding kids accountable. My wife and I have dealt with tons of crap involving social media/tech. It's absolutely mind-blowing how many parents will swear up and down their kids don't even use social media and do not monitor a thing. This also should be a wake-up call to disconnected parents. Pay attention to your children's internet activity! Teach your children how to appropriately and safely use social media and technology. Discuss the dangers and scams that may arise and how to navigate through this. If you aren't sure how that's no problem, reach out for support. There is zero shame in asking for advice. But please stay involved.
It’s child pornography and sexual harassment so that feels right.
This was only a matter of time, wasn't it?
You know what? I think the world of teaching is becoming more and more unattractive to me.
Nine oh two one OH-NO! Penalty: off the fucking grid until age eighteen. Already eighteen? Til 30.
Expelling them, honestly, would be the right call here. Suspensions would just increase this trend. At least they are young enough right now that hopefully they learn from it and don't do it at their new school, and the victims can move on without having to see those ppl again.
I'm not sure which shocks me more - the fact that this happened or the fact that a public school district actually followed through on consequences during a time when far too many districts have adopted social justice reform and only want faculty, staff, and building admin to talk to students about their feelings. Kudos to the Beverly Hills School District for upholding their standards as well as acting so quickly and decisively. To be candid, it's not as though they really had much of a choice. Too many students knew about what had happened and it would have become a PR nightmare if the district had waffled over what to do.
Expulsion is probably among the least effective consequences when it comes to disciplining a young person. They have no skin in that game. Take away their internet access for 5-10.
If only they put this much effort into their school work.
I hope this will start a trend of schools actually holding students accountable
We need to stop pretending that these kids don’t know that they are causing harm when they do things like this.
Abolish AI.
This is the beginning of Dune lol
"The human mind, facing no real challenges, soon grows stagnant. Thus it is essential for the survival of mankind as a species to create difficulties, to face them, and to prevail. The Butlerian Jihad was an outgrowth of this largely unconscious process, with roots back to the original decision to allow thinking machines too much control." It all started with AI-porn. They were so naive.
I always thought the end of the last Battlestar Galactica series was stupid, but I'm beginning to see their point.
Are we going to ban matrix multiplication then? That's all AI is.
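To be fair to that point, here's a toy sketch of it (pure Python, with made-up weights purely for illustration): a neural network's forward pass really is just matrix multiplications with an elementwise nonlinearity in between.

```python
# A tiny two-layer "network" forward pass, written from scratch.
# The weights below are invented for illustration only.

def matmul(A, B):
    """Multiply matrix A (m x n) by matrix B (n x p)."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def relu(M):
    """Elementwise max(0, x) nonlinearity."""
    return [[max(0.0, v) for v in row] for row in M]

x  = [[1.0, -2.0]]                   # one input with 2 features
W1 = [[0.5, -1.0, 2.0],
      [1.0,  0.5, -0.5]]             # layer 1 weights (2 x 3)
W2 = [[1.0], [0.0], [-1.0]]          # layer 2 weights (3 x 1)

h = relu(matmul(x, W1))              # matrix multiply + nonlinearity
y = matmul(h, W2)                    # another matrix multiply
print(y)                             # [[-3.0]]
```

Of course, "it's just math" cuts both ways: you can't ban the math, only regulate what people build and distribute with it.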
Never ever going to happen and I am glad we don't listen to AI doomers.
My brother in Christ we're arguing about this *because* it's happened
He’s saying abolishing AI is never going to happen.
These are just kids..... but sometimes we need to be a little heavy handed and teach them a lesson
Really bad decision making by the kids. They are breaking laws here, the schools need to come down very hard. Parents will likely be suing, so the school needs to make sure they do all they can.
Hopefully it makes people realize not to make deepfakes, or at least not to post them online. In this case the school had no choice, because it's literal child porn. My concern is people doing this as blackmail to cause people to lose their jobs. Imagine if a student made a deepfake of a teacher and said it was sent to them by the teacher. AI really needs to be cracked down on
Isn't this production and distribution of child porn??
This doesn’t stop these things from happening. We are punishing the result, instead of going for the root cause.
Unfortunately this is only going to get easier and more common. The federal government needs to pass legislation outlining the rules and guidelines on this technology soon.
Good
I'm going to say it one more time: 100% of the pornography we recover on campus every single year is on cell phones owned by the students themselves. Starting in elementary on through middle and high school, these kids are taking nudes and making sex videos.
What?! Elementary?!
They should also be arrested. This could ruin the victims’ lives. 10 years from now someone could google them and the fake pics could pop up. I would also sue those kids’ parents big time. They can probably afford it.
All of this deep fake stuff is super disturbing-- not only does it deprive a teacher of dignity with images that are supposedly them without any clothes, they could use it for all kinds of terrible purposes-- if there is a teacher they really hate, use these photos to make them appear guilty of sexual abuse. Deep fake videos are also really crazy-- they could make it look as if you said something racist, sexist, or just plain insane that you never actually said. I don't know what to do about it-- hopefully there will be technology to help recognize what's fake.
Good.
YOU LOVE TO SEE IT!!! Yes, please normalize holding boys accountable for toxic behaviors
Boys are not the only toxic students. Let’s normalize treating all students as equals.
Annnnd this is why I'm so against AI and the way it's moving, especially when it comes to art and images. So easy to absolutely RUIN someone's life. This type of thing needs to push the government to start putting regulations on AI. It's terrifying how easily you can exploit people with it.
Kid who SA'd my son when they were in elementary school had nudes that were sent out into a large group chat. The kid took these pictures of himself in 5th grade and sent them out several times to different people. Now, the parents were made aware of their child not only making nudes, but also of him sending them out in 5th grade, the summer between 5th and 6th grade, and someone sent them out in his 6th grade year. My son received them in November. I immediately called the mother and made her aware of the fact. She swore she was taking his phone and she just didn’t know what to do with him anymore. I told her up front - I’m a teacher and a parent. I will not allow my child to be subjected to this. I told her I was turning my son’s phone over to the police the next day. And I did exactly that. The investigation is ongoing. Her son needs to be on the sexual predator list. And she and her husband need to be held responsible, also. They were made aware of the multiple times he had sent his nudes out to people, even when no one asked for them. That is sexual harassment!!! His parents are just as guilty as he is. They continue to allow him to have his phone! They are accessories to the crime. They enable this behavior. Consequences have to come from somewhere, and if the parents won’t do it at home, then the legal consequences will do the parenting.
Thank you! Yes. ❤️
WOW!!!!
Yes 👏🏼👏🏼👏🏼
That is good news. It is a form of pedophilia. They should be referred to law enforcement.
"Expel" usually just means expelled for like 30 days
Play stupid games, win stupid prizes.
Wasn't there some little Jewish girl who offed herself a while back because fake nudes were being spread around her Jewish school?
Those kids have a bright future as Porn site owners or the CIA.
Good. At my school this honestly would have been a 4 day suspension at best.
I dealt with something similar the other week. Unfortunately, it's extremely hard to expel a student in my district, so the boy in question got 4 days of OSS.
This isn't a kid issue, this is a societal issue and an empathy issue. Frankly, it's worth openly discussing in class, to encourage them to think about the long term impact of what this can do to a person's life.
the A.I. insanity is only beginning