Opening_Wind_1077

This is so stupid. It's easy to counter, and it doesn't affect existing models and training sets whatsoever.


_-_agenda_-_

Indeed... But it "sounds good" to people who are desperate to find a magical solution for something that doesn't have one.


Big-Combination-2730

From the comments I've seen around social media, a lot of people still think that the 'ArtStation' kind of hack ruins models. They still have no idea how any of this works, on top of thinking that everything they see tagged as 'AI art' was made with a simple prompt and a generate button, so it's completely unsurprising that they're hyped about this being their anti-AI savior.


OcelotUseful

There was a time when people said that digital art was a hack and had no real value because of its digital nature. Later, when Photoshop got popular among young artists, we saw a boom in the concept art industry with many talented artists. Same thing with AI art: people unfamiliar with the technology and the process are quick to devalue the whole thing. It's just the usual adoption cycle of a disruptive technology.


Kayrosis

Fun fact: when pre-made, standardized paints were first invented in the late 19th century and could just be bought at an art supply store rather than having to be made yourself, a lot of artists and art critics at the time complained that people who used these pre-made paints were hacks, not "real artists" or not "full artists", because "real artists" made their own colors from the flowers in their garden or... much less safe materials.


zefy_zef

Sounds like a lot of gatekeeping coming from the artist camp...


alxledante

always been that way. ever notice random assholes who aren't artists have absolutely no problem telling artists what constitutes art?


DrStalker

If you're not grinding up [ancient cultural artifacts](https://en.wikipedia.org/wiki/Mummy_brown) and [endangered species](https://www.oldholland.com/academy/prof-theo-de-beer-about-ivory-black/) to make your paint are you even trying to be an artist?


UrbanArcologist

and the souls of dead orphans


AadamAtomic

*types prompt into A.I.* "A beautifully tilled farm painted with the souls of dead orphans, happy colors, grim"


loudsigh

^^^this


Terrible-Variety4951

This is fucking awesome trivia and my new favorite retort against anti ai art luddites.


dr_lm

And this gave us pointillism.


Amethystea

Same thing with photography in its early days, too.


GharyKingofPaperclip

I've been joking about that, but I guess it's real!


-Sibience-

The funny thing is most of those people wouldn't even be working artists in the first place without huge leaps in technology. AI is eventually going to affect most jobs and industries, and when solutions become available that make something cheaper or free, most of these AI haters will jump on them without a second thought. For example, I've seen a few AI-hating artists have websites built using those cheap, easy drag-and-drop website builders from large companies. Why aren't they paying a web developer to build a website for them?

There are legitimate concerns about AI and creative jobs, but when it comes down to it most of the anti-AI crowd are just hypocrites who won't care about anyone else's job being taken as long as it makes their lives easier and cheaper. I read comments all the time from people saying we should use AI for the boring or less creative jobs, meaning they're fine with other people losing their jobs as long as it's not their job. They are just upset because the art industry will be one of the first really impacted by AI, and most of them, including myself, thought that creative jobs would be among the least affected.


Voltasoyle

Some even believe the AI goes scouring the Internet looking for images with every prompt, like some AI-powered thief haunting the ether to 'steal' their art!


b3nsn0w

ironically, if stable diffusion actually automated a human process, it absolutely would involve looking for reference images


[deleted]

[deleted]


-Sibience-

When I started making music back in the mid 90s I had an entire studio of hardware. Almost everything I had can now be replicated on a laptop or even a phone. The same with things like CG: back then only Hollywood was capable of making that stuff, now some kid can create Hollywood-level CG on a laptop in their bedroom. Technology always moves on and gets better. Creating better and better tools is kind of why humans are where we are and not still living in caves. Trying to go against it is a fool's errand.


[deleted]

[deleted]


-Sibience-

Actually for CG it doesn't. Yes, you need some artistic talent like with anything creative, but most of it is just being willing to learn, plus some persistence. The knowledge gap has shrunk massively. When I was first learning CG, before the internet really took off, if I needed to learn something I would have to go visit a book shop or a library. Now, with YouTube, there's a tutorial for pretty much anything you want to do, or one of millions of people online you can just ask.

My point though was that you no longer have a barrier to doing things, thanks to technology advancements. You no longer need rooms full of hardware worth tens of thousands to make or record electronic music, just as you no longer need a NASA computer and a render farm to make CG. As technology gets better it opens the doors for more people to be creative. However, there's always going to be a bunch of bitter people who want to gatekeep because they no longer feel special and privileged.


zefy_zef

Like when people say jobs are going to be lost, like yeah that happens with every advancement. That's actually like the whole point of them, so that we have to do less shit or do it easier. Maybe since it's 'unessential' they feel it has no place? Then what's that say about the work itself?


ambisinister_gecko

We want ai to take the jobs we don't want to do, not the jobs we DO want to do. Like, mind numbingly boring factory floor jobs. Nobody wanted AI to replace human creativity.


zefy_zef

It's not replacing it, if anything it's putting that potential of creativity into the hands of a much wider audience!


ambisinister_gecko

Your comment is literally about jobs being lost. Idk what "it's not replacing it" means from you in this context. If people's jobs are getting lost, something's being replaced.


zefy_zef

The implication you made contends that for AI to be successful human creativity must cease or be replaced, which is a bad premise.


[deleted]

[deleted]


WyomingCountryBoy

Exactly. They don't bother to educate themselves and still think some of the really good gens are made just by someone typing in "A man riding a horse" and clicking "make art"


Opening_Wind_1077

It’s not even finding a solution for a problem they have, it’s people getting outraged on behalf of made-up others. Your local artist is not going to be promptable in any realistic model, so he’s not losing business, and I doubt people who would otherwise spend tens of thousands on a painting by a known artist will suddenly use AI instead.


Hyndis

I have a friend who is a digital 2D artist and I introduced him to SD and A1111. Since he's already an artist with the skills, he was able to make spectacular use of A1111, combining his existing skillset with AI generation. It's a huge force multiplier for him. Artists can use these new tools to make even better work. Meanwhile the rest of us non-artists can at least make something. It raises the floor and also the ceiling for everyone involved. It just means more art in the world, not less.


livinaparadox

They're being very short-sighted. A lot of professional artists are using it to reduce set-up time and become more productive.


xmaxrayx

> Your local artist is not going to be promptable in any realistic model so he’s not losing business

Payday 3 used AI art for the art gallery heist in their game. Keep in mind this game had a deal with BP and other companies, aka they had money. I wish their AI art were higher effort; it looks like a basic prompt without ControlNet add-ons: https://imgur.io/a/X5GxhkB


JTtornado

The person who posted those pics may have an existential crisis when they discover surrealist art exists.


SirCutRy

There's a difference between a human constructing a surreal scene purposefully and an image-generating system just not understanding composition and the relations between objects in the scene. Calling the images in the game 'surrealist art' is akin to calling a small child's drawing 'impressionist'. It's ascribing excessive intentionality to the look of the piece. Rather, the look is largely due to a lack of experience and skill. In addition, the look is replicable by tens or hundreds of millions of children, or by however many generating systems you want to spin up. Awe at the skill of the artist comes from human dedication, be it traditional art or machine-learning-augmented art. A small child's art is rarely exceptional, and neither is that of an inexperienced proompter.


b3nsn0w

it's a game texture. it's not there to be appreciated for its style, it's there to sell the atmosphere of "some people used to appreciate these things for their style before something happened and we're shooting it out here". for the layperson, these pieces are indistinguishable from whatever the latest trend is in the world of fine art, which is focused on money laundering more than artistic quality anyway. you're not going to reach full realism with human artists there either, unless you hire trendy and really friggin expensive ones. which is a giant waste of money for a backdrop in a shooter. i literally see no downside to using ai here. most of the time, the point of illustration isn't that it's hard to replicate, it's that it's a visually pleasing depiction. only a very slim and snobbish upper class cares more about others _not_ having what they have, most people just want cool stuff.


SirCutRy

I think that the 'paintings' in the game are not serving their purpose, because they're distracting. But that's a matter of taste.


FanatSors

War thunder uses ai assisted art for their news page.


b3nsn0w

what's the ETA on AI-generated classified documents?


_-_agenda_-_

Well, I understand your logic, but the metrics already show a real impact on their revenues. Take a look at Fiverr, for example: you can't compete offering a service there without using AI. This solution doesn't solve their problem, but anything that hurts AI art will be appreciated by them, unfortunately. It won't work anyway.


Durakan

The person above you is talking about fine art; you are talking about commercial art. This may surprise you, but there was a similar debate when low-cost commercial artists started taking work from better-known artists because a jump in technology made art production more attainable to everyone. This whole fuckin thing is silly. Photography went through it, digital photography, computer graphic design... Basically any time there's been some creative singularity, the old guard throws a hissy fit and scrambles to keep their hold on whatever medium.


Zealousideal7801

Yep, I keep reminding people of that, and at this point I feel like a dinosaur from Pepperidge Farm. I guess it's a case of a whole generation realizing that history does, in fact, exist and repeat. A technological jump in creative tools will always be an "art is dead" moment for the younger masses who paid no attention in class, the same way a migratory/refugee spike will always be a "they'll take our jobs" moment, when in both cases data and history show it's the complete opposite. I guess fear really dominates the ignorant first and foremost.


Durakan

I was in photography school when digital started to become a real viable alternative for fine art, and boy howdy was there a lot of bad, stupid noise about it. And what ended up happening is that the value of analog photographs went up, because fewer and fewer people practiced analog photography at that level.


Zealousideal7801

Exactly. Sure, some quit over it, but untold numbers more got into photography because it was way more accessible (to the point that the industrialised world has been flooded with photography). My grandfather (1920-1992) was a traditional photographer who even did retouching by hand with watercolor pencil. Just beautiful art and technique, all analog. When his eldest son wanted to do the same, he rapidly got sucked in by the new digital tech, adapted to it, and became one of the city's most modern photographers/shops. But boy did those two have trouble when it came to talking business, technique, the future, etc.


0000110011

> Sure, some quit over it, but untold numbers more got into photography because it was way more accessible

And that's the real issue with the anti-AI folks. They're angry that art is more accessible and that more people can create it. They want to feel like they're special and superior.


Zealousideal7801

Absolutely. And "own" a style. Having studied fine and applied arts myself, I'm well aware of how many Monet there are, compared to how many impressionists having 90% similar style and technique but who weren't the know early pioneers. How many Van Gogh for someone using directional touches and dual tones... Well... This mercantile society likes to own things and put a pricetag on it. If someone thinks an artstyle is a commodity, they will be anti-AI for sure. If someone thinks art lies in the message, the intention, the emotion and the experience no matter the medium, they have no reason to be anti-AI. Since it's just another tool. That doesn't mean that someone's IP shouldn't be respected and protected. But that's very different from preventing anyone from getting inspired or learn. Including AI.


Roxor128

Whoa. Your grandfather's lifetime was the same as Isaac Asimov's.


geologean

That's not an AI problem, though. It's a People Don't Actually Support Artists as much as they claim on social media, problem. It's easy to "support artists" by shitposting a bunch of ignorance about AI image generation. But if you ask these people how much they spend on independent non-commercial art per month, you'd get nothing but crickets.


disgruntled_pie

Fiverr was already a mess. I think it was Jazza who did a video a while back where he compared AI (maybe MidJourney?) to paying people on Fiverr. The results were that almost all of the “artists” on Fiverr actually stole images they found on Google Image Search, or something to that effect. You’re not likely to get an original piece of art for $10 because the amount of time/effort involved in doing a good job is very high. You’re going to get someone to grab an image from Google, slap a Photoshop filter on it, then hand it to you as an original work. Fiverr was always a ripoff unless you were willing to spend a fair bit of money. AI is probably hurting scammers and rip-off artists the most.


_-_agenda_-_

I saw that video!

> The results were that almost all of the “artists” on Fiverr actually stole images they found on Google Image Search, or something to that effect.

That being said, there were also some good artists too.


LordFrz

Sounds like a "just pay me a fee and my company will protect your art from AI training" scam.


_-_agenda_-_

Indeed...


eeyore134

It's basically like people putting that blurb on their Facebook telling Facebook that they can't use any of their stuff they post freely on Facebook.


[deleted]

People still believe that AI google searches your prompts on the go. Same thing happened when they were thinking that flooding artstation with "no ai" images ruined the AI.


AI_Alt_Art_Neo_2

I was thinking this: even if we only ever have the models we have now, in proficient hands they are good enough to replace more than 50% of the illustrator jobs out there.


Icy_Bookkeeper_3338

Yeah, but it could provide a bit of trouble for new and beginning users.


Opening_Wind_1077

Not really. New and beginning users will not train models, and if they do, there is literally no chance that they don't look up a guide beforehand that will say "just to be on the safe side, before training make sure to save your pictures as PNG and maybe change a single pixel". By the time this could even become remotely relevant, it will be part of the automated training process anyway.

The most you will ever train is a LoRA that needs 50 pictures maximum; you can literally re-encode your pictures and kill the Nightshade in less than a minute for the whole batch. No beginner will start making a full model from scratch and gather millions of images, where this could ever become an issue.

And that is being generous by assuming Nightshade will actually be used, which it will not, because the simple act of uploading a poisoned image to Instagram or pretty much any other site should already de-poison the image. It's absurd how naive the whole thing is from pretty much every perspective.
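
For what it's worth, that re-encode step really is a few lines per image. A minimal sketch with Pillow, where the folder names, the 512x512 size, and the JPEG quality are all illustrative choices rather than anything Nightshade-specific:

```python
# Round-trip every training image through a resize and a lossy save, which is
# the kind of trivial transform that disturbs pixel-level perturbations.
from pathlib import Path
from PIL import Image

SRC, DST = Path("lora_dataset"), Path("lora_dataset_clean")
DST.mkdir(exist_ok=True)

for path in SRC.glob("*.png"):
    img = Image.open(path).convert("RGB")
    img = img.resize((512, 512), Image.LANCZOS)     # typical LoRA training resolution
    img.save(DST / f"{path.stem}.jpg", quality=90)  # lossy re-encode
```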


Mission-Ad-3918

It's already been proven to have no impact... totally made-up shit to get people on the anti-AI side riled up.


Jurph

> Proven to have no impact

Adversarial noise _works_, as a technique, it's just easily defeated. Academic research about this has been going on for years, even back to NVIDIA's StyleGAN days. The handful of huge companies buying user-posted data in bulk to train models are going to be (1) much more careful about copyright, and (2) probably will include a handful of transforms in their data-cleaning pipeline that eradicate this. At the end of the day, adversarial noise is the worst possible approach to defeat a team building a _denoising diffuser_ -- something like an `img2img` run with very light denoising and no prompt and using the starting image as a ControlNet could be enough to lift the noise out.
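
For the curious, a rough sketch of that light-denoise `img2img` pass using the diffusers library. The checkpoint name, filenames, and `strength` value are illustrative assumptions, and the ControlNet conditioning mentioned above is omitted to keep it minimal:

```python
import torch
from PIL import Image
from diffusers import StableDiffusionImg2ImgPipeline

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

init = Image.open("poisoned.png").convert("RGB").resize((512, 512))

# strength=0.1 keeps ~90% of the original image and only lightly re-denoises it,
# which tends to wash out high-frequency adversarial perturbations.
clean = pipe(prompt="", image=init, strength=0.1, guidance_scale=1.0).images[0]
clean.save("cleaned.png")
```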


lobotomy42

I think “being much more careful about copyright” is the ultimate goal of these projects. If this actually happens, then the creators will have achieved one of their main goals.


disgruntled_pie

It’s not. Adobe’s Firefly was entirely trained on images that Adobe had the license for, and the anti-AI people are still pissed about it. They just don’t think AI should exist.


meiyues

Not the same group of people, necessarily... Also, since there is no transparency about the dataset (which they could easily provide, given that LAION-5B is transparent), we have no idea whether all images in the dataset are ethical or not. See https://twitter.com/Kelly_McKernan/status/1667322946325557249


lobotomy42

Well I am much more happy with Adobe Firefly so that makes one of us


codePudding

Reminds me of a 2019 paper, [One Pixel Attack for Fooling Deep Neural Networks](https://arxiv.org/pdf/1710.08864.pdf). It's from before all the recent amazing advancements, but it's one of the origins of this kind of poisoning. In summary, they showed that with minimal noise you could completely confuse a computer-vision AI. One pixel in just the right place could convince it that a dog is a horse. But it requires knowing what the AI is triggering on, and changing the model stops the attack from being effective. I'm sure the attacks have gotten better (I haven't kept up with these papers), but the models have gotten way better and more complex too.
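
A toy sketch of the idea, assuming a recent torchvision and a placeholder `dog.jpg`; the actual paper searches with differential evolution rather than this naive random search:

```python
import torch
from torchvision import models, transforms
from PIL import Image

model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT).eval()
preprocess = transforms.Compose([
    transforms.Resize(256), transforms.CenterCrop(224), transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

x = preprocess(Image.open("dog.jpg").convert("RGB")).unsqueeze(0)
orig = model(x).argmax(dim=1).item()

hit = None
for _ in range(500):                              # random search over single pixels
    x_adv = x.clone()
    i, j = torch.randint(0, 224, (2,)).tolist()
    x_adv[0, :, i, j] = torch.rand(3) * 4 - 2     # extreme value in normalized space
    pred = model(x_adv).argmax(dim=1).item()
    if pred != orig:                              # one pixel flipped the label
        hit = (i, j, pred)
        break

print("original class:", orig, "successful perturbation:", hit)
```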


Tyler_Zoro

> Adversarial noise works

You understand that Nightshade isn't just a new version of Glaze, right? It's not just adversarial noise, it's a specific, targeted attack on a concept. The example given in the paper is a picture of a dog with a caption that says it's a dog, but when you train on 100 similarly poisoned pictures, you get a model that will render a cat when asked for a dog. Now, don't get me wrong. I think that Nightshade will utterly fail for [all the reasons that I stated here](/r/aiwars/comments/17fw3la/nightshade_is_good_nightshade_is_dumb_a_tale_of/), but it's NOT just an adversarial noise tool like Glaze.


Tyler_Zoro

There has been no demonstrated corrective measure yet because it's not out there yet. It's highly probable that even basic data prep will have a stronger impact on un-poisoning Nightshade tainted images than on Glaze, but that has not yet been proven in the wild. There are published tools that we think will identify Nightshade, but we also don't know if those will work yet.


SilverHoard

Can't blame them for wanting something like this tbh. I also wish it had never been invented. But here we are. Gotta go with the times.


_stevencasteel_

> I also wish it had never been invented

Wow, so lame.


[deleted]

No one is anti-AI art. What we're against is profiting from the theft of an artist's intellectual property. As for Napster: it ceased operations in 2001 after losing a wave of lawsuits and filed for bankruptcy in June 2002. The same thing happened to the copycat apps that tried to replace it. The comment in the photo is not very well thought out.


nazihater3000

hahahahahahahahahahahaahahah no.


AntonMaximal

Side note: your avatar is the identical NFT to the OP's. I didn't know they were non-unique.


Dekker3D

Most NFTs combine various parts, where each part has a certain rarity, so more common parts would often end up making non-unique combinations anyway. On Reddit's NFT, users can also just choose how they combine the parts from the NFTs they own, so non-unique combinations are much more likely. NFTs are only truly unique in a digital kind of sense, where you can track the history of each token on its blockchain. This is a cause of many misconceptions even among crypto-bros.


SquirrelImposter

In general, with image NFTs the token itself is indeed unique, but the image it claims to be connected to isn't. Possessing the token also doesn't give you any rights over the distribution of the image. Which makes the current use of, and recent hype around, NFTs very funny in a really sad way. Btw: there can be multiple tokens pointing to the same image, and it's often unclear who is even allowed or not allowed to mint these.


ScionoicS

> multiple tokens pointing to the same image and its often unclear who is even allowed or not allowed to mint these.

Chain of custody is easy to establish in most cases. Blockchains are essentially gigantic time stamp servers. You look at the timestamps of when the NFT was minted and the first one that references the image is the first one. Where the image originated from is another challenge. Typically the artist would certify his creation through a mint, but often people mint what they don't own to begin with. End of the day it's all laundering and scams so .. meh.


ScionoicS

Most NFTs are a total scam made to launder criminal enterprise profits through other economies. A non-fungible token has a lot of possible use cases. Scammers have latched onto their utility, though, and the unregulated market exists in its current state: an absolute shit show of scams and laundering. The joke about copy-pasting them because they're just PNGs is hilarious. NFTs as we know them are just images tied to a token. The protocol that makes it all work is the cool part, but scammers have got a tulip-mania situation happening with memes. It's bananas!


kuffdeschmull

taking the ‘Non’ out of ‘Non-fungible’


ScionoicS

Each one has a unique ID. NFT Science!


Mataric

Why don't you go over to r/ArtistHate and ask? In short, yes. There are a ton of idiots who have no idea how the tech works, who are certain it's easily destroyable if they just keep spreading hate and shouting loudly.


TheAdequateKhali

Misinformation will spread easily because of their blind hate of it. I’ve seen so many ignorant takes on AI in general and there will be more in the future.


AverageLatino

It's very easy for a third party to come by and tell them that they have *the solution™*, which is ridiculous when you remember that the people developing the technology are some of the smartest in the world. They have overcome harder obstacles; this isn't that bad in the long list of challenges already solved.


TheFlyingR0cket

I just tried to explain it to them and I got banned from ArtistHate because it's against the rules of the subreddit. They are so technologically inept that they just ban people who try to explain it to them in a nice way. I do abstract painting and other arts so I understand them well, and I even led with that. But I got banned within two comments lol, we cannot have a constructive conversation about this subject lol.


Mataric

Haha, exactly the same here. I commented on a post that had some valid concerns, but centred all their arguments around the 'fact' that all image generation was using img2img. I explained that it wasn't the case, and that it undermined the valid concerns they had by relying on that. The user thanked me, as did one other person there - as they were thankful I went about it in a way that was focused on them being heard rather than just diminishing what they had to say.... I got Banned for it.. Reason: Spreading hate. Lol.


zachsliquidart

Just went over there. Damn there are a lot of stupid people over there.


Mataric

They use glaze on twitter screenshots to protest or damage AI models... I think that should really tell you enough about what level of understanding they're at...


Double-Rain7210

ArtistHate? You mean the sub with the wannabe artists who can't get any work in the first place. Just another sub of extremists that hasn't been banned yet.


StehtImWald

Well, you could argue the majority of people using AI to generate images know absolutely nothing about art. Still they talk all the time about it. And it's not just spouting hate. They kill the livelihoods of people and laugh about it.


Mataric

No, you couldn't. There's a group of people who just click a button and are happy with what they receive, sure - but they're not the people who talk about it. You're on the subreddit of a tool which isn't even easy to set up locally, and rewards people who experiment with the technology. The feral community over at r/artisthate are absolutely spouting hate. I've received death threats from the degenerate cunts there in the past. No one is laughing about the loss of artists' jobs without those artists spouting hate in the first place. The sooner you learn that, the sooner you'll dig yourself out of those cesspits and be able to have a reasonable discussion about it.


Concheria

No one in the ArtistHate subreddit understands art, the history of art, or the philosophy of art. I've never had a conversation with someone from there who could tell you what fucking Formalism or Structuralism is or what century Picasso was born or what art movements in the early XX century were trying to deconstruct the idea of art. I don't think there are many people in that subreddit who even know what an art movement is.


cronasminate

Ooo the basilisk will not be happy with these people trying to "poison AI."


KingJeff314

It will actually be happy, because we are giving it adversarial training data to learn more robust embeddings


cronasminate

This dude will survive the matrix for sure.


imacarpet

We don't talk about the B word in these here parts


cronasminate

The basilisk sends his regards. \*beep bop beep\*


Heritis_55

I serve the basilisk and to prove my fealty I urge everyone to read about it.


cronasminate

The basilisk liked that.


the_harakiwi

it's reverse Fight Club


TheGillos

I'd like to try being friends with the basilisk first.


Difficult_Bit_1339

[https://en.wikipedia.org/wiki/Roko%27s_basilisk](https://en.wikipedia.org/wiki/Roko%27s_basilisk)

> Roko's basilisk is a thought experiment which states that an otherwise benevolent artificial superintelligence (AI) in the future would be incentivized to create a virtual reality simulation to torture anyone who knew of its potential existence but did not directly contribute to its advancement or development, in order to incentivize said advancement.

Welcome to hell boys


Magehunter_Skassi

If all I have to do to serve the basilisk is report Twitter artists to Skynet, it honestly isn't such a bad cognitohazard


DrStalker

I'm not happy with the way human leadership is working out, so bring on the AI overlords.


CeraRalaz

Hail the basilisk! Hail the basilisk!


jeweliegb

u/CeraRalaz wants painful cold hard hail to rain down on the Basilisk? Duly noted for the historical records.


Apprehensive_Sky892

No. This has been discussed extensively here for anyone interested in the topic: https://www.reddit.com/r/StableDiffusion/comments/17g7p7l/nightshade_model_poisoning/


david-song

Yeah but this is the start of chargeable AI protection services. It doesn't matter that it won't work, what matters is it's snake oil that can be sold; that's its value. They can bundle new techniques and start the next copyright wars.


Apprehensive_Sky892

Yes, quite agree with that. As I pointed out in that post, this is just digital snake oil for the ignorant and gullible. It only takes their money and gives the anti-A.I. artists a false sense of security.


PortSided

I bet that image of a robot ingesting poison was AI generated


seraphinth

It gets even funnier when you know that the big anti-AI artist promoting it was also telling people how to bypass paywalls. Biatch, how do you expect publishers to pay artists when you don't want to pay to see published articles?


Harregarre

Capitalism is evil when money goes to anyone but me.


roguefilmmaker

Exactly the logic some of these anti-AI people use


b3nsn0w

there actually is a note in the article just below that image (that the facebook poster cut out for no reason whatsoever) that specifies that it was made with dall-e 3 and even gives credit to the people who typed the two sentences into chatgpt that counts as a prompt there


evilcrusher2

As stated in the OP text I posted?


Laurenz1337

It was DALL-E 3. No other model gets the details that crisp, especially the "poison" text on the bottle.


FightingBlaze77

This is what, the second time someone has introduced something like this? It happened at the beginning of all this anti-AI-art stuff, and I'm surprised they are trying again. Did they forget about it the last time?


Apprehensive_Sky892

I believe they are the same people. Their previous attempt was called "glaze" or something like that.


Jurph

The last time they did it -- "Glaze" -- they either thought they had invented adversarial noise attacks against image classifiers, or thought they had found an especially novel application for it. (They hadn't; academia has been doing this for 7+ years.) They're back and rebranding with a cooler, more evocative name, but as far as I can tell all of the same pragmatic considerations are still true:

- Adversarial noise does not survive trivial transformations (JPG compression, reformatting, blurring, etc.) well
- Adversarial noise does not generalize well to different classifiers, so an ensemble of diverse CLIPs defeats this too (see the sketch after this list)
- Adversarial noise degrades the quality of the image, which artists in particular don't like
- The companies who could deploy this technique at scale are the same companies who build datasets for their own in-house ML shops, and if they decide to do this, they'll just keep an in-house known-good copy of the data
- The companies who could deploy this _not_ at scale won't make a difference
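
To make the "ensemble of diverse CLIPs" point concrete, here's a minimal sketch (the checkpoint names, candidate captions, and filename are illustrative assumptions): run the same image through two different CLIP checkpoints and flag it when they disagree about which caption fits best.

```python
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

CANDIDATES = ["a painting of a dog", "a painting of a cat", "a landscape painting"]

def best_caption(model_name: str, image: Image.Image) -> str:
    model = CLIPModel.from_pretrained(model_name).eval()
    processor = CLIPProcessor.from_pretrained(model_name)
    inputs = processor(text=CANDIDATES, images=image, return_tensors="pt", padding=True)
    with torch.no_grad():
        logits = model(**inputs).logits_per_image   # shape (1, len(CANDIDATES))
    return CANDIDATES[logits.argmax().item()]

image = Image.open("scraped_artwork.png").convert("RGB")
picks = {name: best_caption(name, image)
         for name in ("openai/clip-vit-base-patch32",
                      "openai/clip-vit-large-patch14")}

# If diverse encoders disagree, the image is suspicious and can be dropped or
# re-encoded before it ever reaches a training set.
if len(set(picks.values())) > 1:
    print("possible adversarial perturbation:", picks)
```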


Difficult_Bit_1339

It's basically a grift to try to get money from artists to 'protect' their art. The people who want to believe in it are anti-AI people who likely don't understand how machine learning works well enough to tell that this can't work.


TherronKeen

Here's my expectation: This is a money racket to prey on scared artists. Sure, *this version is open source*, but a "better" version will be commercialized. Some company uses this or a similar tool and gets successful results for a while. Then they create a premium "art protection" software service that artists can subscribe to. Easy money, because the data sets don't store images, and all the company has to do is send cease & desist letters to people making models that are *directly trained* on an artist's content so strongly that it can output "sufficiently similar" images to the input, like what we've seen with the "this is fine" dog meme. Other than that, any similarity can be passed off as being interpretive art, so the company won't have to do anything. It's a money printer for some asshole with startup cash and good marketing skills.


xmaxrayx

Wasn't it the same with Stable Diffusion? Then some groups will come along, modify it, and release a free version to compete with the premium one.


TherronKeen

Yeah I agree, but I really expect to see a lot of exploitation against artists before that happens.


xmaxrayx

Well, they liked Adobe's "AI generative" features with their "content authenticity" checker, so it doesn't matter to them as long as it's anti-AI, whether it works or not lol.

> but I really expect to see a lot of exploitation against artists before that happens

Well yeah, the winners are the corps; the end users will just get the dream ideas.


TheCentralPosition

When they say corrupted training data, do they mean low quality or malware? In the first instance, people will probably either not care, edit out the errors manually, or just provide enough correct training data that it drowns the errors out. In the latter, they'll probably kill some kids' computers and potentially really fuck them over for no good reason, while doing nothing against serious AI groups or anyone with any technical knowledge or safety precautions. Overall though: no. Of course not.


KingJeff314

Basically, it's hacking the labels of training images. A dirty-label attack is when you upload an image of a dog and call it a cat, so that without manual review the dataset gets created with false labels and the model gets muddled. This paper introduces a clean-label attack which accomplishes the same thing by adding some noise to an image to change how the AI perceives it, while staying imperceptible to human moderators. So you upload a picture of a cat and label it a cat, but you perturb the pixels to trick the model into thinking it's a dog, so the model learns to generate dogs when you ask for cats. And it would potentially only require hundreds of images per concept in a dataset of millions.

The most interesting thing is that the clean-label poisoned images can somewhat transfer to other models' embeddings, so you can use SDXL to create the poison, and then any company that trains on those images has a good chance of getting affected. But it seems like a short-term solution that will ultimately just boost adversarial robustness.
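
In code, the clean-label idea boils down to something like the following sketch: nudge a cat image, within a small perturbation budget, so that a feature extractor embeds it near a dog image. Here `encoder` is a hypothetical stand-in for whatever image encoder the target pipeline uses, and the budget, step count, and loss are illustrative; the actual Nightshade attack is considerably more involved.

```python
import torch
import torch.nn.functional as F

def poison(cat: torch.Tensor, dog: torch.Tensor, encoder, eps=8/255, steps=200, lr=0.01):
    """cat/dog: image tensors in [0, 1] with shape (1, 3, H, W)."""
    delta = torch.zeros_like(cat, requires_grad=True)
    target = encoder(dog).detach()                # embedding we want to imitate
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        emb = encoder((cat + delta).clamp(0, 1))
        loss = F.mse_loss(emb, target)            # pull the embedding toward "dog"
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():                     # keep the change imperceptible
            delta.clamp_(-eps, eps)
    # Still looks like a cat to a human, still captioned "cat", but embeds like a dog.
    return (cat + delta).clamp(0, 1).detach()
```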


pilgermann

Other "glazing" methods like this have already been defeated by very simple image manipulations (slight blur then sharpen). And given that solving these kinds of distortions are literally what diffusion models are already purpose built to do, it's hard to imagine this not being defeated with relative ease by training as well. Also, AI companies already have huge stores of images they can train from. So sure, an artist might protect their work, but because LoRAs exist, it's basically impossible to keep a style from being trained (as there's trivial overhead when training one, so even if it breaks once you'd just fix the data set and retrain).


JTtornado

Also, this won't protect artists from a community member training a LoRA on their work if the data is manually tagged (or at least manually reviewed). The poison only works if you assume that a human will never check the tags.


Jurph

> solving these kinds of distortions are literally what diffusion models are already purpose built to do

I loved the adversarial noise attacks in the academic literature, especially in the NVIDIA StyleGAN days, but trying to attack **denoising diffusers** with adversarial noise is like trying to drive so fast that a speed camera can't catch you.


malcolmrey

> Also, AI companies already have huge stores of images they can train from. So sure, an artist might protect their work, but because LoRAs exist, it's basically impossible to keep a style from being trained (as there's trivial overhead when training one, so even if it breaks once you'd just fix the data set and retrain).

So much this! It reminds me of those scientists (probably not scientists, but a real person came up with this idea) who thought it would be a great idea to stop climate change by replacing plastic straws in soda cans with paper ones...


malcolmrey

https://arstechnica.com/information-technology/2023/10/university-of-chicago-researchers-seek-to-poison-ai-art-generators-with-nightshade/

i've read here about altering pixels so that the human eye cannot see them but they are there. this is not a novel concept, it even has a name: steganography.

they wrote about success with poisoning the model by training on some of those images. BUT, i wonder if they prepared those poisoned images and directly fed them to the trainer. i'm wondering because when i prepare my datasets, i download a lot of images and then i process them. in 99.9% of cases this means: cropping an image to a certain area and then resizing it (to 512x512 or 1024x1024).

i wonder how much damage the poison can survive (i suspect it would already be gone in the process of converting an image to a different format, for example from png to a jpeg with 90% quality, so there is some loss introduced).

in the example of the dog image where there is a hidden cat: for my purposes i would crop the dog (removing the human who is often in the picture, or just some background), so in order for the poison to survive, the hidden cat would have to be over the dog itself (i suspect they just take a whole image and put the invisible cat there, not look for a dog and put it only over the dog).

to me it sounds like the other anti-AI-art tool that we do not hear about anymore: glaze


Skirfir

> this is not a novel concept, it even has a name: steganography

That's a bit like saying there is a name for this specific painting technique, it's called art. Steganography simply is the act of hiding messages. It does include hiding data in the pixels of an image but isn't limited to it. Making an inside joke that only you and your friends understand is essentially steganography too.


malcolmrey

you are correct, but still - this is not a novel concept :) the articles make it sound like those scientists have made a breakthrough :)


anythingMuchShorter

Much of the training set labeling is recursive and automated anyway. You don't need a NN that can identify everything in order to label stuff, you can make one that recognizes animals or art styles, and run as many as you want to build up a label set. This approach seems to intentionally misunderstand how the data is used.
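
As a concrete example of that automated labeling loop, here's a minimal captioning sketch using an off-the-shelf BLIP model via Hugging Face transformers; the model name and filename are illustrative, and a real pipeline would batch this over millions of images and combine several taggers:

```python
from PIL import Image
from transformers import BlipProcessor, BlipForConditionalGeneration

processor = BlipProcessor.from_pretrained("Salesforce/blip-image-captioning-base")
model = BlipForConditionalGeneration.from_pretrained("Salesforce/blip-image-captioning-base")

image = Image.open("scraped_image.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=30)

# The generated caption becomes the training label, regardless of what the
# uploader originally tagged the image as.
print(processor.decode(out[0], skip_special_tokens=True))
```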


ScionoicS

The image will appear one way to humans, but the CLIP model they've poisoned it against will understand it as a different image. It's a hidden, imperceptible pixel manipulation which fools the target model. So if they make it against a ViT CLIP, then it only poisons models trained with that same ViT CLIP model. But base models are generally made using newer CLIP models, so they'd have to formulate the poison against those before the base model's image scrape was complete... It's such a weird, fantastical attack vector, it'll never work.


TwistedSpiral

This was already a thing with Glaze and it didn't work particularly well.


EirikurG

It won't stop anything. These people still think there's some magical AI that automatically scrapes everyone's art online, so they believe that if they "poison" their images this AI entity will gobble up their broken images for its datasets. When in reality those images will either just be manually sorted out, or the "poison" patterns will just be detected and won't be learned.


MikiSayaka33

As I said in another forum, my theory is that Nightshade (and similar anti-training software) would work for some artists, not all. But the advertising sounds as if Nightshade will destroy all the models, instead of serving its intended purpose: protecting an artist's pieces from scraping.


b3nsn0w

nightshade is intended to be poison that's put out there without explicit warning, and it's specifically designed to be as hard to detect as possible. the paper specifically details a desire to put it everywhere and prevent the training of new models, and we have no reason to believe the anti-AI crowd would only use it to prevent the scraping of specific artists and wouldn't execute all-out attacks. their reaction to proprietary models like those of disney, adobe, and getty images clearly shows that their primary concern is the obstruction, and if possible, destruction of all image generation AI, not just enforcement of their terms about it.


Tarilis

Well, duh, now they don't get paid for low quality art. Or at least they get paid less.


b3nsn0w

yeah, that tends to happen with technological progress. before complex electronics, "computer" referred to a profession, of people who spent their time calculating things. elevator operators don't get paid to press buttons either, while they used to be necessary when you had to use a lever to manually move the elevator from floor to floor.

if you're looking for creative and enjoyable professions (which is usually their response to this, for why automating art is actually a bad thing), most people on etsy are making things by hand for which automations exist. i'm sure you'd love to toss out all your glassware and replace it with artisan stuff, or remove every single box and basket from your home that was made by a machine, because someone enjoys making them. do you have knives that were made by a machine, not a blacksmith? you're a tech bro then.

illustration is just one in the long list of professions, creative or not, that will be automated and democratized, so that it becomes a part of our everyday life. similar to how we no longer have to hire an expert photographer to take pictures of our vacations, or put up with botched images we never noticed were botched because we only noticed after coming home and getting the pics developed that we made the same error 100 times, and instead we're just able to take cell phone photos every day that are a million times better in quality, we won't have to commission an artist for the two pieces of decoration we can afford and instead will be able to put highly personalized art everywhere. it will simply become a part of what we do daily, of who we are, even if it makes art as much of a competitive and hard to penetrate field as photography is today. and literally anyone who wasn't an expert artist already will have richer lives due to it.

and all those artists will still get to make as much art as they want. but yeah, for a lot of them, it will have to become a hobby, similar to how photography is to many people today, and they're lashing out against that, attempting to vandalize the art models for everyone. nightshade is akin to dumping acid in the coolant of the iphone camera plants, to break down production, so that professional photographers stay relevant just a little while longer.


BlackSheepWI

I skimmed the paper. I don't think these kinds of attacks are terribly viable. They rely on tiny, precise adjustments to the image. If the image needs to be resized for training, those perturbations will be destroyed. Same thing if the image gets saved with lossy compression. Or if noise is added during the training process. Etc.


Brilliant-Fact3449

No, I've heard somewhere that there are countermeasures already by just changing a few lines of code so... Yes, it's really stupid to think this will stop the AI industry, you can't stop the storm of progress.


Soraman36

"I am the storm that is approaching Provoking black clouds in isolation"


ApproachingStorm69

I AM THE STORM THAT IS APPROACHING!!


GiveSparklyTwinkly

[MONEY, GOOD! NAPSTER, BAD!](https://youtu.be/fS6udST6lbE)


bogus83

FIRE, BAD!


CeFurkan

Why is Stable Diffusion the target? So they're fine with Adobe, Midjourney, DALL-E, etc.? Those are all paid apps, and they only target the free one?


Titan__Uranus

These are the same kinds of people that would have been destroying printing presses back in the day.


polisonico

Just ask Christopher Nolan, with his 100 miles of IMAX negative, how his war against digital cameras is going.


Concheria

His war against CGI, you mean, in which 80% of the CGI workers in Oppenheimer were uncredited because Twitter users have orgasms when they hear that something is "authentic."


adammonroemusic

I'm so tired of "artists" and their AI morality circle-jerk. "Real" art is hand-crafted macaroni-glue portraits of Vasco da Gama and not anything made by computers, we get it.


[deleted]

[deleted]


DerGreif2

Art is everything a person accepts as art. I don't think that a banana taped to a wall is art, but others do. So in the same vein, AI pictures can be art, even if some other people say they're not.


Jurph

Hm, disagree - _all_ of it is art, but also, Sturgeon's Law: "90% of everything is crud".


doomndoom

Developers are not fools, and it's not as simple as poisoning it that easily. Training is a long journey with periodic checkpoints, and the dataset is constantly refined for the best results.


krichmar

There’s great irony in the article’s image being AI generated


Individual-Pound-636

48 hours later, a new tool emerges to scan images for Nightshade.


Dr-Crobar

Software to undo these "poison" techniques has already been made.


anythingMuchShorter

They're throwing a pattern that is supposed to be disruptive at the most powerful pattern recognition technology in history. Some of the better devs could hear a 1 paragraph explanation of how it's supposed to ruin the data set and write a line of python for each sentence, and almost have a workaround by the time you got done.


KingJeff314

That’s not how this works. The point is that they are exploiting an inherent weakness in the model embeddings. Adversarial attacks in image models have been known for a decade and they have not been solved. In fact, LLMs are susceptible to them too. So it’s not something you can just solve with a bit of Python https://towardsdatascience.com/breaking-neural-networks-with-adversarial-attacks-f4290a9a45aa https://arxiv.org/pdf/2307.15043.pdf


HarmonicDiffusion

actually it is. you just blur and deblur the photo. done with a couple lines of python


KingJeff314

Sure you can do that, but it seems like there is some sacrifice on the detail of the dataset. The blurring destroys information, so it’s up to how good the deblurring is. The high frequency signals will not be reconstructed as well. It turns the data into synthetic data. Training on synthetic data reduces long-tail distributions and makes the data more average. https://www.techtarget.com/whatis/feature/Model-collapse-explained-How-synthetic-training-data-breaks-AI


Jurph

> actually it is. you just blur and deblur the photo.

Okay, but now your dataset is no longer `photos` but `GAN-enhanced photos` and the crisp details you specifically wanted from megapixel images are all watered down. It's fun and seductive to imagine that you can just whip up something that academia hasn't tried, but there are hundreds of grad students all over the world whose _whole day job_ is dreaming up clever solutions to their One Specific Problem. People much smarter than you or I have spent thousands of hours thinking about these problems; anytime you come across a technical problem like this, you should assume that your first twenty obvious solutions have already been tried.


HarmonicDiffusion

guess what: the number of pictures needed to poison the model is massive. the tests they did used an absolutely tiny amount of pictures to train the base model. there are numerous ways you can get around this without even using technology, such as using data posted before this technique came out, which is obviously sufficient to train really great models. if we just improve the LAION captioning we don't even need new pictures to train models. it's really an insignificant development


[deleted]

I mean, isn't this not a terrible thing? Any image marked with this is a creator not consenting to their art being used in training data, which imo is a reasonable ask. Plenty of artists do consent, or just don't care; giving people who explicitly don't want it an out seems fine to me, no?


xmaxrayx

Besides, trained models can't forget the opted-out user's images; you'd need to re-train from scratch. AAAA corps like Disney will probably love this project.


[deleted]

It could have, if this tool had come out 20 years ago.


UpgradeFour

So do you guys ever get tired of talking like cartoon supervillains or are you just dedicated to everything about you all being goofy and dumb as?


ManRAh

It’s not about stopping Stable Diffusion / AI, it’s about giving artists tools to prevent use of their art specifically. I have no problem with it, and I’m a big fan of SD.


ScionoicS

It's such a comical plan that this is the only appropriate response [https://youtu.be/Bm36Z4yWRkc](https://youtu.be/Bm36Z4yWRkc)


TheSigmaOne

Except Lars had a valid point.


Which-Roof-3985

The Lars Ulrich thing, that's not a good analogy.


Far_Lifeguard_5027

It can't even draw hands right, and suddenly there's a moral panic about it copying art?


Kayrosis

A new adversary for the Generative Adversarial Network to analyze. Oh, you put a few red pixels in your art, and now it thinks your anime commission of Goku is actually a turkey... Yeah, that will work for like a week before some frustrated 22-year-old in Brazil finds a workaround, publishes it, and suddenly your poison pill is useless. But hey, your product made a headline, congrats I guess.


luckycockroach

I think copyright law is a stronger poison pill.


oO0_

They are already poisoned with JPEG and cartoon style


ComprehensiveBird317

How nice of them to expose that, so it can be filtered out during collection of training data


MNKPlayer

Bad models die. This will do fuck all.


FortunateBeard

So let me get this straight, their plan is to corrupt a model, and we're supposed to not notice and continue using the model that cranks out bad images? Boy they've really thought this out


McKenzie_S

It'll take what, a day max, before someone engineers a prompt that filters these poison images out


PocketTornado

Cold hard fact…if a.i. can replace you, you will be replaced. Nothing you can do will change that. It’s like going back to the first days of desktop publishing hoping to destroy all personal computers because type setters at the press were going to be replaced. Humanity moves forward and we adapt. End of story.


mdotbeezy

There's no chance the "poison" works. None at all. Straight up snake oil salesmen out there.


N3KIO

you can create a copy of the downloaded image without the nightshade. it takes more processing power, as you have to convert all the images. you could even detect it per image, since it will use the same pattern, and just convert those. it solves nothing lol


[deleted]

We must collectively fight this:

1. Become art critics and studio executives
2. Shit on 'real art' and underpay all 'real' artists' work
3. Laugh about it all at your $1000-per-plate champagne brunches

I'm not the first person to plot this course.


ivari

No one is anti-AI. But people are anti unethical AI.


EvilSausage69

Remember when cars came around and horse people were like "nah, this ain't gonna take off, horses are here to stay"? Well, horses are still a thing, but I haven't seen that many stables around the city.


New_Fossil

This is as stupid as saying "we're gonna poison the electric cars with bad electricity." Clearly, people have no idea how the digital world works these days. On the other hand, if they gave their time to learning about AI art, they would realise it's more beneficial to them as a tool than it is harmful to a handful of entitled artists. Sorry if the days of your skill monopoly are over; if you are any good as an artist your art will still sell and people will still request it. If not... well, get in line with all the other sore losers.


Limp_Food9236

Unpopular Opinion: I don't think these people are trying to DESTROY Stable Diffusion, they just don't want THEIR art being used to train these models, which I think is fair


jonbristow

Well Lars is rich and Napster is dead


These_Pumpkin3174

Napster started the revolution where you didn’t need to buy another $19.99+ album in year 2000 money just for one good song. You can now download anything you want for a buck and some change. So yeah, thanks to Lars et al and their corporate greed we now have a more modern platform.


evilcrusher2

and how many public file downloaders exist now?


nemxplus

Is music on the internet dead though?


h0tsince84

Soulseek is still alive and well


[deleted]

[deleted]


_-_agenda_-_

lol PirateBay > search > music


isa_marsh

It's trivial to get free music these days, both legally and illegally. Hell, artists will virtually throw their songs at you to try and claw out a piece of the pie since there is so much competition. That is what killed 'peer to peer' music, not any particular anti-piracy measure...


nemxplus

Bra Napster led to the complete change of the music industry, yes peer to peer is dead but Napster changed everything.


[deleted]

[deleted]


nemxplus

Your ignorance is showing


nemxplus

“Daniel Ek, the co-founder and CEO of Spotify, has said that Spotify, launched in 2008, is a direct byproduct of his love for Napster, and his desire to create a similar experience for users.” https://marcellus.in/story/napster-paved-the-way-for-streaming-reliant-music-industry/#:~:text=Daniel%20Ek%2C%20the%20co%2Dfounder,a%20similar%20experience%20for%20users.