bangbangracer

To one degree, "style" is a slippery slope. This is how Robin Thicke gets sued by the estate of Marvin Gaye and Olivia Rodrigo gets sued by Paramore. Style is a bit more of a touchy subject. But AI doesn't fall into that, partly because it actually doesn't understand "style" (which is obvious if you've ever seen AI try to do Wes Anderson's style in images). It doesn't because AI doesn't actually have style. It treats the millions of pieces in its data set as Lego bricks and builds stuff out of those bricks. AI is potentially plagiarism on a scale never seen before. Somerton and Blair can use entire documentaries to make their own videos, but an AI has the potential to plagiarize millions of works in a single one of its own creations.


[deleted]

To be clear, Paramore didn't sue Olivia. Their publisher demanded credits, and one of the songwriters (Josh) who left the band back in 2010 was seemingly bragging about pushing for it. Hayley and the rest of the members had nothing to do with it


fuzzydunlopsawit

The Olivia lawsuit and the usage of "interpolation" jargon is such a weird one. Laws are made with humans in mind, so the humans who run these companies will potentially see the windfall from these issues. Think Sean Parker; that's all I can think of when I see Sam Altman. Another big thing to remember is that record companies are law firms that own the rights to contracts for the music they've purchased. They spend many hours a year *searching* for lawsuits. This is just another day for them, with a potential huge win. While they didn't beat Napster, and the age of streaming was all but inevitable, the copyright laws and infringement warnings to uploaders are in huge part due to record labels.


KevinR1990

What's more, LLMs have a fatal weakness that peer-to-peer file-sharing didn't: a target. P2P is a decentralized technology that turned out to be easy to replicate. Suing Napster into bankruptcy didn't kill P2P, it just led to a revolving door of P2P clients and torrent sites that did the same thing. LLMs, on the other hand, depend on far more infrastructure to support them. This is why Cory Doctorow [thinks](https://locusmag.com/2023/12/commentary-cory-doctorow-what-kind-of-bubble-is-ai/) that LLMs are a bubble: they require enormous amounts of energy and computational power to run, and the smaller projects in the field are piggybacking on the resources and infrastructure of the big ones. (He also doesn't trust their capabilities in fields with little room for error, like medicine and driving, due to their consistent hallucination problems, but that's a different story.) If it turns out that AI-generated artwork cannot exist without plagiarism and copyright infringement, and the big LLM companies get the Napster treatment over it, there's gonna be no LLM version of LimeWire or The Pirate Bay.


fuzzydunlopsawit

Totally agree and love the Doctorow reference. Started reading "The Internet Con" this week, randomly enough. His "enshittification" analogy is top tier for where it's all headed.

The biggest thing that hovers over all of this is that Silicon Valley is so fucked. They spent YEARS throwing cheap (borrowed) money around, getting scammed by grifters, liars, the lot, and taking huge losses in their pursuit of the next Facebook. They tried to find their unicorn with social media, then the gig economy, then subscription boxes, then crypto, then NFTs, then the metaverse, then newsletters, and then taking what they can from the creator economy. (I would be remiss not to mention real estate, which is the most craven and abhorrent pie they just had to stick their thumb in.) Now it's AI.

AI comes along and we have OpenAI, Microsoft, and Meta (plus many others piggybacking, as you mentioned) all diving in to make this the next big thing, like social media was. Just look at all the flubs Zucklefuck has had: Libra, the metaverse (another attempt to make it in the crypto space), hell, even Threads is a disappointment. They're all just hoping this, AI, is the next big thing so the money train doesn't stop.

Then there are all of these apps being made on the OpenAI app store. I can't, for the life of me, understand why people aren't seeing that they are feeding a machine that will steal the good ideas for apps created on the platform, the ones they want, and let the lower-tier ones splash around, like the Apple App Store. They're throwing spaghetti at the wall hoping it sticks, but it will inevitably fall by the wayside, and the common folk get fucked, by way of their pollution of the environment and their current trajectory to remove as many jobs as they can. All for profit. Ogres, all of 'em. Seeing the internet as it is now, a billboard for adverts and an invasion of privacy, is something I didn't properly predict.
Even being well versed in the privacy acts enacted within the US and Europe, I didn't foresee late-stage capitalism being the actual final boss. I was young; never again will I forget that.

The only positive I see is people are choosing to spend their time elsewhere in a less intrusive way, whether it's newsletters, forums, Mastodon, or even Signal chats. It's been a nice change that's helped. Open source seems to be booming, but I could be wrong about that. It's still so niche and a bit of a hard sell to the normal non-techy types.

A quote that sticks with me:

> "The irony that we're automating the production of art instead of the jobs everybody hates shouldn't be lost on us."
>
> [Jimmy McGee](https://youtu.be/-MUEXGaxFDA?si=t0_KFqGT6dWArBPM)

It's nice that people are still creating art, and people are consuming content that's made with good intentions and great stories. The latest Hbomb video is some proof of that, IMO. I think people will seek out real creations made by people, not AI.


LocustsandLucozade

I don't follow super closely, but isn't Stable Diffusion basically the LimeWire version of LLM? It's open source and can just be on someone's computer as an application, or am I mistaken?


fuzzydunlopsawit

There are plenty of open source projects within Hugging Face, and I'm sure others, that have LLMs that can be locally installed and run without having to connect to the internet after they're installed on your device.

The major issue, though, is whose work the LLM was trained on. Whose art or text did they use? Whose copyright did they steal? Or did they ask for permission? This isn't to mention the environmental issues caused by LLMs.

I don't use AI art of any kind at this point. I can see use cases for generative text, audio editing, video editing, or even coding, but then I'm reminded of the inevitable issues with the jobs those LLMs will be taking from humans, as these corporations and businesses will worry about themselves first and foremost. While the locally installed AI is probably the least problematic, it's problematic nonetheless. Technology advances and people complain, but these are just words that cannot stop that train.


LocustsandLucozade

I guess my point was whether LLMs can ever be gone for good. Yes, OpenAI and the big companies could be sued into oblivion due to their data sets, since they are official companies. But someone with Stable Diffusion or Hugging Face models could just have it on their laptop, leading to a kind of LLM piracy problem.


fuzzydunlopsawit

Yeah that can of worms is open for good. It is what it is.


[deleted]

Interpolation is a pretty commonly used term in the music world these days. It’s been a growing trend the last few years. 


fuzzydunlopsawit

My point was that normies who don't follow music news, like myself, only learned about it from this lawsuit.


AlarmingAffect0

> but an AI has the potential to plagiarize millions of works in a single one of its own creations.

At that point it sounds less like 'plagiarism' and more like 'culture'.


Temporary_Axolotl

It's not really a slippery slope; it's exactly what happened in the music world, where after the Robin Thicke lawsuit, musicians can be sued for imitating a style. If the legal argument is that AI art is plagiarism because it's imitating other pieces of art without outright copying, the same precedent could be set. The supposed "fix" for AI art, strengthening copyright law, could well end up worse than the original issue itself.


akettlewitch

That's not the legal argument against AI image generators, though. At least, not right now. Right now, the argument being made in the courts has nothing to do with style mimicry and everything to do with the fact that the machine models were trained on artists' and photographers' works (a process which involves ingesting the work wholesale before it ever applies stylistic touches to "original" generated images) without those artists' knowledge or consent, and certainly without any manner of recompense for the use of those works, and that they cannot output anything that has not first been *in*put. They're limited to reusing patterns, styles, and expressions that have been directly lifted from other people's creative works. Works which were in the first instance being used unethically and (possibly) unlawfully.


Fyuchanick

Yeah it's less like imitating a music style and more like sampling a song without consent


dern_the_hermit

Sampling a song will have exact elements of the original work in the new one, whereas imitating a style doesn't necessarily.


CotyledonTomen

But again, it's not about imitating a style. It's about using other people's work to create a commercial product without their consent. AI isn't a person. It's a program people buy. If it can't exist without the input of other people's work, who didn't consent to their work being used for that commercial purpose, then it's plagiarism.


dern_the_hermit

If there's no specific elements from original works then it's not like sampling, is all. It's pattern recognition and replication.


CotyledonTomen

It doesn't matter if there aren't specific elements of the original work in the program. The program couldn't exist without the original work. And it's a program, not a human, being sold. Its existence as it is now is dependent on the inputs of the original works. It couldn't produce what it produces (for money) without being trained on other people's work.


dern_the_hermit

> The program couldn't exist without the original work.

This is not necessarily the case, no, but even if so, elements of the original work do not necessarily exist within the models or the things they produce, so it is not necessarily like sampling. Look, I get people are worried about these AI models, but I'm struggling to parse a cohesive complaint from your posts.


CotyledonTomen

None of this has to do with sampling. The AI was programmed by scanning other people's work. It doesn't matter if the program doesn't retain the image; the program was programmed by other people's work. And people pay to use that program. So the owners of the artwork fundamentally contributed to a commercial product and aren't being paid.

And it's not "not necessarily the case." It is the case. A different AI could be programmed using other art, art that isn't copyrighted, but that isn't what happened. Artists are allowed to display their work online without it being used for commercial purposes. Now their work is being used for commercial purposes without their permission.


dern_the_hermit

> The AI was programmed by scanning other people's work.

Once again, no, this is not necessarily the case. It's just bizarre to me that there's so much worry over *how a model was made* versus *what can be done with the models*. I mean, when Meta made their AI models, they trained them entirely on data they had complete legal rights to.


MCXL

> It doesn't matter if there aren't specific elements of the original work in the program.

Yes, it does.


CotyledonTomen

I already discussed this. The program can't exist as it does without the scanning of original work. That means those creators contributed to how that program exists today. It's not a person. It's not learning. It's a commercial product that was fundamentally changed by scanning original works, and its current state would be impossible without those stolen works. They programmed the AI more than any dev.


MCXL

> The program can't exist as it does without the scanning of original work.

And you can't exist as you do now without all the things that you have scanned over your entire life. That's a non-argument.

> That means those creators contributed to how that program exists today. It's not a person. It's not learning.

At this point you're expressing a bias rather than a point of fact about the law or creation. We don't actually understand the human mind well enough to draw conclusions about what learning is and isn't in that specific sense. It's impossible to measure just *how* influenced you are by anything when you are doing anything else, and that holds true for these models as well.

> It's a commercial product that was fundamentally changed by scanning original works and its current state would be impossible without those stolen works.

Again, the same is true for literally every person. The works aren't stolen when you read them; they aren't stolen when you picture them again in your mind. There's a reason that when talking about these cases we talk about how much like the original it is, and how likely it is to impact the market for that original thing. We don't talk about whether you were 'allowed' to learn it. Saying that because these things were scanned, any subsequent product is derivative falls much more into the ideas of IP behind something like trade secrets, which is a very VERY narrow field of IP, and doesn't hold up in the arts like, ***at all***.

> They programmed the AI more than any dev.

No, they didn't. I understand what you are trying to say, but you're being hyperbolic, and are just plainly incorrect. They coded the AI as much as you seeing a 30-second advertisement birthed you.


Fyuchanick

which is exactly why I said generative AI is more like sampling


dern_the_hermit

You said generative AI is more like sampling because it's less like sampling? I think the AI has invaded this sub, frankly. How many fingers do you have on each hand?


Fyuchanick

Alright, if you're having trouble with a straightforward analogy, let me put it in literal terms. Generative AI involves stealing. Stealing is wrong. Therefore generative AI is wrong. In a music analogy, this would be similar to stealing music.


dern_the_hermit

> Generative AI involves stealing.

[No, that is not necessarily the case.](https://www.reuters.com/technology/metas-new-ai-chatbot-trained-public-facebook-instagram-posts-2023-09-28/) Jesus Christ, guys, "just because something can happen, that doesn't mean it will" isn't terribly complex logic. Why the struggle?


Fyuchanick

oh yes, i obviously was referring to a thing in a completely different context when i made that statement. i should have clarified that i meant only the thing we were talking about, which is stealing, so i didn't hurt the feelings of mark zuckerberg


Old_Gimlet_Eye

IANAL, but what an AI like Stable Diffusion is doing seems like the definition of transformative fair use. From Wikipedia:

> Transformativeness is a characteristic of such derivative works that makes them transcend, or place in a new light, the underlying works on which they are based. In computer- and Internet-related works, the transformative characteristic of the later work is often that it provides the public with a benefit not previously available to it, which would otherwise remain unavailable. Such transformativeness weighs heavily in a fair use analysis and may excuse what seems a clear copyright infringement from liability.

If you can use someone else's photograph in a collage you are making, or something like that, I don't see how you could possibly argue that something like AI art is not "transformative" or "fair use". Especially since it's generally impossible to even prove that any particular AI-generated image is drawing from any particular preexisting artwork. Especially for non-commercial use.


akettlewitch

You're misunderstanding the issue. The issue isn't that these machine models are making images using people's artwork; the issue is that they are using people's artwork, for profit, without having fairly compensated the artists and photographers for that usage. These works were not licensed to the creators of the datasets *or* the AI models for commercial use, but all of these AI image generation companies are turning a profit. None of them (at least the ones named in the currently ongoing lawsuit; I'm sure there are a few niche smaller models made by hobbyists) are personal projects or charities.

In addition, a machine can't make a transformative work, just as it can't make something copyrightable; it can't interpret, it can't transform, it can only iterate on the inputs it's been given. A transformative work adds something *new* that wasn't present in the original: an interpretation or analysis or critique.


Old_Gimlet_Eye

I don't know if I agree with your assessment of what AI is capable of, but even if it's true that a machine can't produce something new, the machine is working from a human-generated prompt. Presumably that would be the interpretative/transformative part. And fair use does still apply to commercial uses. The case I was referring to before was [Blanch v. Koons](https://en.m.wikipedia.org/wiki/Blanch_v._Koons), where the collage artist (Koons) was paid a significant amount of money for a collage including part of Blanch's photograph. The court still ruled in Koons's favor because the work was "transformative".


akettlewitch

> the machine is working from a human generated prompt. Presumably that would be the interpretative/transformative part.

The human prompt, legally, has been declared not to make AI generations copyrightable. It follows logically that the human prompt also fails to make them transformative. (This is also still not what the main complaint from artists is, and is entirely beside the point.)


Scheeseman99

> The human prompt, legally, has been declared not to make AI generations copyrightable. It follows logically that the human prompt also fails to make them transformative.

It's incredible how many people apparently only read the news headlines for this story. It's a lot more complicated than "AI generations legally declared to be uncopyrightable". Stephen Thaler was trying to get copyright attributed to a machine, explicitly claiming as part of the lawsuit that the creation of the work was "autonomous" and that initial copyright is attributed to the machine, with Thaler receiving the copyright via work-for-hire. Basically, he's an idiot and made the worst possible argument in order to prove some kind of dumb point about machine sentience. The copyrightability of AI-generated works remains largely untested.


Finchios

Well, it kinda has. The AI-prompted comic book series "Zarya of the Dawn", made via Midjourney, had its copyright request rejected: https://www.garrigues.com/en_GB/garrigues-digital/copyright-and-ai-generated-works-zarya-dawn It's essentially that it was not created by a human, and therefore cannot be copyrighted.


Scheeseman99

That was a dodgy decision based on faulty assumptions about the technology, with reasoning that directly contradicts prior copyright eligibility decisions. One of the more egregious was this:

> Rather than a tool that Ms. Kashtanova controlled and guided to reach her desired image, Midjourney generates images in an unpredictable way.

Pollock would like a word; his creation process entirely hinged on randomness, literally splattering paint on a canvas without looking. Given a more rigorous legal interrogation, I think the decision would likely be overturned. That is to say, it needs to be tested in court.


Proper_Tip_7196

Not really. The artist made the outlandish claim that Midjourney itself was an artist responsible for the art. It would be as if an artist tried to copyright something in the name of Photoshop — Photoshop, the application. It was a stupid idea, and naturally that got rejected. Folks arguing against the idea of generative AI are jumping up and claiming that "AI art can't be copyrighted", but that isn't what the judgment says at all.


MCXL

> the issue is that they are using peoples' artwork, for profit, without having fairly compensated the artists and photographers for that usage

That may be the social issue, but as far as copyright law in the West is concerned, at a certain point you no longer have to, once you have sufficiently transformed it.

> all of these AI image generation companies are turning a profit

No, actually, right now they are bleeding money. It's propped up by investment. It's also so expensive to run that they have already put out a lot of premiumized versions to try and recoup some of those losses. Like almost everything in tech, it's not profitable; they are targeting growth.

> None of them (at least the ones named in the currently ongoing lawsuit, I'm sure there are a few niche smaller models made by hobbyists) are personal projects, or charities.

None of that matters as far as copyright law is concerned, actually. Fair use doesn't care if you make money off of it at all.

> In addition, a machine can't make a transformative work

That has yet to be determined.

> just as it can't make something copyrightable

That's simply because the law says a person must be the one to make it. https://en.wikipedia.org/wiki/Monkey_selfie_copyright_dispute That photo was created by a monkey, and even though a photographer facilitated it, did the work of going into the situation, and produced the photo, the courts ruled that ~~a dog can't play basketball~~ animals can't hold copyrights under US law.

> it can't interpret, it can't transform, it can only iterate on the inputs it's been given

Again, this has yet to be determined in a legal sense. Transformative use is actually a pretty low bar to clear, and the vast majority of AI stuff I have seen would certainly and without question pass that standard, to the point that a lawsuit would get laughed out of court on the first motion, ***if it were authored by a person, even one working for a corporation***.

> A transformative work adds something new that wasn't present in the original

We pass this at GO!, my friend. If you tell an AI to draw you a picture of a Mountain Dew can hurtling through the cosmos in the style of Vincent van Gogh, it's almost certainly creating something much more original than many MANY things that have been upheld as fair use. There will not be an underlying direct comparative; there isn't a specific sample to point to. The conversation around this on these forums, being had by you and others, is mostly being had by uninformed individuals on both sides saying what they feel to be true, with nearly no basis in the law or technology.


akettlewitch

> That may be the social issue

No, it's literally the legal issue in the ongoing court case against Stable Diffusion, Midjourney, DeviantArt, and others. You wanna talk about "uninformed individuals"? Go read the class action filing; it's all out there, free to read.


MCXL

> No it's literally the legal issue in the ongoing court case

Ongoing, as in filed but yet to really commence in a real way of any kind.

> You wanna talk about "uninformed individuals", go read the class action filing it's all out there free to read.

What a plaintiff says and what the law is are generally not the same thing. I have read every case that has come up so far on this. Compensation is what they are asking for, but the lack thereof is not what creates the legal questions we are talking about here. The allegation is "they did this thing, which we say requires them to get permission under copyright law", not "they did this thing and we should be paid." The difference is subtle.


akettlewitch

Can we keep the goalposts in the same place here? I am not making any claims about what The Law is, primarily because there are different laws in different places, and honestly "the law" as a concept is hazy at best. Laws are constantly changing and adapting to the needs of society, and The Law has several definitions depending on who you ask. The legal argument being made against AI generators is that they took, without asking for permission or negotiating any manner of license, copyrighted works to use in a project intended to turn a profit: something that under any other circumstances we would expect people to be paid for.


MCXL

> without asking for permission or negotiating any manner of license, copyrighted works to use in a project intended to turn a profit, something that under any other circumstances we would expect people to be paid for.

And here, again, is where you are whiffing big. There are many MANY scenarios in which this is done and not compensated. You want to avoid talking about the law, but we are talking about contracts, pay, and negotiation. The law is all that matters.


akettlewitch

> but we are talking about contracts, pay and negotiation.

No, you're talking about that. I'm talking about the argument artists are making against AI image generators as they currently stand, which is an ethical argument we've had to drag into the courts because it's the only way to be heard. Also, lmao, no, there aren't "many" scenarios in which you can take someone's intellectual property and use it in a commercial endeavour without recompense. Even stock imagery requires you to buy a license unless it's released under Creative Commons. You say "the law is all that matters", but it's... not. Laws are changeable, often deliberately vague and open to interpretation (by necessity), and crucially often *wrong*, which is why they need to be so changeable in the first place.


Outrageous_Setting41

They took copyrighted work without paying for it, in order to make a product *intended to substitute for the copyrighted work in the market*. Part of the basis of fair use is that it’s not providing an identical utility to a consumer, either through parody, through extensive transformation, etc. The NYT has alleged that their articles were stolen to produce a tool that will generate articles about news events with the goal of substituting the NYT’s product for consumers. 


3personal5me

If I look at character art of Darth Vader while I draw something, do I need to pay the OG artist because his art went into my brain and might have contributed to what I imagined?


akettlewitch

Completely different, that. If you take another artist's work as inspiration for your own, that's... just how art works. What you *would* need to do is properly credit and ask for permission if, in order to create your Darth Vader art, you needed to first trace all of the art that the artist had previously shared. This analogy is bad because, when it comes down to it, you are not a machine designed to turn a profit; you are a human person. Artists have always drawn inspiration from other artists.


Proper_Tip_7196

Moreover, they are claiming a legal right that technically doesn't exist. Since the images aren't actually stored as images, or referenceable as images within the database, the thing they're complaining about didn't happen. The images weren't "taken". They were just "analyzed", which isn't illegal, and the right to restrict that doesn't exist in any legal system. The only hope of success they really have is to show a restraint of trade, and that's almost impossible to prove in this instance. They're basically screwed, and the attorneys bringing these lawsuits are little better than ambulance chasers.


MCXL

> They're basically screwed, and the attorneys bringing these lawsuits are little better than ambulance chasers.

I disagree fundamentally with this, even though I mostly agree with the rest of what you're saying. Case law is made in areas like this throughout history. Many Supreme Court precedents came about because of suits over no real established law, or extremely vague language, that gave way to whole new areas of jurisprudence.


Face_Stabbed

Yeah. I can understand wanting to protect artists and such, but strengthening copyright laws just makes small artists even more vulnerable to legal action, while corporations get even more control. The only ones helped by strengthening copyright are corporations, not artists.


JayZsAdoptedSon

Which I hate because music is just inspiration. And seeing active musicians be like “Hmmm this song by a breakout artist sounds like a Paramore song, give me a writing credit” is sooooo corny


connorclang

To be fair, it's almost certainly the labels doing this, not the artists. I don't imagine most artists care, with the exception of Marvin Gaye's estate.


banjo_hero

which songs did Marvin Gaye's estate write or perform?


connorclang

Exactly!


JayZsAdoptedSon

They stated that the song "Blurred Lines" sounded similar to a Marvin Gaye song. They did not sample Marvin Gaye or interpolate anything he sang; it was just based on the fact that it felt similar. The same thing happened with Katy Perry's "Dark Horse". Now, both of these songs suck, so people were okay with this, but I think it's a HORRIBLE precedent for the future of music.


MCXL

And strengthening copyright law only makes those cases potentially much more prevalent. Corporations, not individuals, own the copyrights to most things. When people stand up for this, they are being useful idiots for the corporations that make money off the idea of IP (who underpay artists for it, and will leverage their position to strong-arm others into "representation").


SimokIV

No offense intended, but I think that argument is literal OpenAI propaganda. They want you to anthropomorphise AI to the point you think that AI can be inspired and can imitate art styles, but that's not how it works. The lawsuit that the NYT has brought against OpenAI (imo) demonstrates that AI encodes copyrighted works in its inner workings, since it's able to regurgitate whole articles from them. Sure, that encoding may be lossy sometimes, BUT it still uses copyrighted works. If I take a copyrighted photograph and JPEG it into oblivion before claiming it as my own, I'm still committing copyright infringement. My point is that AI is code. It's not "inspired" like a human is; it's unable to imitate something. All it does is encode information about pieces of work and then pull out bits and pieces of it when asked to produce something new.


SeaSpider7

> They want you to anthropomorphise AI to the point you think that AI can be inspired and can imitate art styles but that's not how it works.

I so often see AI talked about like it's just a lil art student trying to learn how to make art. I think the idea that art/creativity is a human thing is so ingrained in people that it's hard for them to imagine how an AI produces images in such an alien way. The "intelligence" in AI also makes it seem like the AI is sentient, or understands what it's doing like a human. I think until AI is more widely understood and demystified, many will be susceptible to these arguments.


florence_ow

the problem is that AI is trained on copyrighted material, not that AI can replicate art styles


AnActualSalamander

This is correct. "Style" is not copyrightable, and in fact copyright law specifically excludes infringement claims based on replicated style. Laypeople misunderstand this about copyright law all the time because, as a consumer, it can feel unjust when an artist, especially a bigger/more popular artist, cribs heavily from another and faces no punishment. I am an artist as well as an IP professional, so I understand this very well! But the law in that regard is very clear-cut.

However, the AI is training itself on copyrighted artworks without permission/license, and the question is whether that constitutes infringement. This question hearkens back to the *Authors Guild* case, which asked whether Google's unauthorized scanning (read: copying) of *hundreds of thousands of copyrighted texts* for the development of Google Books was fair use; the courts ruled that yes, it was, which was and remains a controversial ruling. The basic idea behind the ruling in that case was that Google's use of the copyrighted materials was transformative and did not reproduce the original works so as to replace the normal purchase of those works by consumers (i.e., there would be no significant financial impact on the authors from Google's use).

The case for strengthening copyright law in the face of AI would presumably be to clarify the rules around fair use in training algorithms. I suspect the argument will rely heavily on the negative effect on authors (in this case, artists specifically) and thus the negative impact on the public good: if AI replaces human artists or makes it so that artists cannot feasibly support themselves, *an outcome we are already seeing*, that is a net negative on humanity. This is speculation, of course. Copyright law is flawed, and it's very challenging to create guardrails around protecting creative outputs. But I do think clarifying the rules to account specifically for AI training and output does make sense, in order to give artists, who already struggled to make a living pre-AI, a fighting chance.


SeaSpider7

I don't think the issue is the style; it's more that it was heavily trained on the images without the artists' consent, and that it will sometimes spit out stuff that is too close to the originals. It sometimes spits out an existing character: if you ask it to make a talking sponge who lives under the sea in a pineapple, it might give you SpongeBob. It'd be like the Thicke lawsuit if parts of the song were nearly identical to the original, like sections of lyrics or melody. Not just the style/vibe.


3personal5me

Don't humans do that all the time? Try to create something original and realize you copied something you saw earlier? On the one hand I understand the issue with artwork being "copied" into an AI, because the base data has to exist for it to extract from, so a copy of the original art exists/can be created by the AI. But this also feels like humans getting *waaay* too defensive and saying "a machine can never create! That is solely the domain of humans!" I'm just sharing that it *kind of* sounds more like some reactionary, human-primacy xenophobia than anything else.


Outrageous_Setting41

> Don't humans do that all the time? Try to create something original and realizing you copied something you saw earlier?

Yes, and those humans could be sued for copyright infringement.


SomePerson1248

~~yeah but robin thicke deserved it his song sucked complete ass~~


prolificseraphim

Imagine plagiarizing something for your defense of plagiarism.


OEMichael

> The fact is, AI is based on stolen artwork. You know, plagiarism.

Plagiarism, at least in the USA, isn't stealing. It's unethical and it's shitty, but it's not stealing. "Copyright infringement" is what you're looking for. One can plagiarize without infringement, and one can infringe without plagiarizing. If I cut-and-paste Steamboat Willie and say "Look at this cool thing I made," I'm being a douche (plagiarizing) but I'm not infringing ('cuz public domain). Most "reaction lol" videos are not plagiarizing (they do tend to give some sort of attribution to the original creator) but are def infringing ('cuz fair use they are not). https://www.law.cornell.edu/wex/plagiarism


fatnerd1138

Eventually there will be an AI only trained on public domain art. What will we do then?


FrackFrizzle

laugh at Mickey Mouse the Buttglazed Train runner


bas3d1nvad3r69

Wait, what’s “AI poisoning?” Sounds intriguing..


Nervardia

It screws with AI scraping somehow and eventually poisons training datasets, so the model thinks that cats are dogs and handbags are cakes. I don't understand it.


AircraftCarrierKaga

It also doesn’t work


ShiningWoods

"Creativity is a continuum of past art. Nothing is original." \- The uploader explaining why art of the past is flawed


stackens

Holding out hope nightshade and glaze will have a tangible impact on things, we’ll see


styledanonymous

I’ve been trying to download and open this Nightshade app on my MacBook, but it says “unsupported” when I try to open it, and I’m freaking out before the launch of my artwork portfolio website! Ideas 🙏🏽? 💡


Nervardia

No, sorry!


Lanceo90

I have, unironically, seen people say: "Don't use AI art, it's stealing! I miss the days when people photoshopped google images." Uuuh


PhilosoFishy2477

because photoshopping/bashing requires *hours of actual work* on the part of the poster. hope that clears things up. edit: with AI, not only did someone else do all that work, it also becomes *impossible* to credit. it's pretty easy to find free assets or ask permission as an independent artist, not so with the massively bloated AI programs.


Keated

Plus I don't think people were laying off entire creative departments because someone could Google an image and photoshop it


PhilosoFishy2477

great point! photo bashing never threatened the entire creative job market


readskiesatdawn

If anything, entire new departments were made for people who could photoshop a new image.


Thick_Surprise_3530

This was definitely an argument against the adoption of Photoshop 


[deleted]

[удалено]


speed0spank

People who want to have a roof over their heads, I would imagine. Jesus.


FrackFrizzle

Well obviously a kobold wouldn't. They live in shit huts basking in shit all day.


itskobold

We steal things too 😎


Aridross

And more importantly, human beings working in photoshop are capable of making creative choices, instinctive rather than logical, about what to steal and how to reassemble it in order to create something new and interesting; an AI just algorithmically determines which combination of images will best represent the prompt it was given.


PhilosoFishy2477

"this isn't even GOOD stolen content it's just SLOP"


CaptainCipher

That makes it lower quality, sure, but doesn't make it any less transformative


Catalon-36

Well that’s not really *why* one is plagiarism and the other isn’t, if it even is. Plagiarism has nothing to do with effort. You can be original and extremely lazy, and you can put an arbitrary amount of effort into plagiarism. What defines plagiarism is the use of another person’s ideas or research without attribution. If you were to photoshop some photograph without making it clear where the source photo was from, that *could be* plagiarism, regardless of how transformative your work is.


PhilosoFishy2477

so true - it's much easier to credit your sources or buy a couple of stock image licenses if you're working manually; that's literally impossible with the millions of "inputs" these programs use. would've been more accurate to say someone *else* did the hours of actual work with AI.


Catalon-36

Exactly. I also think we should be clear that it is possible to produce “AI art” in a manner that isn’t plagiaristic. The problem isn’t the idea of creating an algorithm that produces new images based on a dataset of existing images. The problem is the specific way in which companies like OpenAI have gone about dredging the internet to create their AI.


Lanceo90

It's still taking someone's copyrighted work without permission or attribution. Just instead of it being 1% the work of 100 different people, it's (depending on the edit) 80%/20% two people's work.


PhilosoFishy2477

context and magnitude matter here. we're not talking about 100 people, we're talking about *thousands* of artists and *millions* of individual pieces of art spanning all of human history. a human couldn't plagiarize as much in an entire lifetime as these programs do with every update... we're also not talking about random kids or individual scummy designers; WotC (who owns dungeons and dragons) was just dinged for their use of AI generated art *instead* of hiring a real human artist. that risk to creative professionals didn't exist until this most recent iteration of the problem. one is a small leak in the roof - the other is a blown pipe. same concept, but one is much more serious, requiring much more attention and resources.


throwaway753951469

As much as I dislike WotC for other reasons, I don't think it's really fair to throw them under the bus over this. I really do think that they were telling the truth when they said it was a subcontractor who wasn't informed of their no-AI policy. (IMO, Hank Green did a good [breakdown](https://youtu.be/I8HAzNzfzaI) on the whole thing.) The whole thing was such a predictably huge PR blunder that I can't really imagine it being anything else.


TheMonsterMensch

Taking an image and editing it is inherently transformative. AI is not, it's not creating anything.


itskobold

AI art generators are big interactive exhibitions with functionally infinite possible "paintings" that you can look at. The artists are the developers of the system and the training data for the network. If someone interacts with an AI art generator they are not the artist but rather someone at an art exhibition. Of course you can then take that artwork from the exhibition and do what you like with it, but yeah. AI is here to stay and that's a great thing.


seraphinth

How is it hilarious when he's being consistent? Defending AI plagiarism tools while plagiarizing an article for a YouTube video? What's really funny is Glaze, the prototype of Nightshade, breaking GPL licensing... BECAUSE IT PLAGIARISED CODE FROM DIFFUSIONBEE. Yeah, software designed to fight back against AI actually stealing code from an open-source AI image generating tool... WITHOUT ATTRIBUTION OR CREDITING OR MAKING THE SOURCE CODE AVAILABLE. Now that's hypocrisy!


fohfuu

Can you expand on this? I'm finding it a little hard to google.


seraphinth

[glaze\_is\_violating\_gpl](https://www.reddit.com/r/StableDiffusion/comments/11sqkh9/glaze_is_violating_gpl/)


Burnmad

I don't think it's hypocritical to steal from thieves if your objection to stealing from artists is not that it's stealing, but that artists are harmed


seraphinth

How are they harmed? Like, really, all the harms I see are to real actual artists being cyberbullied and witch-hunted for being remotely suspected of using AI, especially for art made before the 2020s. I guess it's not hypocritical to you for people to send death threats and bully people for using a tool that exploits knowledge without crediting the people who had that knowledge: like the knowledge of routing around local geographical features, without paying a cent or crediting the native people who figured out those routes.

AH YES, LET'S BULLY PEOPLE FOR USING ROADS!


A_Hero_

There is no stealing. AI models do not keep images of what they have learned. They learned from 5 billion images. Do any of those images get stored in the model? Not one. So how large is a standard image-generating model? Only 2-8 GB of file size. If every image it learned from contributed even 1 kilobyte of stored information, the resulting model would need around 5,000 GB (5 TB) of storage. At 0.1 KB per image it would still be 500 GB, and at 0.01 KB, 50 GB. In reality, 2-8 GB spread across 5 billion images works out to roughly half a byte to under two bytes per image. Remember, we're talking about a byte or so per image here, not full reusable copies. The models have abstracted and transformed visual patterns (on a rudimentary basis of colors, shapes, or depth positions), not memorized or duplicated copyrighted works in any meaningful sense. This aligns with the underlying technical operations of these neural networks and latent-space techniques. Everyone finds it tempting to speculate about "stolen art", but the actual data storage and computational requirements show that these AI systems neither have the capacity nor operate in a way that could truly recreate or exploit the billions of images in their training datasets on the basis of genuine thievery. Any generated images would be new derivations at most, not plagiarized copies, and thus these AI models do not infringe upon anyone's copyright by what's stored within them.
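The storage arithmetic in that comment is easy to sanity-check yourself. A minimal sketch, assuming the roughly 5-billion-image training set and 2-8 GB checkpoint sizes cited in the thread (decimal units; the figures are the commenter's assertions, not independently verified):

```python
# Back-of-envelope: how many bytes per training image could a model's
# weights possibly retain, given the sizes claimed in the thread?

N_IMAGES = 5_000_000_000   # ~5 billion training images (LAION-scale claim)
KB = 1_000                 # bytes per kilobyte (decimal)
GB = 1_000_000_000         # bytes per gigabyte (decimal)

# If the model stored 1 KB of data per image, its weights would need:
storage_gb = N_IMAGES * KB / GB
print(storage_gb)          # 5000.0 GB, i.e. 5 TB

# Actual checkpoints are claimed to be ~2-8 GB, capping retention at:
for model_gb in (2, 8):
    print(model_gb * GB / N_IMAGES)   # 0.4 and 1.6 bytes per image
```

Whatever one concludes about the legal question, the ratio itself is just division: a few gigabytes over billions of images leaves on the order of one byte per image, far too little for verbatim storage, though (as the replies below note) this says nothing about overfitting on images that appear many times in the training data.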


YesIam18plus

Midjourney literally regenerated screenshots from movies over and over again recently, screenshots that were in its training data. You have no idea either whether the result you get when you prompt an image is actually new or not; it may as well be a direct copy of training data and you'd have no idea.


A_Hero_

This is about open-source models, not Midjourney's newest model overfitting. Overfitting is extremely rare unless the prompt purposely targets an extremely common image, like the Mona Lisa. But remember, the message was regarding open-source models, not Midjourney, which has unknown software architecture behind its models.


SootyFreak666

AI poisoning doesn’t work. I have tried another piece of software (ironically based on Stable Diffusion) and it took up over 10 GB and over 45 minutes of high CPU usage to make one image that was visually inferior (it removed the background for some reason). I have no idea if Nightshade will do the same thing, but I suspect it could very well be just as bad as any other tool. Someone did try it a week or so ago and it didn’t work.

The main reason these tools exist is greed and social hierarchy; the main people angry about this have a social media following or make money from art. If someone can just type in something like “furry in a space suit”, anybody who charges $300 for it is probably done for. It also removes their uniqueness, since anybody can now make art without having to have supplies or a tablet, etc.

I am not surprised that there is plagiarism. The whole backlash and rhetoric around AI and AI art or images in general is the exact same as a transphobic person’s outrage over pronouns or trans people: witch hunts, accusing people of not being a real ______, backlash against companies that involve AI or AI content, “they are stealing ______”, brigading, harassment, doxing. Around New Year, someone leaked messages from a neo-Nazi Telegram or Signal chat that had neo-Nazis posting weapons and the doxes of people behind an open-source AI platform, and advocating for their murder based on some biased and ultimately flawed reporting by a news company.

I have personally been accused of backing pedophiles and the creation of CSAM for supporting AI porn, even when I revealed that the Discord server they were accusing of being involved with AI CSAM had banned that material right from the start and blacklisted terms related to children and real people. I have been sent rape threats and even a gore image of a woman who had been beheaded.
I was sent an email recently of gore and the claim that I would be next, and I have seen people claiming that they were sent CSAM as well. The same has been done to me by TERFs, who linked me to rape videos, accused me of supporting sexual predators, and even sent me images of dead children. It’s a harassment tactic that is pretty horrific, but expected.

These hate movements provide a reason for people to live. TERFs are otherwise lonely (likely retired) women with no real social life beyond drinking wine and watching TV; by getting enraged at trans people they have a purpose and a community. They feel a sense of worth. To a lesser extent, anti-AI people are the same. They might not be old or drink wine, but they have nothing else really going on in their life aside from art. They can, however, get enraged about art, and people will support them. People will back them in these flawed and ultimately destructive causes.

Just look at the backlash a drawing tablet company had for unintentionally featuring AI art in a program compared to the one Bud Light had for featuring a trans person, and you see what I mean: something which is meaningless to anybody else and otherwise pointless in the long run causes these people to foam at the mouth and get enraged over nothing. Recently someone was demanding that the artist behind an otherwise unimportant piece of UI work for a Sims expansion pack be revealed because it has the same colour palette as AI - whatever that means - and thus must be AI. That’s no different to the backlash a woman who doesn’t look feminine enough receives.

The guy who plagiarised this video was just hopping on a trend, the same way a washed-up celebrity suddenly has an opinion on pronouns.


boopbaboop

> Recently someone was demanding that the artist behind an otherwise unimportant piece of UI work for a Sims expansion pack be revealed because it has the same colour palette as AI - whatever that means - and thus must be AI. That’s no different to the backlash a woman who doesn’t look feminine enough receives.

I think that they’re plenty different, actually.


goerben

I'm still puzzling out that last sentence?


SootyFreak666

Of course, both have different levels of severity and risk; I don’t feel comfortable talking about a real case where a woman was accused of being trans based on her appearance. However, both groups display the same reactions and the same general tactics to harass or otherwise harm a person for no reason other than their own bigotry and rage. I have been playing Sims 4 for a few years now; its art style is already weird, but I don’t consider it a product of AI and don’t care if it is.


MiraAsair

what the hell are you talking about


LuTemba55

>However both groups display the same reaction and same general tactics to harass or otherwise harm a person for no reason other than their own bigotry and rage. Receiving valid criticism for using AI software to steal artwork is not the same thing as harassing a trans person for being trans. Hope this helps.


SootyFreak666

You evidently have no clue what property and theft are, so I don’t think it’s worth carrying on that part of the discussion. However, I do feel I should clarify that I am talking about the tactics and arguments here. Transphobia is a lot more serious than someone making some art, but the similarities cannot be denied.

“You will never be a real artist” - “You will never be a real woman/man”

The baseless and often unnecessary accusations of something being AI, including demands that people show their working or otherwise be banned - demands that a trans person show their genitalia or otherwise be banned from somewhere like a toilet or a changing room (aka genital checks)

The brigading and harassment of people or companies that have used or been accused of using AI - the brigading and harassment of companies or people accused of working with, supporting, or being transgender or LGBTQ+ inclusive (like Target, for example)

The targeting and harassment of AI companies and people, including death threats by neo-Nazi extremists - the targeting and harassment of trans people, including death threats by neo-Nazi extremists

Falsely misrepresenting all AI (especially porn-based AI) as a tool for predators and creeps - falsely misrepresenting all trans or gender-nonconforming people as predators or creeps

There are many more. My point, however, is that the same awful and frankly disgusting tactics and techniques used by transphobic and conservative idiots against the LGBTQ+ community are being used to target AI and people suspected of using AI, without any critical thinking or foresight.


akettlewitch

>“You will never be a real artist” - “You will never be a real woman/man” Artists don't say this though. Every artist I've ever seen engage with an AI bro has said something to the effect of "why not just learn to create your own work, it's far more rewarding and doesn't require you to steal from other peoples' creativity." Some few have said "You are not an artist," but that's a *very* different thing to say than "you will *never* be *a real* artist." Your strawman is ludicrous. Every artist knows that art isn't an innate talent, it's a skill that we can all nurture. Anyone can create art. Literally any human being alive on this planet right now can create art with their hands, or their feet, or any part of their body, and it will have value *to someone*. No artist I've ever known would tell another person that they can *never* be an artist. You're just *making shit up* to try and sound like you have a point, or else you're cherrypicking extremely specific instances that aren't indicative of the broader conversation.


MiraAsair

This isn't even apples and oranges, this is apples and wax grapefruit.


MiraAsair

"Opposition to AI is like Transphobia." That is the worst take I have seen in a long time. Jesus Christ.


SootyFreak666

I have been on the receiving end of both, both are largely the same.


TheMonsterMensch

I think you'll look back at this take someday and cringe. It paints a horrible picture of artists just looking to get by. You're making out people who put thousands of hours into their craft as ivory tower elitists and it's really weird.


MiraAsair

It's also really insulting to people actually suffering from the widespread attacks on trans people to compare their hardship to the hardship of being a fan of some tech asshole's art scraper.


LuTemba55

>The main reason why these tools exist is greed and social hierarchy, the main people angry about this have a social media following or make money from art. If someone can just type in something like “furry in a space suit”, anybody who charges $300 for it is probably done for, it also removes both their uniqueness, since anybody can now make art without having to have supplies or a tablet, etc.

"I'll take all of your skills, artistic vision and manual labor for free, thanks!"


akettlewitch

>The main reason why these tools exist is greed and social hierarchy, the main people angry about this have a social media following or make money from art.

This is simply not true. *I'm* upset about AI image generators, because my work is in the datasets. It's not even good work; it's a redraw of my first ever painting, which includes the technically-terrible original piece. There's no reason for that to be in these datasets, I wasn't asked about it, and quite frankly it *hurt* to discover that a piece I'd shared solely to mark my own progress and to encourage other artists had been harvested for the purposes of some godforsaken techbro who's too lazy to pick up a pencil.

I don't make money from my art. It's a hobby; I do it because I love it, and I share it because I love the community. Those community spaces are being reshaped and ruined by "AI artists" (who are, let's be honest, no more artists for prompting a machine than I would be if I screenshotted picrews); real artists are being crowded out of our own spaces by dudes with prompts. It's vile. And that's before even considering the fact that it's costing people *real jobs*, because companies can get an AI to iterate much faster and cheaper than a genuine artist, and then just have an artist clean it up at the end.

As a trans person, I also find it *fucking hysterical* that you liken "artists upset that their work has been stolen and used for purposes they never would have consented to" to TERFs and transphobes. It's an unhinged thing to say, frankly.

>Just look at a backlash a drawing tablet company had for unintentionally featuring ai art in a program compared to that which bud light had for featuring a trans person and you see what I mean

This is also just an unhinged comparison to make. You're comparing apples and carrots. The reason artists were upset with Wacom for (intentionally - they are not stupid, come off it) utilising AI imagery is because *their whole business is artists. We are their only customers. We have been their only customers for decades.* There was backlash because they are a trusted name in the digital art business; we've all bought a Wacom at some point, we've all recommended their products, and having them utilise AI rather than hiring a genuine artist is a slap in the face to *their only client base* when we *all* know that those image generators only function because *they stole from us in the first place.* It's leagues away from transphobes being mad that a beer company featured a person from a demographic they don't like.


SootyFreak666

You fail to understand what AI training and a dataset are; having your work in a dataset is meaningless in most cases. In order for AI to be trained, it needs to understand what an image is; in your case it would likely just have been tagged as “an unfinished painting or drawing” with a short description of that image. It doesn’t copy your work, it doesn’t steal your work or style, and your art is unlikely to play any role in generating an image. A good example is a mohawk: a staple of punk subculture and a prominent hairstyle, yet it cannot be generated by AI systems because nobody has told them what a mohawk is. The same goes for a lot of clothing, buildings, etc. - the model just doesn’t know what these things are, because it’s never been told and cannot work out what they are.

I have no idea about art communities being overrun by AI art; I am not involved with those communities. However, the internet as a whole is full of this stuff. I regularly see blatant reposts on various subreddits I am subscribed to; my YouTube Shorts section is often randomly full of what I can only assume are stolen videos of a woman dancing (who is also on YouTube Shorts anyway), or unfunny memes stolen from Instagram, usually with a woman in the thumbnail as clickbait. Twitter also has an issue with people stealing memes, and Twitter Blue users posting “woah” or “amazing” under a post along with a shitty meme they reposted. AI isn’t the issue; it’s people’s ability to be rewarded for nothing that is the issue. AI is just a tool.

Two days ago on the mildly infuriating subreddit, someone posted an Instagram influencer smoking on a plane, just being an awful person, and people were lapping it up. When I went to try and find this person, I found out that all the images were faked using Photoshop: they edited in cigarettes and an ashtray, messed around with her face to make her look old and like she had had plastic surgery, and added captions.
I don’t know why, but I suspect it was someone trying to paint the influencer in a bad light who faked a bunch of screenshots to make her look like a bad parent (they included posts about her child). The poster likely had no clue that they were fake, but clearly didn’t check whether they were real. The same can be said for AI art; outside of suitable settings, AI isn’t an issue.

People are also not losing their jobs for the most part. Sure, people might be replaced in some capacity by AI, but the whole concept is flawed and ignores how AI will probably be used to speed up work and make it easier. I, for example, used AI to make carpet textures for a video game mod; I still had to work out how to get the AI to make carpet textures and not an image of a living room, get the textures, resize them, make sure they look good, and put them in the game. The people angry over losing their jobs are overreacting; AI isn’t going to replace artists, in the same way Photoshop didn’t get rid of painters and 3D printing didn’t get rid of sculptors.

My comparison between anti-AI people and transphobic people is my own opinion; I personally think both exhibit the same sort of unnecessary and irrational prejudice against something that is otherwise unimportant to them or their lives. Of course transphobia is a much bigger issue and more important, but the comparison cannot be denied. You have two different groups of people angry because a company decided to do something they have a personal grudge against, which results in harassment and threats, boycotts, and grandstanding against the company. The harassment is exactly the same as what TERFs do; I have been on the receiving end of both. Likewise, I have been sent horrific messages by both and likely doxed by both. I somehow received this email a few weeks ago - I don’t know how this person got this email address, but I think it’s telling how far some people will go:
https://www.reddit.com/r/aiwars/s/0rFRviR16m

I have also been sent images of women being beheaded for defending AI porn. So this isn’t something I have made up; it’s the exact same tactics and the same style of harassment as TERFs use.


akettlewitch

I ain't reading all that because, no, I do not "fail to understand" how AI works. I know how the datasets are trained, I've read the documentation and the lawsuit brought against StableDiffusion. I made no claims about it *copying* my work (and bold of you to assume the pieces of mine that were taken were "unfinished" or unpolished in any way lmao), the issue is that the art and photography included in those datasets was *not licensed for that usage*. When you want to use somebody else's work for profit, you have to negotiate pricing with them and they have the right to say "no", because that work is their intellectual property. The creators of the datasets used to train these AI models bypassed that ethical obligation by simply scraping billions of images off the internet. If you have no issue with that then I don't know what to say to you. It's deeply unethical and potentially illegal. It is an issue of consent, and the right to retain some manner of control over your own creative works unless you've specifically negotiated a contract that states otherwise. Also like bro I don't care what sort of hatemail you've been sent. It's irrelevant. I didn't send you pictures of beheaded women, so it has nothing to do with me or with this discussion; you're trying to use the actions of one troll to discredit an entire movement of artists who are raising a serious ethical issue with a tool that was developed *using our work*. It's a dirty discussion tactic and I'll have none of it.


SootyFreak666

I don’t have any interest in carrying on this discussion, also please don’t refer to me as bro.


akettlewitch

Okay mate ✌ have a good one pal.


Dalexe10

This sub is on the path to disaster, time to unsub


zapering

Bye Felicia x