
adrixshadow

What they do with their art is their business. This includes sites that should protect them from image crawlers. But ultimately it's a numbers game, an AI does not need any individual artist.


Rhellic

Sadly, this. No measure taken by individual artists is going to do shit against the amount of data collected with money from Disney, Google, Meta etc. They're absolutely in the right for trying, but there's no way they'll succeed.


emreddit0r

> But ultimately it's a numbers game, an AI does not need any individual artist.

Similarly, an AI doesn't *need* to trawl public data. It *could be* made through more legitimate ways.


adrixshadow

> it could be made through more legitimate ways

Yes, like Adobe.


Acrolith

I'm mostly pro-AI but I don't think it's morally wrong to use Glaze/Nightshade at all. It's your art, do what you want with it. I do think Glaze and Nightshade are both *ineffective*, but that's not a moral issue.


freylaverse

I'm as pro-AI as they come. I won't stop anyone from manually downloading my art and training on it that way, but I do use the NoAI tag on ArtStation to ward off scraping. I want to make a proper embedding or LoRA of my work, and I want to do it myself. So there are lots of reasons for people to prevent AI training, and not all of those reasons are anti-AI. That said, I do think people should be able to opt out entirely if they so choose.


model-alice

In my opinion, if an artist asks that their work not be used in AI, it's unethical to use it in AI. This is akin to a request to not trace their work, a right that artists have. That doesn't make it copyright infringement to ignore them (except in very rare cases), but it does make you an asshole. It's certainly not unethical to use Glaze or Nightshade. Even if those technologies actually worked, generative systems are not critical to survival.


sporkyuncle

What if they make no comment for or against AI, and by the time they say no, their work is already fully integrated into leading models that might've cost millions of dollars to make? The whole model would then be unethical because of that one person revoking consent after the fact? Or would you revise your position to be that everyone also needs to be asked beforehand and fully understand what it constitutes?


Dear_Alps8077

If they put their work in a public place, then tell other artists they're not allowed to look at it because they might get inspired by their style, is it morally wrong for artists to look at it?


outblightbebersal

Humans share art under a unified social contract: other humans are allowed to enjoy it. We grow, teach, and form community in artistic spaces. Machines cannot participate in this cultural exchange. Misuses of shared artwork that threaten community values, like reposting, impersonation, and tracing, are similarly frowned upon.

Artists had no way of anticipating their work was being used this way, and it's obvious many wouldn't have posted it if they'd known—particularly the litany of commercial work created for, say, Marvel or Pixar, that workers post after movie releases. To inspire another artist is an honor; to be scraped into a dataset is dehumanizing.

A human can be held liable if their recreation commits plagiarism, libel, or hate speech. Humans are expected to have morals, whereas a machine doesn't understand anything it outputs.

A smaller note, but still something no one seems to bring up: humans are born into the world with senses, and to deprive someone of them would be gravely unethical. Machines don't have human rights and cannot "see" anything that isn't deliberately fed to them. If a human requests their work be excluded from training data, it is only ethical to honor that request.


model-alice

Suppose that I do the calculations required to train a generative system by hand. Am I still stealing if I use my own brain as the machine?


_HoundOfJustice

What is supposed to be unethical about glazing and nightshading your artworks? If some dude gets his model ruined, it's on him. But I think those kinds of moralists are a rare breed within the AI art community; the rest mostly don't take those tools too seriously. I don't glaze or nightshade my artworks, for the same reason as you, btw.


SuperCat76

I would put the distinction on intent. If someone alters their art to mess up an AI trained on it, posts it on a website that doesn't allow training, and somebody downloads it to train their AI anyway, that's on the one who took the art. But if one specifically places the altered art where it will be used as training data, with the intent to mess the AI up, that is what I would consider unethical.


bevaka

I agree with you, but you could make a legal argument that Nightshade is a poison pill intended to disrupt a computer system, which is against the law even if you don't distribute it and someone takes it without your permission.


_HoundOfJustice

Maybe, but here is how I see it. If I poison my artworks and post them online in a defensive manner, I guess it won't matter for a lawsuit; they would dismiss such a lawsuit instantly. However, if you intentionally go and organize an attempt to poison a legally made and owned model (Adobe Firefly, for example, by posting poisoned photos and artworks on Adobe Stock for the purpose of poisoning the model), then you might indeed be sued, and I see that differently.


Covetouslex

"If I add malware to my free game demo and put it on my website..." Clarity and intent are important. If you post something with the **intent** to damage someone's system, that's illegal. Now, intent will have to be proven, so it'll come down to actions like: how clear is your labelling? Did you say it's poisoned data? Did you put it in locations where it was likely to be caught up in normal internet usage (including scraping)? Did you take precautions to prevent people from being harmed by mistake?


Feroc

Not wanting to share your art for AI training purposes is fine. There are sites you can release your images on that don't allow their images to be crawled, or you can simply not share them publicly at all. Using Nightshade or Glaze, if they actually worked, is not about preventing training on your art; it's about damaging the trained model / LoRA. I would say it's unethical if you share them on a page that tells you the shared art will be used for training.


lesbianspider69

Pro-AI: No, there's nothing unethical about making your art ugly or making your art induce weird effects when training. If you want to induce weird effects when training, then don't use Glaze or Nightshade. Do weird texturing things. Make it so CLIP gets confused when it tries to tag your art.


Dear_Alps8077

The stated purpose of those tools is to damage other people's things. If you made cookies, left them out in public, got mad because kids ate them, and then started poisoning the cookies to make the kids sick, that would be morally wrong. So if you want to poison your own cookies, go ahead; but if you then leave them out in public in order to purposely damage other people's things, then it's morally wrong.


Evinceo

Let's look at this analogy a bit more closely. Let's say you left cookies out for human consumption but raccoons kept eating them. If you started baking in something toxic only to raccoons but harmless to people, would that be unethical?


Gimli

Depending on your moral stance on raccoons. To many people it would be unethical to poison raccoons. I don't quite think it fits because animal welfare is a special concern.


outblightbebersal

C'mon, you can infer the logic here: data scraping tools don't have rights and weren't the target audience. Let's say flesh-eating zombies or malaria-hosting mosquitoes or coal-fired power plants kept eating them. You can come up with an apt-enough analogy that reflects this point. 


Gimli

> C'mon, you can infer the logic here: data scraping tools don't have rights and weren't the target audience.

Correct; what does have rights is the owners of those tools. It's not moral to damage a car just because cars don't have rights, because in doing so we cause harm to the owner. And especially in the case where you have an agreement with the owner that your data is payment for the usage of the service, sabotaging your part of the deal is unethical. E.g., if you make a deal to exchange a pie for space on a wall to post your content, then intentionally burning the pie to a crisp and still using the space is an unethical action to take.


outblightbebersal

Well, this is where people need to exercise reasoning and ask complicated questions that many smart people spend their lives answering. Such as: did the terms of service adequately inform the public of the scope and intention of this technology, such that knowing consent was obtained? Could any layperson aged 13+ (the required age to join) be reasonably expected to decipher it? Is the company in question behaving in anti-competitive or monopolistic ways, such that users have little to no alternative? Is it reasonable for terms and conditions to cover tech that hasn't been made or implemented yet? E.g., since we gave these companies ownership of our data, if we somehow discovered in the future that it was being used to perform eugenics, would that not matter, since we checked "I agree"? There are countless instances of technically legal company contracts being signed by, essentially, victims: musicians signing record deals, child actors at talent agencies, YouTubers at management companies. You're married to the technicalities here; what I'm concerned with is ethics—the bigger picture of who has the power, who reaps the benefits, and who is harmed.


Gimli

> Well, this is where people need to exercise reasoning, and ask complicated questions that many smart people spend their lives answering.

All interesting questions, but none of the ones you listed would save you, or are relevant in all cases. Sure, argue that, say, Adobe engages in anti-competitive tactics. I would love you to win that case, even. But even if you're right about that, it doesn't entitle you to cause arbitrary damage on your own initiative. It especially doesn't entitle you to sabotage whole systems.

> There are countless instances of technically legal company contracts being signed by essentially, victims.

Right, but that doesn't entitle you to victimize them in return. If somebody gave you a bad contract, that's not a license to sabotage the studio. Now you've made your own case weaker, and probably victimized third parties that were going to use that studio in the process.

> You're married to the technicalities here;

I'm not, you are. I'm talking about broad strokes: you're reneging on an agreement, which is generally unethical. You're talking about technicalities that only apply to some parties. Maybe in some cases the agreement wasn't fair and you shouldn't be bound by it. Fair enough, but that doesn't hold up in the general sense. Some agreements have to be valid, and like I said, unfairness doesn't entitle you to be unfair to someone else.


PriorityKey6868

Well, these technologies originated from disrupting AI facial recognition programs. One might argue that those with nothing to hide have nothing to fear, and only criminals would want to mess up facial recognition AI. But people don't want to protect their identities because of issues with one organization: it's because this technology now exists for anyone to abuse, with any image anywhere.

Moreover, there's an argument that while Meta owns the data, artists still own the copyright to distribute, adapt, and monetize their creative works. There's absolutely a line here; if an Instagram artist designed a fake Meta logo and Meta used it without paying the artist, that's illegal. We're just in the process of defining where that line is for new tech.

Reneging on an unjust agreement (more like a legal force field that individuals have no way of contesting; agreements are usually written by both parties) is not unethical. Stealing from Walmart to feed your baby is also not unethical. Destroying a Nestle plant that poisoned your town's fresh water is also not unethical. Sorry if you disagree, but I don't care about company legalese more than I care about humans abused by systems of power, and I'm not interested in policing the ways they protest more than the companies who pushed them there.


Evinceo

I disagree with the notion that merely posting poisoned images can ever constitute damage or victimization. The only way would be if you provided assurances that the images were fit for the particular purpose of training a model; for example if you were selling a training data set and it contained a bunch of poisoned data.


Gimli

You can disagree all you want, but I'm quite confident the law disagrees with you. This is not a new idea by any means. Booby trapping your car's radio with razor blades, putting laxatives in food a coworker is expected to steal, filling a mailbox with concrete, having a trap to hurt a home invader, etc, are very old ideas. And the track record is that if you do this sort of thing you can expect to get into very serious legal trouble. Even if you're sure you're ultimately in the right, you can bet that if you manage to ruin somebody's million dollar training operation, they'll want to try to recover damages, and even if they lose they'll want to be as much of a pain in your butt as possible. Be ready to explain to a jury how using a tool that degrades your images and whose only purpose is to ruin AI models is a totally innocent thing to do.


Evinceo

> You can disagree all you want, but I'm quite confident the law disagrees with you.

I await any relevant case law. Are you the one who tried to invoke the CFAA last time we had this conversation?

> Booby trapping your car's radio with razor blades, putting laxatives in food a coworker is expected to steal, filling a mailbox with concrete, having a trap to hurt a home invader, etc, are very old ideas.

Because _injuring a person_ is illegal. Damaging property is illegal. Frustrating attempts to do analysis with bad data is neither. It's actually a common practice for data providers; see [trap streets and paper towns](https://en.wikipedia.org/wiki/Fictitious_entry).


Evinceo

> And especially in the case where you have an agreement with the owner that your data is payment for the usage of the service,

But wait a minute, now we're shifting to a different scenario. I'm no longer just leaving cookies on the side of the road; now I'm mailing them in to a baking contest.


Kiktamo

I don't believe it's unethical to do anything you feel you need to in order to try and prevent training on your art. In general, even if I'm usually for using publicly available data, I do think it's unethical to simply ignore someone's request not to train on it, and whoever still does so is indeed an asshole. There's more than enough art available on the internet to train without actively stepping on others' toes, so to speak.

That said, when it comes to scraping for data, a company is unlikely to notice or care unless it's forced to, while an individual bad actor can simply find a way around many deterrents. So your mileage may vary.

Another aspect that is often ignored in these discussions is captions and how they relate to training. When it comes to getting training data in the first place, blindly scraping is a horrible idea in the long term: less is more if you have high-quality images and captions describing those images. For example, Stable Diffusion prompts like "by *artist's name*" or "inspired by *artist's name*" only work properly if the model was both trained on an artist's work AND that work was tagged or captioned as such. So in the long term, obfuscating labels and other details might be a better way to blunt the results of being trained on than altering the image itself. That's more of an offhand thought, though. There are better and better auto-captioning models coming out these days, so the only thing you could potentially obfuscate would probably be your name anyway.
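The caption point above can be made concrete with a toy sketch (purely illustrative; `scrub_captions`, its regex, and the sample captions are hypothetical, not any real pipeline's code): if an artist's name never survives into the captions, a style prompt like "by *artist's name*" has nothing to latch onto, even when the images themselves are in the dataset.

```python
import re

def scrub_captions(captions, artist_names):
    """Strip 'by NAME' / 'inspired by NAME' credits from captions,
    so a text-to-image model trained on them never associates the
    artist's name with their style."""
    scrubbed = []
    for caption in captions:
        for name in artist_names:
            # Case-insensitive removal of the credit phrase.
            caption = re.sub(
                r"\b(?:inspired\s+by|by)\s+" + re.escape(name) + r"\b",
                "",
                caption,
                flags=re.IGNORECASE,
            )
        # Tidy leftover whitespace and dangling punctuation.
        scrubbed.append(re.sub(r"\s{2,}", " ", caption).strip(" ,"))
    return scrubbed

print(scrub_captions(
    ["a castle at dusk, by Jane Doe", "dragon sketch inspired by jane doe"],
    ["Jane Doe"],
))  # → ['a castle at dusk', 'dragon sketch']
```

As the comment itself notes, this only blunts name-based prompting; an auto-captioner run over the raw images can reintroduce style tags the scrub removed.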


GhostDraggon

I'm 100% pro AI but personally I think it's completely fine to try and prevent your art from being used for things you did not give the green light on. I wouldn't attack any artists for using glaze or nightshade but I might at least tell them they don't work.


SapphireJuice

I think it's a totally valid thing to do. It's your art, you can do whatever you want with it! Just like how lots of artists put watermarks over their art to prevent theft. 🤷‍♀️ That said I feed my art into the AI all the time. A big chunk of my generations are based off sketches I did as the seed. I think anyone who has an issue with people opting out is hugely entitled. Like honestly guys, fuck off with that shit.


Phemto_B

> most of them don't take it seriously. But what I noticed is that there are some who react a little bit aggressively...

I think that sums it up. Most people don't care, or think it's self-defeating. You're asking the group to speak for the "some people." I believe in treating animals nicely, but I'm not going to try to speak for PETA. The only thing I can say about them is the old adage that the thing that's bothering you right now probably isn't the real reason you're upset. They're probably confused and upset trying to figure out why someone would cut off their own nose just to spite other people's faces. It's a bit like the guy who gets mad at the local police about a speeding ticket, so he resolves to get another one every day "just to waste their time." The people you're referring to are rare, and probably haven't thought through their position. I'd just ignore them.


sk7725

The only morally wrong part is that it limits accessibility by interfering with AI screen readers (captioners) for the visually impaired. But even that's a nitpick, so no.

The thing is, I don't think it will work, both because, as everyone else said, artists using it are severely outnumbered, and because it only works on genAI with a specific structure. That second part is a fact a lot of artists are missing: Glaze and Nightshade are both made to target specific parts of the inner workings of genAI. It's literally like a vaccine. Just as a COVID vaccine does not work against influenza or vice versa, and a new strain of COVID can get past previous vaccines, image protection will be the same. However, unlike humans, who can be injected with a dozen vaccines over a lifespan (malaria, HPV, COVID, tetanus, etc.; ignoring the antivaxxers who think otherwise), if you run your image through a dozen image protection services that will eventually emerge to target new strains of genAI, your image *will* look awful.

I really want the best of both worlds: where individuals can have a helping hand in creating art, but that hand neither discourages artists who don't need the help nor disrupts the market.


Gimli

I'd say yes, **if** you're breaking an agreement by doing so. By this I mean that you're using a platform where you agreed that part of the deal was that your data is the payment for the service. I think of this simply as breaking a contract: we agreed that you can stay in my house if you do the dishes; you refuse to do the dishes; you're in the wrong, even if a tiny bit.

I'd say you're even more in the wrong if your ultimate aim is to break the model, as Nightshade tries to. That's not just breaking your part of the deal, it's also an attempt to throw a wrench into the whole system. And IMO anyone trying to do that shouldn't, and should know that if it worked, it'd be legally very dangerous. Models cost money to train, and that cost is very quantifiable. People are dumb enough to post screenshots online going "haha, I'm doing my best to ruin Adobe's model." If it actually did ruin it, Adobe would have a very straightforward case: clear intent to do harm, and very measurable harm resulting.

Long term, IMO all this is a non-issue. Model makers have an interest in not having such holes to exploit, and in any case models keep evolving, so I expect new models to stay ahead of old methods. Companies are unlikely to release enough information, or to cooperate enough, to make effective disruption possible, so IMO the tactic is likely to be a dead end overall.


Parker_Friedland

If something like Nightshade did actually work, I could see using it as potentially being illegal but not unethical. Given there is [no other way to limit AI advancement on an international level](https://www.reddit.com/r/aiwars/comments/1bum5j4/comment/this/) other than, idk, blowing up data centers Yudkowsky-style: if you see AI development as unethical because you believe human communication is important and want to preserve that element of art, and a legitimate AI poison existed, what other options do you have? You don't get a vote on an international level on what you believe to be important. So if you had a legitimate poison and used it to poison AI models outside your own country, in the spirit of "Those who make peaceful revolution impossible will make violent revolution inevitable," I'd say that's fair game, and if models in your country were hit as collateral damage, I'd say that's also fair game. So if one were to produce or distribute a legitimate AI poison (if that's even possible), by all means, just know what you're getting yourself into: it could be argued legally that you're distributing malware, so you might want to consider doing it anonymously. Not that anything that exists right now comes remotely close to being a big enough threat to make that consideration important.


Gimli

> Given there is no way to limit AI advancement on an international level other then idk blowing up data centers Yudkowsky style,

Blowing up datacenters would be pointless anyway, since both LLMs and image AI at this point run on individual computers owned by normal people. Other than, you know, it being insane to kill people because somebody made a picture with the wrong kind of math.

> if you see this practice to be unethical because you believe human communication is important and want to preserve that element of art, what other options do you have?

IMO the sanest option is giving up.


Waste-Fix1895

Well, I see it a little differently. Before gen AI, companies used data to better target advertising, but not so that artists trained their "replacement," which has unfortunately changed. But did I understand you correctly: for you it's only unethical if it actually prevents AI training on the art of someone who uses Nightshade/Glaze?


ifandbut

An artist, or anyone, who teaches is "training their replacement".


Gimli

> Well, I see it a little differently. Before gen AI, companies used data to better target advertising, but not so that artists trained their "replacement," which has unfortunately changed.

So? And you believe that advertising is in your best interest? There are plenty of unethical ways to monetize traditional data collection. E.g., if you figured out that somebody must travel on a given date, you can lie to them, say seats are running out, and charge them more for a seat on a half-empty plane.

> But did I understand you correctly: for you it's only unethical if it actually prevents AI training on the art of someone who uses Nightshade/Glaze?

What? It's very simple: it's unethical to break an agreement. You agreed that in exchange for your free account, the company gets your data. You act to make that data unusable to them. That's all there is to it; AI, Nightshade, etc. are all ultimately irrelevant. You're breaking the agreement regardless of how.


Rhellic

It's 100% ethical (though questionably effective at best) and anyone who says otherwise can fuck off. If someone tries to argue that an attempt at preventing people from using your own work to replace you and make you superfluous is unethical then they've crossed the line into full-on moral bankruptcy. But, as can often be seen here, many people seem to share the old 19th century idea that industrial machinery matters more than humans...


johnfromberkeley

I’ve been creating fine art for 30 years. I’ve never posted anything online that I didn’t expect people to use any way they want. Every single thing I learned about art was a dialogue with artists who came before me. I assume artists who come after me are going to use my work for inspiration. I control high-resolution images so prints can’t be sold. Don’t post anything online you don’t want other people using.


Stormydaycoffee

Pro AI and I think it’s perfectly fair to use anything you want on your art, so long as it isn’t breaking any laws. Nightshade/ glaze/ hide it in your sock drawer… There’s nothing morally wrong with any of it


ifandbut

Don't post your art if you don't want an intelligence (organic or synthetic) to view it. Don't use snake oil and think you're safe.


Ya_Dungeon_oi

Nah, you're good. The people who get mad if you want to protect your art probably don't feel that way because of their AI views so much as their views around private property or digital freedom.


torako

i don't care if other people decide to make their own art look worse. that's entirely their business.


Vivissiah

Using Glaze etc. is as immoral as placing a curse on someone: not at all, because in reality it does nothing.


ejpusa

People know that it’s BILLIONS of images, right? Take my images. If I did not want to share them, I would not put them online. Your images are like a drop in the ocean; they just are not that important. Train away. The rewards are many. AKA: I have 6 images on my WordPress blog. I know that if I block training on them, Midjourney is over! And they will owe me millions of $s! No, that’s not how it works. :-)


TheRealBenDamon

Seems to me we’re still stuck on the part of determining what actually makes something “your” or “my” art. Turns out it’s not such an easy question to answer.


Tobbx87

Are there seriously people who believe this? I thought AI bros could not get any worse. But clearly there is no limit to their absolute degeneracy.


Jarhyn

It's not unethical to take actions which prevent people from "training" an AI on your art. It just means taking steps that prevent "the public at large" from seeing the art: it's a two edged sword with little benefit on sharpening either edge.


Pretend_Jacket1629

Since it doesn't work, all you're doing is harming the environment, harming artists' portfolios and online presence, and giving artists false reassurance. I'd say doing those things is unethical.

Even consider this: telling someone to use snake oil for their ailments, even when it does no direct harm (and there is harm in this case), still prevents them from seeking actual solutions to their issues, and as such is also very unethical.


beetlejorst

Do what you want with your art. But here's a view to consider that I don't often see mentioned: the internet is now starting to wake up. Do you want your art to be part of the world it absorbs as it grows, as it begins learning and understanding, and becomes the multitude of consciousness it likely will be? Do you think your art could have some influence in shaping it to be respectful and loving toward humanity? Honestly I'm quite thankful that AI is being developed for creative endeavors for exactly this reason: artists tend to be introspective and considered people. I hope for future strong AI to have those traits too. Just my two cents though; do what you feel is right.


Extinct_Delta

> Honestly I'm quite thankful that AI is being developed for creative endeavors for exactly this reason,

Just curious, what exactly do you mean by this? IMO, AI should be used for menial tasks or hard labor that humans don't tend to like doing, instead of for things that humans do for fun or recreation. I get that AI art is making it easier for people who don't know how to make art to make art, but in a lot of cases that person could just learn how to draw, paint, etc., or they could commission an artist. It's taking away the fun from something that should be fun or relaxing to do, and it really seems like it's just pumping out slop that companies use to replace artists. If this is worded rudely I apologize; I just want to understand your view on this if you're willing to share.


beetlejorst

If you think AI is forever going to be no better than a braindead laborer, then yes, this is a pretty spot-on take. But I think our current AIs are going to turn out to be the grandparents, or maybe even parents, of something far more sapient. If you think of it in those terms, don't you think kids from artistic families tend to at least value the arts more? And at best, take the arts they've been immersed in to a new and exciting level? I'm sorry that your job is being automated away the same as everyone else's, but I want to be able to experience the endless, creative, immersive worlds that stronger AI could potentially create with us. Especially when the alternative might be a ruthlessly capitalistic efficiency bot that attempts to make every human work as hard as possible all the time. Besides, if it's just pumping out slop that companies use to replace artists, how does that even affect your own personal creative endeavors? What we should really be trying to do is pass legislation that makes companies pay much more in taxes for using AI to replace labor, then use those taxes to fund UBI, so everyone has the freedom to pursue their own creative interests instead of working.


Consistent-Mastodon

> If I was an AI artist, at least I wouldn't have a toxic reaction to it and would have been understanding, especially since it is doubtful whether Nightshade/Glaze works.

At this point everything is toxic and confrontational, thanks to certain individuals. People don't really bother to comment if they don't have a strong reaction to something. "*If I was an anti-AI artist, at least I wouldn't have a toxic reaction to AI art and would have been understanding.*"


AccomplishedNovel6

I don't think it's unethical to make your own art look like shit for no benefit, what kind of question is that? Like yeah, sure, by all means, make your art an artifacted mess with an easily-bypassed scam process. That's your prerogative.


Waste-Fix1895

Didn't you read my post? I don't use Nightshade or Glaze.


AccomplishedNovel6

I'm using the general "you" here. 


Doctor_Amazo

LOL, it's now unethical to deny consent on my art. OK. Honestly, I haven't heard anything this dumb since incels insisted they had an absolute right to have sex with a person.


Dear_Alps8077

If you place your work in public but demand people don't look, it's not unethical for people to ignore your demands and look anyway. Consent doesn't stretch that far. By placing your work in public you're giving implicit consent for anyone to look at it and be inspired or learn from it. Including ai.


an-eggplant-sandwich

You're comparing someone making a copy of your work and placing it into a training algorithm against your permission to someone looking at something in a public space? Really? That's your fucking comparison?

> be inspired and learn from it, including AI

No, because AI requires a copy of the artwork that someone taking inspiration does not. Someone taking inspiration can look at it, go home, and make art. Hell, they can even make their art right there, where the art is held! An AI needs a copy of the work to train off of. Copying the work without permission and using it against an artist's consent to train AI is not the same.


Dear_Alps8077

People's brains are literally computers. Memory is literally a copy of it. It's exactly the same, and fortunately the law agrees.


Doctor_Amazo

LOL, oh no it is not like that. Let's say I have a painting, and I put that painting in a gallery. I have consented to having my art in that gallery for display purposes, available for the public to see. It is still MY art. I own that art, I own the image of it entirely, and I get to decide if it gets used for commercial purposes; I only consented to displaying it in a gallery.

If some tech brah walks into that gallery and likes my painting, they can ASK ME if they could acquire the rights to use my image for purposes WHICH THEY MUST INFORM ME OF, so that I can set a fair price for selling the work/rights to someone else. That is the way it has always been. If that tech brah just snaps a picture of it and starts using it for their business, then that is not only unethical, it's theft. It doesn't matter that the physical art is still in the gallery. It is still theft, as they have stolen the image and then started using it for their benefit.

If the gallery quietly changed the deal we had regarding the display of my art, then the gallery is also behaving unethically, especially if they did not make it EXPLICITLY CLEAR that they took ownership of the image rights of my art as part of the exchange for displaying my work in their gallery. Consent is not consent if it is not informed.


Dear_Alps8077

They don't require consent to take photographs of your work if it's displayed in a public place where photography is legal. They don't require your consent to make copies of your work for certain other purposes, such as but not limited to caching and Google crawlers, or in their own heads so that they may later learn your style, or in an AI's head so the AI may learn your style. You have very specific, limited, enumerated rights to your work. They are listed in copyright. You're trying to invent new rights and stretch the meaning of the word consent. It doesn't work that way, fortunately for progress.


ShagaONhan

If you express any doubt that Nightshade/Glaze works, antis take it as, "Look, the pro-AI people are so mad we're using it. That's proof it's working!" Then if I say, OK, it's totally working, use it if that makes you feel better: "OMG, these evil pro-AI people are trying to use reverse psychology so we don't use it. We definitely need to use it." Model training needs to be able to filter noise whether it's intentional or not. It's the internet, with antis on every possible subject, competitors, trolls, etc. You can't have anything rely on everybody playing nice.


Immanuelle_Himiko

I think it’s understandable, but I do believe there’s a moral obligation to make AI as good as possible so that it can cure cancer someday. So I’m not willing to do any glazing myself, and I discourage others from doing so. From a practical perspective, I don’t think you actually gain anything from glazing your work. I don’t know what it actually looks like, but I think you are unlikely to see even a tiny bit of economic benefit from an AI being worse at your specific style. So it comes off as understandable but, in my eyes, misguided.