the darling Glaze “anti-AI” watermarking system is a grift that stole code/violated the GPL license (which the creator admits to). It uses the exact same technology as Stable Diffusion. It’s not going to protect you from LoRAs (smaller models that imitate a certain style, character, or concept).
An invisible watermark is never going to work. “De-glazing” training images is as easy as running them through a denoising upscaler. If someone really wanted to make a LoRA of your art, Glaze and Nightshade are not going to stop them.
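To make the fragility concrete, here’s a toy sketch of why any smoothing pass damages an invisible perturbation. This is a plain box blur over a made-up 5×5 grayscale grid, not Glaze’s actual perturbation or a real de-glazing pipeline (those use learned denoisers/upscalers); it just shows that averaging neighboring pixels washes out a small, localized pixel bump:

```python
def box_blur(img):
    """Average each pixel with its 3x3 neighborhood (edges clamped).

    Toy stand-in for a denoising pass: smoothing spreads out and
    shrinks tiny pixel-level perturbations, which is why invisible
    watermarks are fragile against it.
    """
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total = count = 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += img[ny][nx]
                        count += 1
            out[y][x] = total // count
    return out

# Flat gray image with one "adversarial" perturbed pixel.
img = [[100] * 5 for _ in range(5)]
img[2][2] = 140  # a small, invisible-ish +40 bump
smoothed = box_blur(img)
print(smoothed[2][2])  # prints 104: the bump is mostly averaged away
```

One blur pass takes the +40 perturbation down to +4; a real denoiser tuned for image statistics is even more effective at erasing adversarial noise while keeping the visible artwork intact.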
If you really want to protect your art from being used as positive training data, use a proper, obnoxious watermark, with your username/website, with “do not use” plastered everywhere. Then, at the very least, it’ll be used as a negative training image instead (telling the model “don’t imitate this”).
There is never a guarantee your art hasn’t been scraped and used to train a model. Training sets aren’t commonly public. Once you share your art online, you don’t know every person who has seen it, saved it, or drawn inspiration from it. Similarly, you can’t name every influence and inspiration that has affected your art.
I suggest that anti-AI art people get used to the fact that sharing art means letting go of the fear of being copied. Nothing is truly original. Artists have always copied each other, and now programmers copy artists.
Capitalists, meanwhile, are excited that they can pay less for “less labor”. Automation and technology are an excuse to undermine and cheapen human labor: if you work in the entertainment industry, it’s adopt AI and quicken your workflow, or lose your job for being less productive. This is not a new phenomenon.
You should be mad at management. You should unionize and demand that your labor is compensated fairly.
So that first thing isn’t what you think it is. Anyone can use GPL’ed code for whatever they want. The catch is that they have to distribute the modifications they made when requested (and only if they distribute the program).
I can sit here and use GPL’ed code all day; as long as I don’t distribute the compiled code (or, if I do release the object code, as long as I give people access to my source code modifications), I’m in the clear.
Even then, if you don’t do that, it’s not stealing. It’s failure to comply with the licensing terms, so some sort of copyright infringement. But also, literally in the tweet you linked:
We are releasing the source code for Glaze front end, and also working on a rewrite of the frontend.
Assuming they did that in the 29 days after they said they would, it squares them with Section 8 of the GPL:
Moreover, your license from a particular copyright holder is reinstated permanently if the copyright holder notifies you of the violation by some reasonable means, this is the first time you have received notice of violation of this License (for any work) from that copyright holder, and you cure the violation prior to 30 days after your receipt of the notice.
We have done a clean from scratch rewrite of the glaze front end, and also made a detailed pass through all remaining code to ensure there was no code from any GPL projects.
The “don’t bother with Glaze” shit sure started up fast, huh
Gosh wow, this argument isn’t even original; it comes up literally every single time an artist in any format complains about their shit being stolen, every single goddamn time, and that still hasn’t ever made it acceptable. That’s why laws get made about it.
Defend your work, do not trust people who tell you there’s no reason to bother, that should make you angrier and more willing to double down and sue a motherfucker.
“get mad at management” that’s what we’re doing you fucking imbecile
it’s been raining for like four days and i have to bike everywhere i just wanna take a lyft to get groceries and also i have bills and my cat needs his special $70 prescription food for his peepee
Hi, Tumblr. It’s Tumblr. We’re working on some things that we want to share with you.
AI companies are acquiring content across the internet for a variety of purposes in all sorts of ways. There are currently very few regulations giving individuals control over how their content is used by AI platforms. Proposed regulations around the world, like the European Union’s AI Act, would give individuals more control over whether and how their content is utilized by this emerging technology. We support this right regardless of geographic location, so we’re releasing a toggle to opt out of sharing content from your public blogs with third parties, including AI platforms that use this content for model training. We’re also working with partners to ensure you have as much control as possible regarding what content is used.
Here are the important details:
We already discourage AI crawlers from gathering content from Tumblr and will continue to do so, save for those with which we partner.
We want to represent all of you on Tumblr and ensure that protections are in place for how your content is used. We are committed to making sure our partners respect those decisions.
To opt out of sharing your public blogs’ content with third parties, visit each of your public blogs’ blog settings via the web interface and toggle on the “Prevent third-party sharing” option.
For instructions on how to opt out using the latest version of the app, please visit this Help Center doc.
Please note: If you’ve already chosen to discourage search crawling of your blog in your settings, we’ve automatically enabled the “Prevent third-party sharing” option.
If you have concerns, please read through the Help Center doc linked above and contact us via Support if you still have questions.
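For context on what “discouraging AI crawlers” usually means mechanically: sites typically publish rules in a robots.txt file that well-behaved crawlers check before scraping. A hedged sketch of what such rules can look like, using GPTBot (OpenAI) and CCBot (Common Crawl), two real AI crawler user agents, as examples; this is an illustration, not Tumblr’s actual file:

```
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /
```

Note that robots.txt is purely advisory: compliant crawlers honor it, but nothing technically stops a scraper that ignores it, which is why per-blog opt-out settings and partner agreements exist alongside it.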
bark bark bark
Hey so @staff has this data been sent off already?
And if so, to what address do I send the invoice so you can compensate me for all the data I’ve contributed?