Glaze: The Glaze Project offers a free downloadable tool called Nightshade that allows artists to apply invisible pixel-level changes to their images before posting them online, causing AI image-generation models that train on those images to learn incorrect associations between objects and concepts. It was publicly released in January 2024 and has been downloaded more than 2.5 million times.
Details
Where Glaze is defensive (preventing style mimicry of a specific artist), Nightshade is offensive: it embeds hidden pixel distortions that cause AI models to associate the wrong concepts with images. For example, training images of dogs may register to the model as cats. The Glaze Project has described this as a "data poisoning" approach intended to raise the cost to AI companies of scraping artists' work without consent. Nightshade was developed at the University of Chicago SAND Lab and peer-reviewed at the IEEE Symposium on Security and Privacy in May 2024. The team has stated they plan to integrate Nightshade as an optional feature within Glaze.
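The general idea behind this style of data poisoning can be illustrated with a toy sketch. This is not the Nightshade algorithm, whose perturbations target real text-to-image training pipelines; here a made-up linear "feature extractor" stands in for a model, and a small, bounded perturbation is optimized so the image's features drift toward a different concept's embedding while every pixel change stays below a visibility budget:

```python
import numpy as np

# Toy illustration of data poisoning (NOT the Nightshade algorithm):
# nudge an image with a tiny, bounded perturbation so a stand-in feature
# extractor maps it near the embedding of a *different* concept.
# W, the embeddings, and all dimensions below are made up for illustration.

rng = np.random.default_rng(0)
D, K = 64, 8                                 # flattened image size, feature size
W = rng.normal(size=(K, D)) / np.sqrt(D)     # stand-in "feature extractor"

image = rng.uniform(0.0, 1.0, size=D)        # the artist's image ("dog")
target = rng.normal(size=K)                  # features of the wrong concept ("cat")

eps, lr, steps = 0.03, 0.05, 300             # per-pixel budget, step size, iterations
delta = np.zeros(D)

for _ in range(steps):
    # gradient of ||W(image + delta) - target||^2 with respect to delta
    grad = 2.0 * W.T @ (W @ (image + delta) - target)
    delta -= lr * grad
    delta = np.clip(delta, -eps, eps)        # keep the change imperceptible

poisoned = np.clip(image + delta, 0.0, 1.0)

before = np.linalg.norm(W @ image - target)
after = np.linalg.norm(W @ poisoned - target)
print(f"max pixel change: {np.abs(poisoned - image).max():.3f}")
print(f"distance to wrong concept: {before:.2f} -> {after:.2f}")
```

The key property, which the real tool shares in spirit, is that the perturbation is constrained to be visually negligible (the `eps` clip) while still measurably shifting what a model would learn from the image. Scraping poisoned images in bulk is what raises the cost for model trainers.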