Nightshade, first announced months ago as a tool to help artists protect their copyright by “poisoning” AI models that train on their work, is now available for artists to download. The tool was developed at the University of Chicago by a team led by Professor Ben Zhao, as part of the Glaze Project.

Under the hood, Nightshade uses PyTorch to identify what is in an image, then applies a label that subtly alters the image at the pixel level, so that an AI model training on the image perceives something completely different from the original. The processed image does not look much different to human eyes, but to a model it reads as an entirely different object, which leads to unexpected behavior in models trained on it. For example, as the project website puts it, “a prompt that asks for an image of a cow flying in space might instead get an image of a handbag floating in space.” (A toy illustration of this idea appears at the end of the post.)

Nightshade currently runs only on Macs with Apple silicon chips and on PCs with Nvidia GPUs. Demand has been overwhelming, and processing can take a long time, in some cases up to eight hours per image.

While Nightshade is designed to protect the copyright of artists’ work, some internet users complain that it amounts to a cyberattack on AI models and the companies behind them. What are your thoughts? Leave your opinion in the comment section.
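
For readers curious what pixel-level “poisoning” can look like in practice, here is a minimal, hypothetical PyTorch sketch of an imperceptible adversarial perturbation. To be clear, this is not Nightshade’s actual algorithm, which optimizes against the feature extractors of text-to-image models (see the paper preprint below); this toy example takes a single FGSM-style step against an off-the-shelf image classifier, and the file name cow.jpg and the target class are placeholders.

```python
# A minimal, hypothetical sketch of an imperceptible adversarial
# perturbation (one FGSM-style step), assuming a local image file
# "cow.jpg". This is NOT Nightshade's actual algorithm -- Nightshade
# optimizes against text-to-image feature extractors -- but it shows
# the core idea: pixel changes too small for humans to notice that
# push a model toward a different label.
import torch
import torch.nn.functional as F
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()
norm = T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])
prep = T.Compose([T.Resize(256), T.CenterCrop(224), T.ToTensor()])

img = prep(Image.open("cow.jpg").convert("RGB")).unsqueeze(0)
img.requires_grad_(True)

target = torch.tensor([414])  # hypothetical target: ImageNet class 414 ("backpack")

# Gradient of the loss for the *wrong* label, with respect to the pixels.
loss = F.cross_entropy(model(norm(img)), target)
loss.backward()

# Step each pixel slightly toward the wrong label (budget ~2/255 per channel).
eps = 2.0 / 255.0
poisoned = (img - eps * img.grad.sign()).clamp(0.0, 1.0).detach()

# The two images look the same to a person, but the model's top
# prediction may already have shifted.
with torch.no_grad():
    print(model(norm(img)).argmax(1).item(),
          model(norm(poisoned)).argmax(1).item())
```

The connection to poisoning: if many such subtly shifted images enter a training set still labeled “cow,” the model’s learned concept of “cow” drifts toward something else, which is the kind of unexpected behavior the Nightshade team describes.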


Nightshade example images, from the Nightshade paper preprint
Resources:
More discussion of Nightshade: https://venturebeat.com/ai/nightshade-the-free-tool-that-poisons-ai-models-is-now-available-for-artists-to-use/
Nightshade website: https://nightshade.cs.uchicago.edu/whatis.html
Nightshade paper preprint: https://arxiv.org/pdf/2310.13828.pdf
