Researchers launch software that “poisons” images

2024-01-22 15:53:23

Artists have a new weapon to resist generative AI systems that scrape their works for training without permission. The Nightshade development team has announced that version 1.0 of the software is now available for free download on the University of Chicago website. The tool, for Windows and Mac, can “poison” images before they are uploaded to the web, making them counterproductive as training data for image-generating AI. The software is “designed as an offensive tool to distort the representations of an image that can be made by generative AI models” such as DALL-E, Midjourney, or Stable Diffusion, the researchers explain.

Concretely, by making imperceptible pixel modifications, “Nightshade transforms images into ‘poisoned’ samples, so that models that train on these images without consent will learn unpredictable behaviors that deviate from expected norms,” adds the development team, led by Professor Ben Zhao. “For example, a prompt that asks for an image of a cow flying in space might instead get an image of a handbag floating in space,” the researchers explain.
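Nightshade’s actual perturbations are computed adversarially to shift a model’s learned concept associations; the details are not in this article. As a minimal illustrative sketch of the underlying idea, the snippet below (an assumption, not Nightshade’s algorithm) simply adds a small, bounded random perturbation to an 8-bit image so that it looks unchanged to a human viewer while its pixel values differ:

```python
import numpy as np

def perturb_image(img: np.ndarray, epsilon: float = 2.0, seed: int = 0) -> np.ndarray:
    """Add a small, visually imperceptible perturbation to an 8-bit image.

    Illustrative only: Nightshade computes *targeted* perturbations designed
    to mislead model training; here we merely bound random noise to at most
    +/- epsilon per channel to show the "imperceptible change" idea.
    """
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-epsilon, epsilon, size=img.shape)
    # Keep values in the valid 8-bit range after adding the noise.
    poisoned = np.clip(img.astype(np.float64) + noise, 0, 255)
    return poisoned.astype(np.uint8)

# A flat gray test image: every pixel stays within epsilon of the original.
img = np.full((4, 4, 3), 128, dtype=np.uint8)
poisoned = perturb_image(img)
max_diff = np.abs(poisoned.astype(int) - img.astype(int)).max()
print(max_diff <= 2)  # True
```

A real poisoning attack would optimize the perturbation against a target model rather than drawing it at random, which is what makes the changes effective despite being imperceptible.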

The team had previously developed similar software to resist image-generating AI. Called Glaze, it adds an imperceptible layer to images that alters the artistic style of a work in the eyes of an AI model. An integration of the two tools, offering artists’ works increased protection, is under study, their developers say.

