A few typed words are enough for an AI image generator to produce an image of a nymph “in the style of Karla Ortiz”: an appropriation for which the artists concerned gave no consent, received no credit and no compensation — the three “Cs” at the heart of their battle.
In January, artists collectively filed a complaint against Midjourney, Stable Diffusion and DreamUp, three AI models trained on billions of images collected from the internet.
One of the lead plaintiffs, Sarah Andersen, felt “deeply aggrieved” when she saw a drawing generated from her name, in the style of her comic “Fangs”.
Her indignant reaction on Twitter was widely shared, and other artists then reached out to her. “We hope to set a legal precedent and force companies specializing in AI to respect the rules,” she says.
In particular, artists want the right to accept or refuse the use of their works by a model upfront, rather than having to request their removal after the fact, when that is even possible.
Under those conditions, one could imagine a “licensing system, but only if the fees are enough to live on”, notes Karla Ortiz, another plaintiff.
“Easy and cheap”
There is no question of “receiving pennies while the company pockets millions”, insists the illustrator, who has worked for Marvel Studios, among others.
On social media, artists describe losing a large share of their commissions.
“Art is dead, man. It’s over. AI won. Humans lost,” Jason Allen told The New York Times in September 2022, after submitting an image generated by Midjourney to a competition, which he won.
The Mauritshuis Museum in The Hague is currently exhibiting an AI-generated image as part of a competition for works inspired by Vermeer’s “Girl with a Pearl Earring”.
The San Francisco Ballet, for its part, stirred debate by using Midjourney for its December campaign promoting “The Nutcracker”.
“It’s easy and cheap, so even institutions don’t hesitate, even if it’s not ethical,” says Sarah Andersen indignantly.
The accused companies did not respond to AFP’s requests for comment, but Emad Mostaque, the head of Stability AI (maker of Stable Diffusion), likes to compare these programs to simple tools, like Photoshop.
They will allow “millions of people to become artists” and “create tons of new creative jobs”, he has said, arguing that “unethical” use, or use “to do illegal things”, is the users’ “problem”, not the technology’s.
Apocalypse of creation
Companies will invoke the legal doctrine of “fair use”, a kind of exception to copyright, explains lawyer and developer Matthew Butterick.
“The magic word is ‘transformation’. Does their system offer something new? Or does it replace the original on the market?” the consultant explains.
With the law firm Joseph Saveri, he represents artists, but also engineers in another complaint, against a Microsoft program that generates computer code.
Pending a distant trial with an uncertain outcome, the mobilization is also taking shape on the technical front.
Called to the rescue by artists, a lab at the University of Chicago released a tool last week that lets them publish their works online while protecting them against AI models.
Called “Glaze”, the program adds a layer of data to the image, invisible to the naked eye, which “muddies the trail”, explains Shawn Shan, the student leading the project.
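Glaze’s actual technique relies on carefully computed adversarial perturbations, and its details are not given in the article. Purely as a toy illustration of the idea of an “invisible layer of data”, here is a hypothetical Python sketch (the function name and parameters are invented for this example) that nudges 8-bit pixel values by an amount too small for the eye to notice:

```python
import random

def add_invisible_layer(pixels, strength=2, seed=0):
    """Toy illustration only: perturb 8-bit pixel values by at most `strength`.

    Glaze's real method computes targeted adversarial perturbations;
    this random-noise sketch merely conveys the idea of a data layer
    that is imperceptible to a human viewer.
    """
    rng = random.Random(seed)
    # Shift each value by a small random amount, clamped to the valid 0-255 range.
    return [min(255, max(0, p + rng.randint(-strength, strength)))
            for p in pixels]

# Changes of +/- 2 out of 255 are invisible to the naked eye.
original = [128, 64, 200, 255, 0]
cloaked = add_invisible_layer(original)
```

Random noise like this would be trivial for a model to ignore; the point of the sketch is only that two images can look identical to a person while differing at the pixel level.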
The initiative is greeted with enthusiasm, but also skepticism.
“The responsibility will fall to artists to adopt these techniques,” laments Matthew Butterick. “And it’s going to be a cat-and-mouse game” between companies and researchers.
He fears that the next generation will become discouraged.
“When science fiction imagines an AI apocalypse, robots arrive with laser guns,” notes the lawyer. “But I think the victory of AI over humanity comes when people give up and stop creating.”