Artists outraged by artificial intelligence that copies in seconds the styles they have sacrificed years to develop are waging battle online and in court.
Fury erupted in the art community last year with the release of generative artificial intelligence (AI) programmes that can convincingly carry out commands such as drawing a dog like cartoonist Sarah Andersen would, or a nymph the way illustrator Karla Ortiz might do.
Such style-swiping AI works are cranked out without the original artist's consent, credit or compensation -- the three C's at the heart of a fight to change all that.
In January, artists including Andersen and Ortiz filed a class-action lawsuit against the companies behind DreamUp, Midjourney and Stable Diffusion, three image-generating AI tools trained on artwork found online.
Andersen told AFP she felt "violated" when she first saw an AI drawing that copied the style of her "Fangs" comic book work.
She fired off an indignant reaction on Twitter; it went viral, and other incensed artists reached out to her with stories of their own.
Backers of the suit hope to establish legal precedent governing generative AI models that copy artists' styles.
Artists want AI creators to be required to secure permission for works used in training software, with an option to remove it.
They also want suitable compensation.
"There is room for a conversation about what that would look like," said Ortiz.
Compensation could take the form of a licensing model, she mused, and would need to be appropriate.
It would be wrong for artists to "get a couple of cents while the company gets millions" of dollars, added Ortiz, whose resume includes working for Marvel Studios.
On social networks, artists are sharing tales of jobs being lost to generative AI.
The suit notes that a video-game designer named Jason Allen last year won a Colorado State Fair competition with art created using Midjourney.
"Art is dead, dude. It's over. AI won. Humans lost," Allen was quoted as telling The New York Times.
The Mauritshuis Museum in the Netherlands sparked controversy by displaying an AI-generated image inspired by Vermeer's "Girl with a Pearl Earring."
The San Francisco Ballet, meanwhile, caused a stir by using Midjourney to generate promotional illustrations for its "Nutcracker" performances in December.
"It's sort of a natural consequence of something being easy and cheap and accessible," Andersen said.
"Of course they are going to use that option, even if it is unethical," she added.
AI companies named in the lawsuit did not respond to requests for comment.
Stability AI founder and chief executive Emad Mostaque has portrayed generative software as a "tool" that can tend to "mundane image output" and provide new ways "of ideating for artists."
Mostaque contends that it will allow more people to become artists.
Critics disagree: prompting software to draw in the style of a master, they say, does not make that person an artist.
Mostaque has said that if people choose to use generative AI unethically or to break the law, "that's their problem."
Companies defending themselves from artists' copyright claims are likely to claim "fair use," an exception sometimes allowed when a new spin is put on a creation or when it is only briefly excerpted.
"The magic word used in the US court system is 'transformative,'" said lawyer and developer Matthew Butterick.
"Is this a new use of the copyrighted work, or does it replace the original in the marketplace?"
Artists are turning not just to the courts but to technology to defend themselves against generative AI.
Prompted by artists, a team at the University of Chicago last week launched their "Glaze" software to help protect original works.
The programme adds a layer of data over images that, while invisible to the human eye, "acts as a decoy" for AI, said Shawn Shan, the doctoral student in charge of the project.
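Glaze's precise cloaking method is not spelled out here, but the general idea -- an image edit too small for the eye to notice yet large enough to change the data an AI model ingests -- can be sketched in a few lines. The Python snippet below is a rough, hypothetical illustration only: the file names and the use of random noise are assumptions for demonstration, not Glaze's actual technique, which computes its perturbation adversarially.

```python
# Toy illustration only -- NOT the Glaze algorithm. It nudges each pixel by a
# tightly bounded random amount, so the change is hard to see but still
# alters the raw values an AI model would train on.
import numpy as np
from PIL import Image

def add_invisible_perturbation(path_in: str, path_out: str, epsilon: int = 3) -> None:
    """Shift every RGB value by at most +/- epsilon (out of 255)."""
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)
    noise = np.random.randint(-epsilon, epsilon + 1, size=img.shape, dtype=np.int16)
    cloaked = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(cloaked).save(path_out)

# Hypothetical file names, for illustration.
add_invisible_perturbation("artwork.png", "artwork_cloaked.png")
```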
That still leaves the onus on artists to adopt Glaze. Butterick predicts a "cat-and-mouse game" as AI makers figure out ways around such defenses.
Butterick also worries about the effect of AI on the human spirit.
"When science fiction imagines the AI apocalypse, it's something like robots coming over the hill with laser guns," he said.
"I think the way AI defeats humanity is more where people just give up and don't want to create new things, and (it) sucks the life out of humanity."