Since the where-the-hell-did-that-come-from boom of tools like ChatGPT and Dall-E, creators around the world have been wondering how to protect art from AI.
And it’s a good question.
For people like you and me – designers, illustrators, writers – AI is an existential threat. And when that threat is backed by billions in VC funding, it’s time to knowledge up.
This year, the generative AI market is expected to be worth $36 billion, rising to an eye-watering $356 billion by 2030. But, so far, that growth seems to be coming at the expense of creators.
The crazy thing is that this is no secret. Generative AI tools are being designed to replicate – and eventually replace – the work of humans, something OpenAI’s CEO, Sam Altman, has acknowledged himself.
What’s even more maddening is that human art (maybe yours, maybe mine) has been used to train these AI tools – largely without consent. And this is something 94% of artists feel they should be compensated for.
Later in this article, I’ll get into some practical tips on how to protect your art from AI. And I’ll also share some major lawsuits against AI firms that are still in the works. But first, let’s talk about the three C’s of copyright and why they’re central to this issue.
The three C’s of copyright: consent, compensation, and credit
When people use generative AI, they enter reference points in their prompts. Sometimes this will be art styles like watercolor or minimalism. But, in other cases, people will explicitly name a film, animation studio, or even a specific person.
That’s exactly what happened to Tennessee-based artist Kelly McKernan. According to The New Yorker, McKernan’s name kept appearing in prompts on the Discord server associated with Midjourney.
It was used over 12,000 times.
It’s one thing to see work out there that feels similar to yours. But when you see your name being used so flagrantly to rip off your art, it’s sure to hurt. So McKernan joined two other artists in a class action lawsuit against Midjourney, Stability AI (maker of Stable Diffusion), and DeviantArt (maker of DreamUp).
Their case (which took a recent step forward) comes down to the three C’s:
- Consent – they didn’t give permission for their work to be used
- Compensation – they weren’t compensated for the use of their work
- Credit – they weren’t credited for all the images mimicking their style
Matthew Butterick, one of the attorneys on the case, has made exactly this argument.
I’ll tell you about a few more interesting lawsuits later. Now let’s get into how you can protect your art from AI.
Four ways to protect your art from AI
One thing you often hear about generative AI is that the horse has bolted. In other words, the damage is done and now we just have to live with the consequences.
Well, I’m not so sure that’s true.
AI firms are getting sued from every angle. Regulations seem inevitable at this point, but these things take time. So, between now and then, we need to protect ourselves and our art in any way we can.
Here are my four tips to help you do just that:
- Disallow GPTBot so your work can’t be used to train AI models
- Add a disclaimer to your website or publication
- Use Nightshade to protect images before publishing them
- Report AI art that infringes on your unique style
Let’s jump in.
1. Disallow GPTBot so your work can’t be used to train AI models
If you publish art on your own website, then the first thing you should do is block GPTBot.
This is a totally legit way to prevent OpenAI’s crawlers from scanning your content and using your art to train their AI models. They’ve even published guidance on their crawlers and how to block them.
All you have to do is add this text to your robots.txt file:
User-agent: GPTBot
Disallow: /
The only thing to bear in mind here is that OpenAI’s models also power Bing’s AI search features. So while blocking the bot won’t harm your SEO performance, it may harm your ability to appear in AI-generated results.
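GPTBot isn’t the only crawler you can turn away, either. Several other AI vendors publish their own user agents – Anthropic’s ClaudeBot, Common Crawl’s CCBot (the crawler behind a dataset many models are trained on), and Google’s Google-Extended token for its AI products, for example. Compliance is voluntary and this list is far from exhaustive, but as a rough sketch, a robots.txt covering the best-known bots looks something like this:

User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /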
2. Add a disclaimer to your website or publication
It’s one thing to block OpenAI from crawling your website. But what about all the other AI bots and crawlers out there?
You can’t actively block every AI bot from scanning your content, but a written disclaimer is a solid fallback for any artist who publishes work online or in print.
All you have to do is add a short disclaimer to your work stating that it should not be used for AI training. Here’s some sample text from the Authors Guild:
“NO AI TRAINING: Without in any way limiting the author’s [and publisher’s] exclusive rights under copyright, any use of this publication to “train” generative artificial intelligence (AI) technologies to generate text is expressly prohibited. The author reserves all rights to license uses of this work for generative AI training and development of machine learning language models.”
This could be extremely important if you ever choose to take legal action against an AI artist or an AI firm. You’ll have written proof that you didn’t give consent for your work to be used.
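If you want a machine-readable version of that disclaimer, there are also the non-standard “noai” and “noimageai” directives. These are an opt-out signal rather than a lock: adoption is voluntary and patchy, so some scraping tools check for these values while many ignore them entirely. Assuming your site lets you edit page markup or HTTP responses, the signal looks something like this as a meta tag:

<meta name="robots" content="noai, noimageai">

or as an HTTP response header:

X-Robots-Tag: noai, noimageai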
3. Use Nightshade to protect images before publishing them
You want to publish your art, but you don’t want AI tools to be able to copy it.
Enter Nightshade.
Nightshade is an image-cloaking tool created by a team at the University of Chicago. It works by applying filters to your images that are virtually invisible to the naked eye. These filters trick AI models into thinking the image is something else: like a house instead of a tree, or a shark instead of a hat.
This technique is known as data poisoning, and the idea is that, over time, generative AI tools become less effective because they’re being trained on false data.
A few other tools are worth exploring here too – Glaze, built by the same University of Chicago team, is the best known, and focuses on stopping AI models from mimicking your personal style.
4. Report AI art that infringes on your unique style
Last but not least, don’t be afraid to report an AI artist for ripping off your style.
If, for example, you see something on Etsy that’s extremely similar to your own work, you can file an intellectual property infringement report.
This Reddit thread tells the story of exactly that. Here’s the original post from u/Emergency_Sail8059:
Selling AI art and received an IP infringement report
I’ve submitted a counter notice, and they said they don’t accept non-copyright claims.
I’ve researched that, but I didn’t find much about it.
Is there anyone with similar experience who can share?
Thanks in advance and happy new year.
Well, it’s safe to say that the Reddit community wasn’t entirely impressed. Here’s the most upvoted response from u/lostterrace:
I mean… it’s quite likely that any particular AI generated art may well infringe on someone else’s copyright.
Lots of artists had their art fed to AI art generators without their consent.
So, if you get a copyright claim from a particular listing… you should let it go. Enough of those claims and your shop will be banned.
Also… keep in mind that the AI art you have generated is not protected by copyright itself. You cannot assert that it is your intellectual property because legally, it is not. AI art cannot be protected by copyright.
So, if you’re confident that someone is infringing on your copyrighted work, report them.
This is an easy way to rumble people making money from AI tools that were trained using your work. And you’ll likely scare them into taking the listing down!
Five lawsuits against AI firms that are still in progress
It’s fair to say that AI has pissed off a lot of creators. But fighting the case is a battle of resources, and the chances of one or two artists going up against a tech giant and winning are slim.
Business vs. business, on the other hand … Now that’s a much fairer fight.
Here are five major lawsuits against AI firms that, at the time of writing, are still in progress.
1. The New York Times vs. OpenAI and Microsoft
In December 2023, The New York Times filed a lawsuit against OpenAI and Microsoft for using its copyrighted work.
The Times says it never gave consent for its articles to be used to train GPT large language models (LLMs). In other words: copyright infringement. Now it’s seeking damages and the destruction of any models trained on its work.
As of November 2024, the case is ongoing. The Times has identified millions of its articles in OpenAI’s training data during discovery, and it has asked a federal judge to order OpenAI to admit to every newspaper article it used to train its LLMs.
2. Authors Guild vs. OpenAI
Zadie Smith, Stephen King, Elena Ferrante – these are just a few of the thousands of authors whose books have been used to train AI tools without their consent.
So in September 2023, a group of writers joined the Authors Guild in a class action lawsuit against OpenAI, later adding Microsoft as another defendant. The case is still in its infancy, but that hasn’t stopped the authors involved from coming out to say their piece.
3. Getty Images vs. Stability AI
Getty Images is a high-quality stock image library for marketers and journalists. It makes money by licensing photographers’ work, and the photographers in turn earn commissions on those sales.
In January 2023, Getty started legal proceedings against Stability AI, alleging that the company copied 12 million images to train its AI models without permission or compensation.
Since then, Getty has launched a separate lawsuit in the UK. And despite Stability AI trying to get the case thrown out, a UK court has ruled that it can move to trial.
4. Universal Music, ABKCO, and Concord vs. Anthropic
In October 2023, a group of music publishers sued Anthropic for using copyrighted song lyrics to train its AI chatbot, Claude. (Anthropic, for its part, is backed by billions in funding from Google and Amazon.)
The latest news at the time of writing is that Anthropic has asked a US federal court to narrow the case down, but the publishers’ lead attorney has refused to budge.
5. Alden Global Capital newspapers vs. OpenAI and Microsoft
In April 2024, a group of newspapers owned by Alden Global Capital sued OpenAI and Microsoft for copyright infringement, unfair competition, and tarnishing the newspapers’ reputations.
The papers involved include The New York Daily News, The Chicago Tribune, and The Orlando Sentinel. And once again, the claim comes down to using the publishers’ work to train AI models without their consent or compensation.
Watch this space.
Final thoughts
I hope you’ve learned as much from reading this article as I learned writing it.
For me, the biggest takeaway is that there’s a serious legal case to be had against generative AI firms. But it’s going to take a few years for that stuff to play out.
So, in the meantime, you have to do what you can to protect yourself, your art, and your career. Remember, we’re all in this together!
Featured image by Rock’n Roll Monkey