Adobe has created a new tool to help artists, photographers, and creators stop AI companies from using their images to train artificial intelligence. The tool, called Adobe Content Authenticity App, works like a “Do Not Train” sign for digital art. Just like websites use robots.txt files to block unwanted bots, this tool lets artists add hidden tags to their images, saying, “Do not use this for AI training.”
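To see what the robots.txt version of that sign looks like, here is a small Python sketch using the standard library's robot-rules parser. The robots.txt text and the URLs are made up for illustration (GPTBot and CCBot are real AI-related crawler names); the point is that a polite crawler checks the rules before fetching, while nothing stops an impolite one from skipping the check. The same limitation applies to Adobe's tags.

```python
from urllib.robotparser import RobotFileParser

# A sample robots.txt: the site-wide equivalent of a "Do Not Train" sign.
# GPTBot and CCBot are real AI-related crawlers; the site itself is made up.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A polite crawler asks before fetching; a rude one simply skips this step.
print(parser.can_fetch("GPTBot", "https://example.com/art/painting.jpg"))      # False
print(parser.can_fetch("FriendlyBot", "https://example.com/art/painting.jpg"))  # True
```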
How the Adobe Tool Works
The tool lets users add their name and social media links to up to 50 images at once. These details are stored in the image’s metadata and are designed to stay attached even if the image is edited or shared online. Here is what it does:

- Stops AI Training
A checkbox in the app lets artists mark their images as “off-limits” for AI training.
- Verifies Identity With LinkedIn
Artists can connect their LinkedIn profiles to prove they own the image. LinkedIn’s verification system helps stop fake claims.
- Works Across Platforms
A free Chrome extension shows a tiny “CR” badge on images with these tags. This helps people spot protected art on sites like Instagram, which do not support the tags yet.
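The article does not say how a cooperating AI company would actually check these tags, but conceptually it would read an image’s Content Credentials and look for a training opt-out before adding the file to a dataset. The sketch below is one way that check could look, assuming the open-source c2patool CLI (which reads Content Credentials manifests) is installed and that the opt-out shows up as a “training”-related assertion in the manifest JSON. The labels and field names are assumptions based on the underlying C2PA spec, not confirmed details of Adobe’s app.

```python
import json
import subprocess

def training_opt_out(image_path: str) -> bool:
    """Return True if the image's Content Credentials ask not to be used
    for AI training. Assumes the open-source `c2patool` CLI is on PATH and
    prints the manifest store as JSON when given a file path."""
    result = subprocess.run(["c2patool", image_path],
                            capture_output=True, text=True)
    if result.returncode != 0:
        return False  # no manifest found (or tool error): no opt-out signal

    manifest_store = json.loads(result.stdout)
    for manifest in manifest_store.get("manifests", {}).values():
        for assertion in manifest.get("assertions", []):
            label = assertion.get("label", "")
            if "training" in label or "mining" in label:
                # Treat any "notAllowed" entry in the assertion as an opt-out.
                if "notAllowed" in json.dumps(assertion.get("data", {})):
                    return True
    return False

# A compliant scraper would call training_opt_out(path) and skip images
# that return True before adding anything to a training set.
```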
Why Some People Are Skeptical
The big problem is getting AI companies to respect the tags. Many AI crawlers have ignored voluntary rules like robots.txt in the past, and experts worry the same could happen here.
- No Promises From AI Companies
Adobe has not signed deals with big AI firms like OpenAI or Midjourney to honor the tags. Without their agreement, the tags are only a request.
- Hackers Could Remove Tags
While Adobe uses tamper-resistant metadata designed to survive edits and re-uploads, skilled attackers might still find ways to strip the tags.
- Confusing Labels
Last year, Meta (Facebook’s parent company) mistakenly labeled human-edited photos as “Made with AI,” angering photographers. This shows how hard it is to label content correctly.
What This Means for Artists
Small creators and big agencies have asked for tools like this. They want control over how their work is used. For example, a photographer could share nature photos online without worrying that an AI would copy their style.
According to Adobe, the tool is not an art-authentication system. It simply gives creators a way to state that they made the work and to ask that AI systems not use it for training.
Adobe plans to extend the feature to video and audio later, but for now the focus is on images.

Will AI Companies Listen?
Adobe says it is talking to AI developers to persuade them to honor the tags. If those companies do not come on board, artists will face the same problems as before. Meanwhile, governments in Europe are moving toward rules that require more transparency about how AI models are trained, and Adobe’s tool fits that push.
For artists, the tool is still a step forward. It gives them a way to state their wishes while the larger fight over AI and art remains unresolved. As AI’s role in art keeps evolving, tools like Adobe’s will have to keep adapting to protect creative control.