AI Is Sampling Our Work Without Paying: Why Creators Should Be Compensated

For years, creators have lived under a straightforward, if sometimes painful, set of rules. If you distribute copyrighted work without permission, you get shut down. If you borrow from someone else’s creation, you compensate them. Napster learned that lesson the hard way. The Pirate Bay built an entire mythology around defying it and paid the price. Musicians who sampled another artist’s work quickly discovered that inspiration becomes a liability the moment recognizable value is extracted. Revenue sharing was not optional; it was the cost of doing business in a creative economy.
Now compare that to what is happening with large-scale AI systems.
AI companies have trained their models on vast quantities of copyrighted material, including books, journalism, photography, music, and entire websites, without permission, disclosure, or compensation. In my case, OpenAI is settling claims related to books it trained on without authorization, one of which is mine. Yet my books are only one slice of my intellectual property. My website, like millions of others, is copyrighted, original work created at high cost and effort. That content has also been consumed, transformed, and monetized.
The difference is not technical. It is rhetorical.
When Google was a connector, publishers accepted the tradeoff. We invested in SEO, performance, structured data, and quality content because Google sent traffic. Google indexed, summarized, and linked. The relationship was symbiotic. A headline and snippet were the equivalent of a movie trailer or a radio clip. The value exchange was obvious. Publishers got audience. Advertisers got reach. Google got relevance and trust.
That model is breaking.
Zero-click search and AI-generated answers have crossed a line from indexing to replacement. Instead of pointing users to creators, AI systems ingest creator work, remix it, and present it as a finished product. The user never visits the source. The creator never sees attribution, traffic, or compensation. Meanwhile, the AI company monetizes the interaction through subscriptions, enterprise licensing, or ad-adjacent ecosystems.
That is not search. That is sampling at industrial scale.
If a musician sampled an entire catalog, learned the style, structure, and emotional cadence of another artist’s work, and then released songs that eliminated the need to listen to the original, no court would accept the argument that this was fair use. Yet AI companies are effectively doing precisely that, just with prose instead of beats, and at a scale no human artist could ever replicate.
The most surreal part is how this skirts the protections we have relied on for decades. If someone scrapes a single article from my site and republishes it, I can file a DMCA takedown and have it removed. The law recognizes the harm immediately. But when AI agents devour years of my work, absorb it into a model, and regurgitate it in paraphrased form, that same protection suddenly evaporates. The outcome is the same. My work is used. My value is extracted. My compensation is zero.
Creators are told this is progress.
It is not.
It is piracy. It is theft.
What makes this especially reckless is that AI companies are undermining their own future. By cutting creators out of the economic loop, they are cutting off their source of training. Quality content does not appear magically. It exists because writers, researchers, photographers, developers, and publishers can justify the time and expense required to create it. Remove the incentive, and the ecosystem collapses into a swamp of recycled summaries, shallow rewrites, and synthetic noise.
AI trained on AI output is not intelligence. It is entropy.
Models decay when fed their own exhaust. Accuracy degrades. Original insight disappears. Cultural nuance flattens. The very thing that made large language models impressive in the first place, the breadth and depth of human-created work, becomes inaccessible because the humans stop creating.
This is not a theoretical risk. We are already seeing it. Content farms optimized for AI consumption. Articles written to be summarized instead of read. Knowledge loops that feed on themselves. The long-term outcome is not abundance. It is dilution.
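
For readers who want that intuition made concrete, here is a toy sketch in Python. It is a deliberately simplified stand-in, not anything from a real training pipeline: each generation fits a simple distribution to the previous generation's output, keeping only its most probable samples, much as a model favors high-likelihood text over rare, original material.

```python
import numpy as np

rng = np.random.default_rng(0)

# Generation 0: "human" data, wide and varied.
data = rng.normal(loc=0.0, scale=10.0, size=10_000)
mu, sigma = data.mean(), data.std()
print(f"gen  0: spread = {sigma:.2f}")

# Each later generation trains only on the previous generation's
# output, keeping just its highest-likelihood samples (those
# closest to the mean) before refitting.
for gen in range(1, 11):
    samples = rng.normal(mu, sigma, size=2_000)
    kept = samples[np.argsort(np.abs(samples - mu))[:1_000]]
    mu, sigma = kept.mean(), kept.std()
    print(f"gen {gen:2d}: spread = {sigma:.2f}")

# The spread collapses toward zero within a few generations:
# the tails, where the rare and original material lives, vanish
# first, leaving only recycled averages.
```

Real language models are vastly more complex than a fitted bell curve, but the dynamic is the same. When the training data is the model's own output, variety does not accumulate. It drains.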
Creators are not anti-AI. Many of us actively use these tools. We understand their potential. We also understand precedent. Every creative industry that survived technological disruption did so by enforcing compensation, attribution, and consent. Radio paid royalties. Streaming pays fractions of a penny per play. Sampling required licenses. Even search, at its best, respected the boundary between discovery and substitution.
AI giants are trying to leap over that history.
They should not be allowed to.
There should be compensation. Period. Opt-in training. Transparent usage. Revenue sharing tied to measurable value extraction. These are not radical demands. They are the minimum standards that every other creative medium has already accepted.
This is not about nostalgia or fear. It is about survival.
If creators disappear, AI loses its foundation. And when that happens, the same companies now declaring victory over content will discover that they have optimized themselves into irrelevance.
Creators built the library. AI companies do not get to burn it down and sell tickets to the ashes.
