Copyright in the Age of Generative AI, Part III: Structuring Residual Rights in the AI Economy

In past articles, I argued that outdated copyright laws disadvantage digital artists trying to secure intellectual property rights, and that slow-paced litigation over DMCA § 1202(b) delays regulatory clarity for gen AI training datasets. In this article, I discuss ongoing artist-led initiatives to protect intellectual property rights and propose guidelines for emerging copyright policy.

Without a federal policy tailored to gen AI, court outcomes from Andersen v. Stability AI Ltd. are unlikely to reverse the damage that data harvesting has already dealt to artists. Amid legal ambiguity, artists are scrambling to defend their works with new technology, mirroring the plaintiffs’ turn toward AI expert witnesses in Andersen v. Stability AI Ltd. and Doe v. GitHub, Inc. Artists have taken to online forums to educate one another about the risks of data scraping, promoting tools such as “masking” and Nightshade that perturb images so they cannot be usefully encoded into training datasets. Unfortunately, these defenses are hardly foolproof against the versatile scraping methods developed by technology corporations. Moreover, they are futile against already existing datasets assembled by nonprofits such as LAION, funded by gen AI corporations looking to shelter under the fair use doctrine. Through nonprofit-made datasets, corporations exploit copyright criteria that permit reproduction of copyrighted works for “nonprofit educational purposes.” Reactive litigation alone is insufficient to curb the worsening intellectual property theft committed through gen AI.
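For readers curious what these cloaking defenses look like mechanically, the sketch below shows their general shape in Python: add a small, bounded perturbation that leaves the picture visually unchanged to a human viewer. This is a toy illustration only; real tools such as Nightshade optimize the perturbation against a model’s feature space, whereas the plain random noise used here offers no actual protection.

```python
import numpy as np

def perturb(image, epsilon=2, seed=0):
    """Return a copy of `image` with per-pixel noise clipped to +/- epsilon.

    Toy stand-in for image-cloaking defenses: real tools compute an
    adversarial perturbation; this uses random noise for brevity.
    """
    rng = np.random.default_rng(seed)
    noise = rng.integers(-epsilon, epsilon + 1, size=image.shape)
    # Widen dtype before adding so values near 0 or 255 do not wrap around.
    out = image.astype(np.int16) + noise
    return np.clip(out, 0, 255).astype(np.uint8)

img = np.full((4, 4, 3), 128, dtype=np.uint8)   # flat gray test image
cloaked = perturb(img)
# Each channel value moves by at most 2 intensity levels.
```

The key property, shared with the real tools, is the bounded per-pixel budget (`epsilon`): the change must stay below the threshold of human perception while still disrupting what a model learns from the image.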

Partnerships between gen AI platforms and data distributors—including Getty Images and Nvidia, Universal Music and BandLab Technologies, Google and Reddit, OpenAI and Le Monde—leave smaller, unlicensed creators increasingly exposed to exploitative deals. Systemic imbalance is entrenched in information asymmetry: media corporations can negotiate competitive deals with “big tech” and leverage new software while eroding independent creator agency. In a stream-geared, digital market exacerbated by gen AI, artists are stranded in an economy that values scalability, efficient production, and mass distribution over labored effort and curated style. Backed by an inundation of private equity and venture funds, countless new gen AI start-ups threaten to uproot the integrity of cultural production under a presidency focused on establishing U.S. AI dominance at the cost of minimizing federal regulation.

In response to mounting lawsuits against Silicon Valley-based gen AI companies, California Governor Gavin Newsom signed Assembly Bill No. 2013 into law in September 2024. Set to take effect in 2026, the law requires gen AI developers to disclose details about the data used to train their models, exempting systems used for security and national defense. Under the law, developers must publicly state whether the data was licensed or purchased and disclose any processing or modification it underwent. These transparency requirements discourage indiscriminate data scraping by forcing gen AI companies to monitor vigilantly for copyright violations that could result in costly lawsuits.
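The statute mandates the information, not any particular format, but the required disclosures map naturally onto a structured record. A hypothetical sketch of what one developer’s dataset documentation might contain (the field names and example values are illustrative, not statutory):

```python
# Hypothetical training-data disclosure record in the spirit of AB 2013.
# Field names and values are illustrative; the law specifies what must be
# disclosed, not this schema.
disclosure = {
    "dataset_name": "example-image-corpus",      # assumed name
    "source": "licensed",                        # e.g. licensed / purchased
    "contains_copyrighted_material": True,
    "collection_period": ("2020-01", "2024-06"),
    "modifications": ["resized images", "removed exact duplicates"],
}

def summarize(record):
    """Render one disclosure record as a short human-readable line."""
    return (f"{record['dataset_name']}: {record['source']} data, "
            f"copyrighted={record['contains_copyrighted_material']}, "
            f"modified={len(record['modifications'])} step(s)")

print(summarize(disclosure))
# example-image-corpus: licensed data, copyrighted=True, modified=2 step(s)
```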

For gen AI firms and their distribution partners, such laws are an unwelcome threat to the secrecy of competitive methodologies. How a company trains its gen AI, and on what dataset, functions like a secret recipe. A bill that hinders data scraping could directly diminish model performance by shrinking the pool of available training data. For the artist community, however, it is a small but vital step toward reclaiming creators’ income and sustaining an environment that bolsters authentic independent creativity. Assembly Bill No. 2013 tentatively lays the groundwork for future legislation guided by legal analysis and stakeholder feedback, working toward a balanced ecosystem where creativity can be both rewarded and harnessed.

Concerned by the limited power of courts and unions to negotiate fair payment from AI companies, some artists are advocating for a residual payment system. Drawing momentum from the Hollywood unions whose writers and actors negotiated streaming residuals, actor Joseph Gordon-Levitt called for new laws to ensure that residuals from AI make their way to artists. Under this system, creators would receive compensation whenever their work is used to train gen AI. He outlined: “An AI system would have to track every piece of its training data, and then be able to determine which of those pieces influenced any given output generated and to what degree. On top of that, each of those pieces of data would have to be attributed to a verified human or set of humans, and there would need to be a channel of payment for those humans to receive their residuals.” This forward-looking solution not only addresses income disruption for digital artists but also aims to embrace the growth of AI while embedding proportional reward for creative labor.
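Gordon-Levitt’s outline implies concrete bookkeeping: each training item is tied to a verified contributor, each generated output records which items influenced it and by how much, and the residual pool is split proportionally. A minimal Python sketch of that final accounting step, under the large assumption that influence scores are already available (computing them for a real model is an open research problem, and all names here are hypothetical):

```python
# Toy sketch of the residual-payment split Gordon-Levitt describes.
# Influence scores are assumed inputs; estimating them is not shown.

def split_residual(payout, influences, attribution):
    """Divide one output's residual pool among verified contributors.

    payout      -- total residual owed for one generated output
    influences  -- {training_item_id: influence_score} for that output
    attribution -- {training_item_id: contributor_id}
    """
    total = sum(influences.values())
    residuals = {}
    for item_id, score in influences.items():
        contributor = attribution[item_id]
        share = payout * score / total   # proportional to influence
        residuals[contributor] = residuals.get(contributor, 0.0) + share
    return residuals

# Example: a $10 residual pool for one output influenced by three items,
# two of which belong to the same (hypothetical) artist.
influences = {"img_001": 0.5, "img_002": 0.3, "img_003": 0.2}
attribution = {"img_001": "artist_a", "img_002": "artist_b",
               "img_003": "artist_a"}
print(split_residual(10.0, influences, attribution))
# {'artist_a': 7.0, 'artist_b': 3.0}
```

The accounting itself is trivial; as the quote makes clear, the hard parts are the influence measurement and the verified-identity registry that would have to feed it.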

Viewership-based paychecks and subscription models are just two of the policy proposals seeking to ensure that creative labor remains valued in an AI-driven economy. These initiatives—which echo the broader narrative painted by the plaintiffs of Andersen v. Stability AI Ltd.—reveal artists’ desire for policy that complements economic reform as gen AI reshapes production and monetization streams for creative industries. Copyright law must adapt alongside these structural shifts to safeguard creative authenticity and fair compensation.

Traditional applications of DMCA § 1202 have proven insufficient to protect independent creators against gen AI, as seen in both Andersen v. Stability AI Ltd. and Doe v. GitHub, Inc. The number and scope of lawsuits indicate that regulating copyright in gen AI products requires industry-specific market analysis and contextual interpretation of the law. Policy initiatives such as Assembly Bill No. 2013 and Hollywood artists’ push for residual payments offer hope that federal legal frameworks could be implemented to protect artists’ income and creative authorship across states. Until then, the final verdict in Andersen v. Stability AI Ltd. will serve as the first of many pillars reshaping how authorship and authenticity in art are valued in the digital marketplace.

Yunah Kwon