Reposted from Advertising Law Updates

The Generative AI Journey Continues

When it comes to tech powered by generative AI, we are on the verge of a Cambrian Explosion, a period during which we can expect an unprecedented variety of innovative and spectacularly useful tools to emerge from the pre-AI primordial muck. And, at the dawn of this era, courts already are issuing preliminary rulings in cases where plaintiffs are challenging, and defendants are championing, these tools, cases that could determine how and to what extent these tools are allowed to flourish. Jeremy Goldman wrote about a recent case (see this post), and today I write about another.

Andersen v. Stability AI Ltd. is a class action lawsuit brought by three visual artists against Stability AI, the developer of Stable Diffusion, a latent text-to-image diffusion model capable (in its words) of “generating photo-realistic images given any text input.” The plaintiffs also sued DeviantArt and Midjourney, whose generative AI tools (respectively, DreamUp and the eponymous Midjourney) are alleged to incorporate Stable Diffusion technology. The plaintiffs contend that these tools infringe upon their copyrights, rights of publicity and other rights.

Last month, Judge William H. Orrick dismissed the majority of the plaintiffs' claims. But this is not the end … not by a long stretch. The court let stand arguably the most significant of the plaintiffs' claims (the claim for direct copyright infringement arising from the use of billions of images to train Stable Diffusion), and the court gave the plaintiffs leave to amend “to provide clarity regarding their theories” underlying their other claims.

Below I summarize the court's ruling on the motion to dismiss the plaintiffs' direct copyright infringement and right of publicity claims.

Direct Copyright Infringement (Input) Against Stability

The plaintiffs alleged that defendant Stability was liable for direct copyright infringement because it had used their works (along with billions of others) as training images for its Stable Diffusion product. Specifically, the plaintiffs alleged that Stability paid Large-Scale Artificial Intelligence Open Network (LAION) - a non-profit that (in its own words) aims “to make large-scale machine learning models, datasets and related code available to the general public” - to scrape from the internet over five billion images (among them the plaintiffs') for training.

Because two of the plaintiffs (McKernan and Ortiz) had not registered their works with the Copyright Office (a prerequisite for suing, 17 U.S.C. § 411), the court dismissed their claims with prejudice. (It's not clear why the court dismissed these claims with prejudice - perhaps because counsel stated during oral argument that they were not pursuing copyright claims on behalf of these plaintiffs?) The defendants argued that the claims by the third plaintiff (Andersen) - who had properly registered sixteen collections of her works - should be dismissed because she had failed to specify which of the works in those collections had actually been used by Stability in training. The court disagreed, finding that Andersen's reliance on the search results from the website “haveibeentrained.com” (which indicated that many of her works had, in fact, been used in training), combined with the complaint's allegation that LAION had scraped over five billion images for its training datasets, supported “the plausibility and reasonableness of her belief” that those of her registered works that were posted online had, in fact, been scraped into the training datasets.

Finally, the court held that the plaintiffs' allegations that Stability had “downloaded or otherwise acquired copies of billions of copyrighted images without permission” and caused those “images to be stored at and incorporated into Stable Diffusion as compressed copies” were sufficient to plead direct copyright infringement by Stability. Accordingly, the court refused to dismiss the plaintiffs' “primary theory” of direct copyright infringement.

Direct Copyright Infringement (Input) Against DeviantArt

Defendant DeviantArt hosts an online community where digital artists can share their works; it also offers its own AI image generator, DreamUp, which is powered by Stable Diffusion. Plaintiffs alleged that one of the LAION datasets used to train Stable Diffusion was created by scraping DeviantArt’s site. At the outset, the court held that merely “being a primary source” for training images did not support a claim for direct infringement. 

However, the plaintiffs also alleged that (1) “compressed copies” of training images are embedded within Stable Diffusion, and (2) DeviantArt was liable for direct infringement because it distributed Stable Diffusion (and, therefore, the “compressed copies” of training images embodied therein) as part of its own DreamUp product. The plaintiffs did not appear to be alleging that actual copies of all of the training images were incorporated within Stable Diffusion. Indeed, as the defendants noted, that would be impossible, since no working application could compress five billion images. Instead, the complaint described Stable Diffusion as providing “an alternative way of storing a copy of [training] images” that used “statistical and mathematical methods to store these images in an even more efficient and compressed manner.”

Ultimately, the court agreed with the defendants that plaintiffs' allegations were unclear and contradictory and invited the plaintiffs to amend the complaint to allege with greater clarity precisely how the training images were “embedded” within Stable Diffusion:

“If plaintiffs contend Stable Diffusion contains “compressed copies” of the Training Images, they need to define “compressed copies” and explain plausible facts in support. And if plaintiffs’ compressed copies theory is based on a contention that Stable Diffusion contains mathematical or statistical methods that can be carried out through algorithms or instructions in order to reconstruct the Training Images in whole or in part to create the new Output Images, they need to clarify that and provide plausible facts in support.”

Direct Copyright Infringement (Output) Against DeviantArt

Plaintiffs also alleged that DeviantArt's DreamUp program produces and distributes output images that are infringing derivative works of the training images. The defendants urged the court to dismiss the claim because the plaintiffs had failed to allege that the output images were substantially similar to the plaintiffs' copyrighted works: on the contrary, the plaintiffs had admitted in the complaint that “none of the Stable Diffusion output images provided in response to a particular Text Prompt is likely to be a close match for any specific image in the training data.” In response, the plaintiffs contended that all elements of plaintiff Andersen's copyrighted works (and the copyrighted works of all others in the purported class) “were copied wholesale as Training Images and therefore the Output Images are necessarily derivative.”

Once again, the court found there were numerous defects in the plaintiffs' complaint. As with the direct infringement (input) claims, the plaintiffs' “theory regarding compressed copies and DeviantArt’s copying needs to be clarified and adequately supported by plausible facts.” Moreover, the court found it “simply not plausible” that all of the images used to train Stable Diffusion were copyrighted (as opposed to copyrightable), or that all the output images were derivative of copyrighted training images. And, perhaps most important, the court was “not convinced that copyright claims based on a derivative theory can survive absent ‘substantial similarity’ type allegations.” (See this post.) Accordingly, the court dismissed the claim, with leave to amend.

Direct Copyright Infringement (Input and Output) Against Midjourney

The court also dismissed (with leave to amend) the plaintiffs' direct infringement claims against Midjourney, which were nearly identical to those brought against DeviantArt. However, the court called out that the plaintiffs had failed to allege facts regarding what training, if any, Midjourney had conducted for its Midjourney product, and that the plaintiffs needed to clarify their theory of liability: “is it based on Midjourney’s use of Stable Diffusion, on Midjourney’s own independent use of Training Images to train the Midjourney product, or both?”

Plaintiffs' Right of Publicity Claims

In their complaint, the plaintiffs asserted that the defendants misappropriated their names and their “artistic identities,” in violation of their statutory and common law right of publicity, because the defendants' AI tools allow users to request art “in the style of” their names. In their brief and at the hearing, the plaintiffs “clarified” that their claims were based on the defendants’ use of their names to advertise and promote their DreamStudio, DreamUp, and Midjourney products.

The court dismissed the claims, once again with leave to amend:

“The problem for plaintiffs is that nowhere in the Complaint have they provided any facts specific to the three named plaintiffs to plausibly allege that any defendant has used a named plaintiff’s name to advertise, sell, or solicit purchase of DreamStudio, DreamUp or the Midjourney product. Nor are there any allegations regarding how use of these plaintiffs’ names in the products’ text prompts would produce an “AI-generated image similar enough that people familiar with Plaintiffs’ artistic style could believe that Plaintiffs created the image,” and result in plausible harm to their goodwill associated with their names, in light of the arguably contradictory allegation that none of the Output Images are likely to be a “close match” for any of the Training Images. Plaintiffs need to clarify their right of publicity theories as well as allege plausible facts in support regarding each defendants’ use of each plaintiffs’ name in connection with advertising specifically and any other commercial interests of defendants.” (Cleaned up.)

Since it had dismissed the right of publicity claims with leave to amend, the court declined at this juncture to consider the defendants' First Amendment defense - i.e., that the output of these tools was “transformative” under Comedy III Productions, Inc. v. Gary Saderup, Inc., 25 Cal.4th 387 (2001). The court invited the defendants to raise this defense again once the plaintiffs have amended their complaint and clarified their theories of liability for the right of publicity claims.

Andersen v. Stability AI Ltd., No. 23-cv-00201-WHO (N.D. Cal. Oct. 30, 2023)

Tags

copyright, copyright infringement, artificial intelligence, ai, art, derivative works, stable diffusion, right of publicity