AI content raises difficult ethical questions


In recent years, generative artificial intelligence has swept through the creative industries at a lightning pace, driving an explosion of new writing, music, images and video.

Indeed, markets have been buzzing with AI investments, with leading companies such as OpenAI exceeding valuations of $80 billion. Yet, society hasn’t agreed on AI’s place in our lives.

Just weeks ago, the popular blues-rock group Tedeschi Trucks Band found themselves in hot water when fans discovered that the artist who designed their Red Rocks Amphitheater concert posters had admittedly used AI to create the images.

People don’t like to feel bamboozled and struggle with the artistic value of AI-created content. But AI is inevitable—and once the markets and society find an equilibrium, the law will still have to define ownership. Can it?

Claiming ownership is a tricky question. Rightsholders such as artists, authors and comedians have filed lawsuits against the tech companies behind AI tools. The cases allege that tech firms have violated creators’ rights by using copyrighted material to train their AI models.

Developers argue that generators mostly create new output, so their products do not infringe copyright. The tension here exists in defining a novel idea or expression. But before the courts can decide who owns what, they must first decide whether certain AI products constitute novel expressions worth protecting.

The most recent example is New York Times Company v. Microsoft Corporation et al, No. 1:23-cv-11195 in the Southern District of New York. The New York Times alleged that millions of its copyrighted works were used to train the large language models behind Microsoft’s Copilot and OpenAI’s ChatGPT, and that these AI tools generate outputs that recite NYT content verbatim, closely summarize it, mimic its expressive style, and falsely attribute outputs to NYT.

Microsoft and OpenAI are testing the limits of the fair-use doctrine under the theory that AI tools are pattern-matching technologies that write responses by predicting the likeliest next word based on what they have been trained on.

Even if tech firms convince the courts their products merely absorb and rearrange copyrighted work, they have yet to answer for how they acquired their inspiration.

The inspiration is the source material. In many cases, tech firms scraped their material from the internet without permission. In 2022, David Holz—founder of popular AI image generator Midjourney—admitted that his tool hoovered up 100 million images without seeking permission from their owners. Large corporations are getting around this issue by entering licensing agreements with AI firms, but the damage may already be done to individual creators if the fair-use doctrine ultimately protects the technology.

As industry experts nervously track these cases, it’s uncertain how the courts will rule. So, in the meantime, we should advise our creative clients to carefully consider the terms and conditions of the program they use. And if they are creating AI programs, they should create ironclad use agreements.

The biggest irony in this web of ownership is that AI-generated content is generally not protected by copyright. The US Copyright Office and federal courts have consistently affirmed the bedrock principle of copyright law: human authorship.

Even in cases where people develop their own AI software to create art, the art is not protected by copyright laws because it is not human-made. Yet for centuries people have created tools to make art, such as a paintbrush or a violin.

Now humans have developed machines that can create the tools they use to make art. If Congress fails to legislate, it may take the courts decades to unravel ownership and to reflect societal views on what counts as protectable art.

At a certain point, it seems profoundly inhuman to distinguish the black box of creativity from complex code powering AI software.

Whether you believe AI is an opportunity or a threat to human creativity, the basic process is the same. On the human side of things, we have phrases like “good reading makes for good writing.” Similarly, if you ask any prolific musical artist about their creative process, they will tell you they are driven by consuming all the great music that came before them.

Whatever happens in our brains to make this possible, we have somehow managed to replicate it in the form of complex algorithms. If we can do that, then surely we can use the law to once again define ownership and embrace new art. •

__________

Conner Dickerson is a member of the Business Services & Litigation and Real Estate Services & Litigation practice groups at Cohen & Malad. Opinions expressed are those of the author.
