Music publishers, including Universal Music Group, Concord, and ABKCO, recently filed an amended copyright infringement complaint against the $61 billion-valued AI startup Anthropic.
And, once again, Anthropic – which has the backing of Amazon and Google, among others, to the tune of billions – has filed a motion to dismiss much of the case.
But this time, the publishers say they come armed with a stronger case – and argue that Anthropic's defense hasn't similarly improved.
“For its part, Anthropic’s motion to dismiss simply rehashes some of the arguments from its earlier motion – while giving up on others altogether,” a spokesperson for the plaintiff publishing companies said in an email to MBW.
The publishers’ amended complaint arrived a few weeks after they were dealt a setback in their initial proceedings against Anthropic.
The publishers said at the time, however, that they remained “very confident” of winning the case and would “vigorously pursue” monetary damages.
Most recently, following the amended lawsuit (filed April 25) and Anthropic’s latest motion to dismiss (filed on May 9), the publishers issued the following statement to MBW:
“Our amended complaint bolsters the case against Anthropic for its unauthorized use of song lyrics in both the training and the output of its Claude AI models.
“For its part, Anthropic’s motion to dismiss simply rehashes some of the arguments from its earlier motion – while giving up on others altogether.
“Anthropic now concedes that real-world users have prompted Claude for lyrics and received output copying our lyrics — including one of Anthropic’s own founders, as we highlight in the amended complaint.
“Anthropic has never once challenged our direct infringement claims, and we expect its recycled challenges to our other claims will also fail.”
The publishers allege Anthropic committed mass copyright infringement by training Claude on copyrighted lyrics, and then by allowing Claude to regurgitate those lyrics when prompted.
The new complaint asserts the same four charges the publishers leveled against Anthropic the first time around:
- Direct copyright infringement (training AI on copyrighted materials and reproducing copyrighted materials)
- Contributory copyright infringement (copyright infringement by Claude users)
- Vicarious copyright infringement (making money off Claude users’ copyright infringement)
- Violation of the Digital Millennium Copyright Act (stripping copyright management information from the files Anthropic used to train its AI)
“Countless users unaffiliated with [music] publishers have prompted Claude for lyrics… and Claude has responded by generating output that likewise copies verbatim or near-verbatim the copyrighted lyrics to publishers’ works – including, for example, the lyrics to [Bob Dylan’s] Highway 61 Revisited, [Neil Diamond’s] Sweet Caroline, [Thin Lizzy’s] The Boys are Back in Town, and [The Police’s] Message in a Bottle – as shown in the limited records Anthropic has produced so far in this litigation,” stated the publishers’ amended complaint, which can be read in full here.
In its new motion to dismiss, Anthropic’s lawyers restated many of the arguments they made in the first (and successful) motion to dismiss, including that the music publishers didn’t show that Anthropic knew its users were infringing copyrighted lyrics; that they didn’t show Anthropic made any money from allowing lyrics to be ripped off; and that they didn’t show Anthropic knew the content it used had copyright management information stripped out.
The publishers’ amended complaint seeks to address many of these points, citing evidence that Claude users have been prompting the chatbot to regurgitate lyrics.
“During just a nine-day period in September 2023, the month before publishers filed this lawsuit, the term ‘lyric’ appeared in more than 170,000 Claude prompt and output records — nearly 20,000 every day,” the complaint states.
“In total, literally millions of Claude prompt and output records contain the term ‘lyric.’ Many of these are prompts by third-party Claude users seeking lyrics to publishers’ works and output by Anthropic’s AI models copying those lyrics.”
The complaint continues: “What’s more, Anthropic itself has repeatedly requested lyrics from its Claude AI models when developing and training those models.
“Anthropic’s internal chat records reveal that key Anthropic employees explicitly contemplated prompting Claude for the lyrics to publishers’ works and discussed various lyric-related prompts. In fact, Anthropic’s own co-founder and chief compute officer Tom Brown queried ‘@Claude what are the lyrics to desolation row by Dylan?’, one of publishers’ works…
“In short, contrary to Anthropic’s repeated representations that its users do not use Claude to find lyrics, the evidence shows the opposite.”
The complaint also seeks to address Anthropic’s claim that the publishers can’t prove the AI company stripped copyright management information from the copyrighted works it used.
The complaint alleges that high-ranking Anthropic employees, including co-founders Benjamin Mann and Jared Kaplan, concluded that a tool the company was using to strip extraneous data from text files “left too much ‘useless junk’ – such as copyright notice information contained in footers – in scraped web data.”
“Mann also expressed his desire that the AI ‘model will learn to ignore the boilerplate,’ like copyright notices.”
Anthropic has responded to the allegations with a new motion to dismiss much of the case. As it did with its previous motion last August, the AI company is asking the court to toss out three of the four charges and focus on just one – direct copyright infringement.
Anthropic’s plan – as it has implied in its court filings – is to focus on that one charge in order to argue that using copyrighted content to train AI should be considered “fair use” under US copyright law – an idea that’s vehemently rejected by much of the music business and other creative industries.
That strategy worked once before. In March of this year, Judge Eumi K. Lee of the US District Court for the Northern District of California granted Anthropic’s motion to dismiss all but one charge – but the judge left the door open for the music publishers to refile their complaint, which the publishers have now done.
In its new motion to dismiss, Anthropic argues that the music publishers still haven’t made their case for anything other than a claim of direct copyright infringement.
“After a year of discovery and two opportunities to plead their claims before this court, plaintiffs still cannot plausibly allege that Anthropic had the requisite knowledge of specific infringements for purposes of contributory liability, that Anthropic obtained a direct financial benefit for purposes of vicarious liability, or that Anthropic had the required mental state for purposes of their DMCA claim,” states the motion, which can be read in full here.
The hearing on Anthropic’s latest motion to dismiss is tentatively scheduled for July.
The motion is the latest in a court case that has taken various twists and turns, with some rulings coming down in favor of the music publishers while others strengthened Anthropic’s hand.
One significant victory for the music publishers came early this year, when the court approved a plan to add “guardrails” to Anthropic’s AI to prevent it from spitting out copyrighted lyrics. The requirement applies to Anthropic’s currently available AI tools as well as to future ones.
However, the publishers’ new complaint suggests that Anthropic’s guardrails may not be working as intended.
“Anthropic’s post-suit guardrails continued to be ineffective at preventing infringing output copying publishers’ lyrics. For example, in November 2024, over a year after publishers filed the [initial] lawsuit, publishers’ investigators found that the latest versions of Claude continued to generate unauthorized copies of publishers’ lyrics when accessing Claude via Anthropic’s partners,” the publishers’ complaint states.
In March of this year, in a setback for the publishers, the court rejected a petition for a preliminary injunction against Anthropic that would have prohibited the AI developer from using the publishers’ lyrics to train its AI.
Judge Lee concluded that the publishers had failed to demonstrate “irreparable harm” from Anthropic’s (alleged) use of the lyrics – a prerequisite for this type of injunction.
But the court did rule in the publishers’ favor on another issue: it granted the publishers the right to search through Anthropic’s records for any prompts that include a song title and the word “lyrics,” whenever the two terms appeared within 20 words of each other.
Anthropic had been arguing that it should only produce records of prompts when the song title and “lyrics” appeared within five words of each other, which would likely have surfaced far fewer cases of potential copyright infringement.