
Major Music Publishers Accuse Anthropic AI of ‘Hallucinating’ Citations In Ongoing Copyright Dispute




Photo Credit: Anthropic

The high-profile copyright lawsuit between major music publishers and Anthropic, maker of the chatbot Claude, just took a dramatic turn. Anthropic’s legal counsel is accused of submitting a court filing containing an AI-generated hallucination: a citation to an academic paper that does not exist.

Today, a federal judge in San Jose ordered Anthropic to address allegations that one of its expert witnesses referenced a non-existent academic paper in the company’s court filing. The citation, purportedly from the journal The American Statistician, was included in the filing to bolster Anthropic’s argument that the reproduction of copyrighted song lyrics is a “rare event.”

Attorneys representing Universal Music Group, Concord, and ABKCO discovered that the article cited in the court filing does not exist. After checking with both the purported author and the journal, the plaintiffs confirmed the citation was a complete fabrication. Attorney Matt Oppenheim, who represents the music publishers, suggested that expert witness Olivia Chen relied on Anthropic’s own AI tool Claude to generate both the argument and its supporting authority.

Oppenheim stopped short of accusing Chen of deliberate misconduct, but he emphasized the seriousness of citing AI-generated falsehoods in a court document. Meanwhile, Anthropic’s legal team has characterized the incident as an accidental citation error, noting that the citation appeared to reference the correct article but linked to a different one entirely.

Music publishers allege that Anthropic unlawfully used the lyrics of hundreds of songs by artists ranging from Beyoncé to The Rolling Stones to train Claude—and that Claude often returns those lyrics verbatim in response to certain user prompts. This isn’t the first time AI-generated hallucinations have ended up in court, either.


One of the first incidents was the Mata v. Avianca case in New York in 2023. Two attorneys representing a plaintiff in a personal injury suit against Avianca Airlines used ChatGPT to conduct their legal research. The AI produced several non-existent cases, which the attorneys cited in their filings. After the judge discovered the fabrications, he issued a $5,000 sanction against both attorneys.

In at least seven cases across the United States, courts have questioned or disciplined lawyers for submitting AI-generated hallucinations in their legal filings.
