
Anthropic Counsel Apologizes for Citation ‘Hallucination’ in Music Publishers Lawsuit — Pinning Most of the Blame on Claude




Anthropic counsel has apologized for a citation ‘hallucination’ in an expert testimony submitted as part of a copyright battle with music publishers. Photo Credit: Igor Omilaev

Time to lay off the use of AI in legal documents? Amid a high-stakes copyright battle with music publishers, Anthropic attorneys have apologized for an apparent citation “hallucination,” pinning the blame mainly on Claude.

We broke down the citation crisis after counsel for the music publisher plaintiffs formally voiced related concerns to the court. Anthropic data scientist and expert witness Olivia Chen, the publishers maintained, had seemingly cited a non-existent academic paper.

Unsurprisingly, the serious allegation prompted the presiding judge to order an explanation from Anthropic. That explanation arrived in the form of a declaration from Latham & Watkins associate Ivana Dukanovic.

The way Dukanovic tells the story, an internal investigation confirmed “that this was an honest citation mistake and not a fabrication of authority.”

Running with the point, the Anthropic attorney indicated that the relevant American Statistician citation “includes an erroneous author and title, while providing a correct link to, and correctly identifying the publication, volume, page numbers, and year of publication of, the article referenced.”

So what happened? Well, according to the same declaration, Claude took some liberties when citing not just the American Statistician article, but other sources used in Chen’s testimony.

“After the Latham & Watkins team identified the source as potential additional support for Ms. Chen’s testimony,” Dukanovic penned, “I asked Claude.ai to provide a properly formatted legal citation for that source using the link to the correct article.

“Unfortunately, although providing the correct publication title, publication year, and link to the provided source, the returned citation included an inaccurate title and incorrect authors,” she continued.

Claude is also said to have introduced “additional wording errors” into other citations. Though so-called AI hallucinations aren’t exactly rare, including in legal settings, the situation certainly draws attention to the law firm’s review approach.

“During the production and cite-checking process for Ms. Chen’s declaration,” Dukanovic weighed in here, “the Latham & Watkins team reviewing and editing the declaration checked that the substance of the cited document supported the proposition in the declaration, and also corrected the volume and page numbers in the citation, but did not notice the incorrect title and authors, despite clicking on the link provided in the footnote and reviewing the article.”

These remarks may raise more questions than they answer. Chief among them: If one has to make all sorts of corrections to AI-powered legal citations, wouldn’t it be preferable to tackle the process without consulting a chatbot?

And at the risk of rubbing salt in the imaginary-citation wound, it’s safe to say the reviewing team’s performance left something to be desired.

But as the (incorrectly) cited article actually exists, the “embarrassing and unintentional mistake” doesn’t mean “Chen’s opinion was influenced by false or fabricated information,” per the text.

“We have implemented procedures, including multiple levels of additional review, to work to ensure that this does not occur again,” added Dukanovic.

DMN asked Claude about the episode, and even it advised against using LLMs for legal citations.

“Regarding citation hallucinations more generally – this is a known limitation of large language models like myself,” Claude responded. “When asked to provide citations, if I don’t have perfect recall of specific sources, I might generate what seem like plausible citations based on my training patterns rather than accurate bibliographic information.

“For any situation requiring accurate citations, the best practice would be to use dedicated academic search tools and databases rather than relying on an AI system to recall specific publication details from memory,” Claude continued.
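For what it’s worth, that advice is straightforward to put into practice. Below is a minimal sketch, not anything from the filing, of how a cite-checker might verify a citation’s title and authors against the registered record via the public Crossref API; the DOI and the expected fields are hypothetical placeholders.

```python
# Minimal sketch: check a citation's title and authors against Crossref's
# registered metadata instead of trusting a language model's recall.
# The DOI and cited fields below are hypothetical placeholders.
import requests


def fetch_crossref_metadata(doi: str) -> dict:
    """Look up a work's registered metadata via the public Crossref API."""
    resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=10)
    resp.raise_for_status()
    return resp.json()["message"]


def check_citation(doi: str, cited_title: str, cited_authors: list[str]) -> list[str]:
    """Return a list of mismatches between the citation and the record of reference."""
    record = fetch_crossref_metadata(doi)
    problems = []

    # Crossref returns the title as a list of strings.
    real_title = (record.get("title") or [""])[0]
    if cited_title.strip().lower() != real_title.strip().lower():
        problems.append(f"title mismatch: cited {cited_title!r}, registered {real_title!r}")

    # Compare cited surnames against the registered author family names.
    real_surnames = {a.get("family", "").lower() for a in record.get("author", [])}
    for name in cited_authors:
        if name.split()[-1].lower() not in real_surnames:
            problems.append(f"author not on record: {name!r}")

    return problems


if __name__ == "__main__":
    # Hypothetical usage: flag the wrong-title/wrong-author slip described above.
    for issue in check_citation("10.1000/example", "Some Plausible Title", ["Jane Doe"]):
        print("FLAG:", issue)
```

Because the check compares the citation against the publisher-registered record rather than anything a chatbot remembers, the exact failure at issue here, a correct link paired with an inaccurate title and incorrect authors, would be flagged immediately.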
