
No money for AI expert opinion
What is it all about?
An expert was commissioned to answer medical questions relating to his specialty of oral and maxillofacial surgery. Just over a month after being commissioned, he delivered a document entitled “Expert opinion”. He charged EUR 2,374.50 for this work.
The district auditor applied for a review – and the court took a closer look. It established that the expert opinion had been prepared with the help of artificial intelligence.
Can a forensic expert use artificial intelligence to prepare his expert opinion – and can he then collect his usual remuneration?
The decision of the Darmstadt Regional Court
In a remarkable decision (decision of 10 November 2025 – ref. 19 O 527/16), the Regional Court of Darmstadt set an expert's remuneration at EUR 0.00 because he had apparently prepared his expert opinion largely using AI, without disclosing this to the court.
The court set the remuneration to zero for several reasons:
The expert had not clarified whether he had even prepared the expert opinion himself. He responded evasively or not at all to questions from the court. An expert, however, is obliged to inform the court if other persons are involved in preparing the opinion. For this reason alone, the expert opinion was unusable.
The expert opinion was also useless in terms of content. No examination of the plaintiff had taken place and the expert opinion referred to an accident that had never happened.
The court was convinced – and this is where it gets interesting – that the expert opinion had been prepared in significant parts using an AI. This conviction was based on several pieces of evidence:
- The expert named himself with his full address as the addressee of the evidence order addressed to him – a typical AI error
- Threefold repetition of text modules – atypical for human-generated texts
- Almost exclusively main clauses with identical sentence beginnings – a common AI pattern
- Sentence fragments that are explained by AI-typical queries as to whether the prompt was understood correctly
- Passages that read like a generic AI summary of the files
- A distinctly different writing style between different sections of the document
The legal classification
The court made it clear that an expert must prepare his report personally. Anyone who largely uses an AI instead does not fulfill this obligation. Personally rendering the opinion, however, is a core requirement for an expert's remuneration.
Since there were considerable doubts about the extent of the AI's contribution, the "expert opinion" could not be used as a whole.
Even if one took a different view of the AI use and the lack of disclosure, the court held that the hours billed were disproportionate to the mere 1.5 pages of actual substantive explanations. A maximum of four hours would have been appropriate.
Significance for practice
The decision is likely to have a signal effect – far beyond this specific case:
For experts
Anyone using AI tools must disclose this to the court. Undisclosed use of AI can not only lead to a reduction in remuneration, but may also constitute a breach of the duty to render the expert opinion personally.
For courts
The decision shows how courts can recognize AI-generated content – through stylistic analysis, plausibility checks and critical assessment of text patterns.
For the AI debate
The case makes it clear that the use of AI is not automatically objectionable – but transparency is essential. An expert who uses AI as an aid, communicates this openly, critically examines the results, and enriches them with his own expertise would probably have been judged differently.
Conclusion
The decision by Darmstadt Regional Court shows that artificial intelligence in court proceedings is no free pass. Anyone commissioned as an expert witness owes personal expertise – not the forwarding of prompts to an AI. The court has made it clear that the duty to render the expert opinion personally is taken seriously.
The decision is part of a growing body of case law on responsibility for AI-generated content. Whether in terms of the liability of platform operators for AI bots or the remuneration of experts – the principle is the same: those who use AI remain responsible. And those who do not accept this responsibility must expect consequences.
For experts, this means that AI can be a useful tool – but it is no substitute for their own expertise, personal investigation and conscientious expert opinion. And above all: transparency towards the court is not optional, but mandatory.
We are happy to advise you on AI!






