South African author Zakes Mda has confirmed that he is among the writers set to benefit from a landmark $1.5bn (R25.7bn) settlement reached between artificial intelligence company Anthropic and a group of authors whose books were allegedly used without permission to train AI systems.
“Don’t be deceived by the billions. They are not all coming to me. They are shared by many writers in the US,” said Mda.
The lawsuit, Bartz v Anthropic, was filed by authors Andrea Bartz, Charles Graeber and Kirk Wallace Johnson, with support from the Authors’ Guild.
The case accused Anthropic, the company behind the AI chatbot Claude, of downloading copyrighted books from online pirate sites such as LibGen and PiLiMi to train its large language models, a practice the authors said violated the rights of writers and publishers.
In June 2025, Judge William Alsup of the US District Court for the Northern District of California issued a split ruling, finding that training on lawfully obtained books qualified as "fair use" but that copying books from pirate sites did not.
The court certified a class action for those piracy-related claims, paving the way for authors to seek collective compensation.
After mediation, Anthropic agreed to a proposed $1.5bn settlement, one of the largest of its kind.
Authors and publishers whose works were found to have been illegally downloaded will each receive about $3,000 per book, though the final amount will depend on the total number of valid claims.
“You may remember an article in a South African newspaper a few months back reporting that my books and those of Nadine Gordimer were used to train ChatGPT without our permission,” said Mda. “Well, we did sue and there was a case they call class-action in the US called Bartz v Anthropic which we [fellow writers whose books were used without our permission] won.”
He said he was told by his literary agents in London about the outcome.
“I received an email from my agents which partly reads: ‘There was a major copyright lawsuit in the US brought by authors against AI company Anthropic for using books without permission to train large language models. A settlement agreement was preliminarily approved in September 2025. The settlement would see authors whose US publications have been used by the company to train AI models receive a share of $1.5bn,’” he said.
He revealed that six of his novels, Ways of Dying, The Heart of Redness, The Madonna of Excelsior, The Whale Caller, She Plays with the Darkness and Sometimes There is a Void, were used without his knowledge.
“Damn! I didn’t know of my contribution to the development of AI but at least, however little, I will be compensated for each of the books that were used,” he said.
Mda used the opportunity to highlight broader concerns about South African authors’ rights in the age of artificial intelligence.
“The only reason I am telling you this story is to ask the question: what about our South African writers? You can’t tell me these AI developers only used books published in the US,” he said. “Does South African law allow South African-based writers to fight for their intellectual property rights in South African courts? Or does the so-called ‘fair use’ doctrine leave them out in the cold from enjoying the fruits of their labour? Find out from those who know,” said Mda.
The settlement covers only works proven to have been illegally obtained by Anthropic.
Eligible authors and publishers have until January 7 to opt out of the settlement if they wish to pursue individual claims, while the deadline to submit compensation claims is March 23 2026.
A final approval hearing is scheduled for April 22 2026, with payments expected later that year.
The Authors’ Guild has encouraged writers to verify whether their works appear on the official “Works List”, a database of titles allegedly copied by Anthropic, and to decide before the deadlines whether to participate in the settlement.
TimesLIVE