The ONIX 3 debacle
Why the transition has now become a necessity. By Thad McIlroy
“ONIX” is known to parents, and even grandparents, as “a very aggressive Pokémon that will constantly attack humans.” To publishers, more importantly, it is known as the international standard for distributing book metadata. Without a set of metadata, not a single copy of any book can be sold – at least not wherever digital book catalogs are involved, that is, through wholesalers, big chains, or online retailers.
First released in 2000, ONIX is currently at version 3.0, which was developed in 2009. ONIX 3.0 was a huge leap forward in what metadata can do for sales. (It is also indispensable for meeting the EU's legal requirements on the accessibility of digital content.) Implementation was, and is, no harder than that of its predecessor, ONIX 2.1, which was sunset in 2014.
Yet an unknown, but probably significant, number of publishers on both sides of the Atlantic still distribute metadata in a format created in the earliest days of the millennium. As late as mid-January 2025, the commercial arm of the Börsenverein des Deutschen Buchhandels (the German publishers' and booksellers' association) published the latest edition of its mapping guide for the transition from ONIX 2.1 to ONIX 3.0. And in November 2025, Amazon set a strict deadline of March 2026 for a full conversion: it will no longer accept book metadata in a format superseded 17 years ago.
Publishing consultant and industry blogger Thad McIlroy shakes his head in disbelief.
Beware: This post talks about metadata for books, and it talks about ONIX, the data format used to convey metadata for the retailing of books. But it is also about technology in the age of AI.
It would be impossible to recap the whole story of metadata for books here, except in the clumsiest of fashions, and so if this is a topic that still mystifies, or that you disdain, move on. (Or, alternatively, have a read of John Warren’s “Zen and the Art of Metadata Maintenance,” probably the best intro out there.)
Will the book publishing industry finally fully embrace a sixteen-year-old version of a standard?
Here’s the debacle:
ONIX 2.1 was ‘sunsetted’ (i.e. no longer actively supported) by EDItEUR at the end of 2014, more than a decade ago — it has not been revised or improved since. This is not a slight on EDItEUR — why update an old format when something new and improved is readily available? ONIX 3 is so much more robust than 2.1 ever was. A lot of information about a book can be communicated in ONIX 3 that could never be expressed in 2.1. Using the data contained in an ONIX 3 file, publishers can sell more books than they can using 2.1. And if you already know how to use 2.1, ONIX 3 is not technically difficult to implement.
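To make the difference concrete: where ONIX 2.1 used a largely flat product record, ONIX 3 groups data into blocks such as `<DescriptiveDetail>`, `<CollateralDetail>`, `<PublishingDetail>`, and `<ProductSupply>`. The fragment below is a simplified sketch of an ONIX 3 product record, with an invented ISBN and title; a real feed would carry many more composites (contributors, supply and price details, and so on):

```xml
<Product>
  <RecordReference>com.example.9780000000000</RecordReference>
  <NotificationType>03</NotificationType> <!-- confirmed record -->
  <ProductIdentifier>
    <ProductIDType>15</ProductIDType>     <!-- ISBN-13 -->
    <IDValue>9780000000000</IDValue>
  </ProductIdentifier>
  <DescriptiveDetail>
    <ProductComposition>00</ProductComposition>
    <ProductForm>BB</ProductForm>         <!-- hardback -->
    <TitleDetail>
      <TitleType>01</TitleType>
      <TitleElement>
        <TitleElementLevel>01</TitleElementLevel>
        <TitleText>An Example Title</TitleText>
      </TitleElement>
    </TitleDetail>
  </DescriptiveDetail>
  <!-- CollateralDetail, PublishingDetail, ProductSupply follow -->
</Product>
```

The block structure is what makes ONIX 3 both richer and easier to process selectively than 2.1: a recipient can update, say, supply data without re-parsing descriptive data.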
BISG used to publish a sort of shame list showing who was supporting each version of ONIX. The last copy of the list I’ve got is a decade old. At the time, only 40% of respondents were using ONIX 3. I know adoption is much higher now. But still…
The ONIX Timeline
2000: ONIX v1 released
2001: ONIX 2.0 released
2003: ONIX 2.x revised (2.1)
2009: ONIX 3.0 released
2014: ONIX 2.1 sunsetted by EDItEUR
2026: Amazon ONIX 3 deadline
On October 28, 2025, the Book Industry Study Group (BISG) published a post, “Time to Act: The ONIX 3 Transition is Actually Here.” It’s written by Claire Holloway, the hard-working chairperson of BISG’s Metadata Committee. Claire is, as always, reasonable in her outlook and fully supportive of the publishing community. But the story she’s telling is, to me, chilling, in the way it reminds one of ONIX’s slow journey across two dozen years.
Apparently, Amazon is finally going to ‘force’ the publishing industry to transition to ONIX 3 — it has set a deadline of March 2026 for a full conversion.
But Claire also points out that the transition to ONIX 3 “isn’t just a technical upgrade—it’s becoming a legal necessity. The European Accessibility Act (EAA) requires metadata about an ebook’s accessibility features to be provided and communicated throughout the supply chain for any ebook being sold in the EU.” (The prior versions of ONIX have no ability to express this information because it wasn’t a factor way back when.)
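In ONIX 3, accessibility information of this kind is conveyed through `<ProductFormFeature>` composites inside `<DescriptiveDetail>`, using EDItEUR's codelists. The sketch below illustrates the pattern; the specific code values are given from memory and should be verified against the current issue of List 196 before use:

```xml
<DescriptiveDetail>
  <!-- ... -->
  <ProductFormFeature>
    <!-- List 79 code 09: e-publication accessibility detail -->
    <ProductFormFeatureType>09</ProductFormFeatureType>
    <!-- List 196 code: conformance with the EPUB Accessibility
         specification (illustrative value, check the codelist) -->
    <ProductFormFeatureValue>04</ProductFormFeatureValue>
  </ProductFormFeature>
  <ProductFormFeature>
    <ProductFormFeatureType>09</ProductFormFeatureType>
    <!-- Another List 196 code, e.g. single logical reading order -->
    <ProductFormFeatureValue>13</ProductFormFeatureValue>
  </ProductFormFeature>
</DescriptiveDetail>
```

A title typically carries several such composites, one per accessibility feature or conformance claim. ONIX 2.1 simply has no equivalent slot for this data, which is why the EAA effectively mandates the newer format.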
So, in 2026, will the U.S. book publishing industry finally fully embrace a sixteen-year-old version of a standard that can make their businesses more profitable? Particularly when EDItEUR continues to update and improve the standard (we’re actually at version 3.1.2 now)?
We’ll see.
Courtesy of Thad McIlroy.

Thad McIlroy (LinkedIn profile page) is one of the United States’ most visible digital publishing analysts and authors, and principal of The Future of Publishing, based in San Francisco. He is a contributing editor to Publishers Weekly, covering artificial intelligence, digital innovation, and publishing startups. His latest book, The AI Revolution in Book Publishing: A Concise Guide to Navigating Artificial Intelligence for Writers and Publishers, was published in 2025. McIlroy has authored a dozen other books and over five hundred articles on publishing technology. In 2024, he joined the advisory board of Johns Hopkins University Press and became a visiting scholar at the Publishing Master of Professional Studies program at The George Washington University.
This article is part of the Digital Publishing Technologien channel, which covers content strategies and processes. The channel is sponsored by Fabasoft Xpublisher.

