What is next for Blueprint Two after the delays?
Re-engineering London’s ancient infrastructure is a Herculean task
The market cannot afford to sit back and wait – participants must work together on a strategy for pre-bind digitalisation
Another day, another Blueprint Two delay. September’s news pushed the bureau replatforming element of the strategy out to “at least 2028”. By then, we will have artificial intelligence (AI) chatbots writing better novels than we can, but will we have a fully digital London market? No, we will not.
Let us not forget this is a delay to the redevelopment of the “heritage” bureau systems we are talking about here. Phase Two’s new digital services, which will bring those core systems and how we interact with them into the digital age, are completely dependent on Phase One and they remain resolutely TBC – 2030 perhaps? Who knows?
Now, while we are all throwing our hands up, it is worth reminding ourselves this is not a straightforward project. The job of re-engineering London’s ancient and venerable bureau infrastructure is nothing short of Herculean (but let us all hope it is not Sisyphean).
Re-engineering 40 years of cumulative and poorly documented mainframe system developments is not a simple job. And if the team misses a beat, we are not just talking about late invoices and a bit of egg on the face; we risk bringing the money flow in the world’s largest insurance hub to a grinding halt. Nothing trivial then.
So, if it takes more time, it takes more time. Accuracy trumps speed every day of the week.
But when it does come, as I am confident it eventually will, with its APIs exchanging CDR core data to replace the PDFs, and once we are all connected with APIs to the new Premium and Claims Orchestration Services, will this make us a digital market?
What about the pre-bind process?
Only in part. Most of those systems concern themselves with point-of-bind and post-bind processes. But what of the pre-bind processes? By and large, that is where the data must come from in the first place: from the client, through the distribution chain and the brokers, through the trading platforms and into the carriers and bureau. We have nothing defined for this part – there is no strategy at all.
But, as nature abhors a vacuum, those market participants that have an interest in becoming fully digital are inventing for themselves how they are going to do it. And unless there is a serendipitous coincidence of independent ideas, miraculously arriving at the same solutions, our end state will consist of multiple incompatible techniques.
Take digital contracts, for example. We already have the Market Reform Contract (MRC) v3, which mixes traditional contractual text with data elements. But to call MRC v3 a digital contract would be stretching the definition beyond breaking point – it is a Word document that cannot be queried without AI and data extraction tools. And that does not sound very digital to me.
Better, we have the newly emerging digital contract builders, which construct contracts entirely from clause libraries and data elements input by the user or via APIs from underlying systems. Some of these are standalone applications, others are built into the trading platforms. This sounds much more like it, except they are all completely different and not interoperable as there is no agreed market standard for their inputs and outputs.
Worse, we have no standard market data model for a digital contract. Data models are crucially important in defining how the data is structured to correctly describe the contract – think sections, sub-sections, clauses, paragraphs and so on. We now have several technology companies and “self-builder” brokers, each with their own different interpretations of a model or even no data model at all. It is the Wild West.
Then what of the schedules of values that accompany the contracts? And the dreaded delegated authority bordereaux? We are still nowhere close to having a standard for these. And so it goes on.
A long journey ahead
This does not feel much like the slick, data-first marketplace we were promised. It is more like digital spaghetti.
It is not enough for each firm to digitalise in isolation; we must make sure systems can talk to each other cleanly, contracts and claims flow seamlessly and everyone is working from the same playbook. Otherwise, all we have done is replace today’s paper friction with tomorrow’s digital friction. Interoperability is not a buzzword; it is the foundation of a functioning digital market. One that must be agreed, designed and implemented as a market together.
Yes, the Blueprint Two timeline has slipped. But let us not lose sight of the bigger picture. The point of Blueprint Two was never to hit a 2024 deadline, a 2026 one or even 2028.
The point is to build a modern, digital post-bind process for the London market, allowing us to be faster, smarter and more competitive on the world stage. So, while Velonetic is getting on with that critical strategy, the market cannot afford to sit back and wait. Nor can it afford to let every participant go running off in different directions trying to solve pre-bind digitalisation for themselves.
We must use the time to agree on a strategy for that. And execute it. Together.
Jeff Ward is head of growth at Ebix Europe