
Building Data Ecosystem for Delegated Authority: 5 Considerations

Written on Jul 31, 2023 by Dave Connors

“…we started talking about the whole data ecosystem for DDM. We decided to step back and ask ourselves questions like: what should the data ecosystem for delegated look like?”

— Bob James, Market Transformation Director, Future at Lloyd's

 

So, well – what should a data ecosystem for Lloyd’s DA look like?

I have – across a few forums – been pretty vocal in my belief that the Lloyd’s approach to DA reporting is not just wrong but a serious mistake and a risk to the market’s standing. But while it’s easy to criticise, it’s harder to propose solutions. So, what do I think the Lloyd’s DA ecosystem should look like?

Well, a glib answer would be that it should look a helluva lot like what we’re building at distriBind… and well, yeah, it really should… But looking deeper, here are five concepts in particular that the market needs to get to grips with.

1. Any standards must be internal. When I say “DA is more important to Lloyd’s than Lloyd’s is to DA”, this is not (just) to be provocative, but to recognise that while DA makes up around 40% of Lloyd’s market premium, the global non-Lloyd’s DA premium dwarfs the market’s slice of the pie. Forget the prestige name: Lloyd’s simply cannot impose a standard (particularly one as irrational and unwieldy as v5.2) on participants outside the market. But this is about more than spreadsheet formats: risk code sets, transaction codes, reference data lists – none of this needs to be externalised; it can all be translated and transformed internally. The market needs to make it very simple to send data to it. Standards-free data exchange is possible – don’t let legacy providers tell you otherwise.
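
To make “translated and transformed internally” concrete, here’s a rough sketch (in Python) of what such a translation layer might look like on ingestion. The code values and mappings are invented for illustration – they’re not real Lloyd’s risk or transaction codes:

```python
# Illustrative only: external senders use whatever labels they like;
# the mapping lives inside the market, so nothing is imposed on them.
INTERNAL_TRANSACTION_CODES = {
    "NEW BUSINESS": "NB",
    "NEWBIZ": "NB",
    "RENEWAL": "RN",
    "RNWL": "RN",
    "ENDORSEMENT": "EN",
    "MTA": "EN",  # mid-term adjustment folded into endorsement internally
}

def to_internal_code(external_value: str) -> str:
    """Translate a sender's transaction code to the internal standard."""
    key = external_value.strip().upper()
    if key not in INTERNAL_TRANSACTION_CODES:
        raise ValueError(f"unmapped external transaction code: {external_value!r}")
    return INTERNAL_TRANSACTION_CODES[key]
```

The point is simply that the mapping table is an internal concern: adding a new sender means extending the table, not asking the sender to change.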

2. We’re not training for the World Excel Championships. These are, apparently, a thing. By contrast, the average bordereau is more Pub League than World Cup. The market’s inability to develop any method of data transfer beyond the spreadsheet is not so much a failure as an embarrassment. One of the market’s great successes in recent years has been its ability to provide capacity to InsurTechs and make them Coverholders… taking these modern digital platforms and asking them to submit data via spreadsheet is absurd. Building on the idea of making it very simple to send data to Lloyd’s, that means recognising that one size doesn’t fit all. Nobody should have to slow down, or speed up… let data flow in whatever format is easiest for the sender in that transaction: API, XML, PDF and yes, even spreadsheet.
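
To illustrate, a format-agnostic front door might look something like this sketch: the sender picks the format, the receiver normalises. The field names and XML shape are assumptions for illustration, and PDF extraction is out of scope here:

```python
import csv
import io
import json
import xml.etree.ElementTree as ET

def ingest(payload: bytes, content_type: str) -> list[dict]:
    """Normalise a submission in any supported format into risk records."""
    if content_type == "application/json":        # API submission
        data = json.loads(payload)
        return data if isinstance(data, list) else [data]
    if content_type == "application/xml":         # XML message
        root = ET.fromstring(payload)
        return [{field.tag: field.text for field in risk} for risk in root]
    if content_type == "text/csv":                # spreadsheet export
        reader = csv.DictReader(io.StringIO(payload.decode("utf-8")))
        return [dict(row) for row in reader]
    raise ValueError(f"unsupported content type: {content_type}")
```

Everything downstream sees the same list of records, whatever the sender used.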

3. So some transactions happened in the same month... Nothing is more baffling to me than the idea that transactions in the same month are inextricably linked… 1,000 risks on a bordereau; 990 are absolutely fine, 10 have errors/invalid data… so the 990 have to wait until the 10 get fixed. And this happens by sending incremental versions of the bordereau back and forth – all 1,000 risks ping-ponging between the parties. One consultant I spoke to said their Coverholder client’s premium bordereaux would go through an average of 14 versions before being signed at XIS. Any transaction, regardless of ingestion format, should be decoupled from other transactions ingested at the same time, so that good data can pass through automatically. Data separation is critical to speeding up cashflow and claims service. It will always be possible to see what happened in January, but if you’re thinking about “what happened in version 7 of January”, you need to go have a lie down.
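
Row-level decoupling is simple to express in code. In this sketch (validation rules invented for illustration), each record is judged on its own, so the 990 clean risks flow straight through while only the 10 broken ones go back for correction:

```python
def validate(row: dict) -> list[str]:
    """Return the errors for one risk record (empty list means clean)."""
    errors = []
    if not row.get("policy_ref"):
        errors.append("missing policy_ref")
    try:
        float(row.get("premium", ""))
    except ValueError:
        errors.append("premium is not a number")
    return errors

def partition_submission(rows: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split a submission: good rows pass through, bad rows are held back."""
    accepted, rejected = [], []
    for row in rows:
        errors = validate(row)
        if errors:
            rejected.append({"row": row, "errors": errors})
        else:
            accepted.append(row)
    return accepted, rejected
```

No versioning of the whole file, no ping-pong: the rejected rows carry their own error lists back to the sender, and everything else keeps moving.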

4. You don’t need a data repository, you need a data exchange. Submission is not the end of the journey, particularly in a marketplace like Lloyd’s. Data needs to be shared with various partners, internal and external, for reasons such as tax, reinsurance, and claims. Part of this should recognise that different data is needed at different times, so it should be possible to submit data from different sources at different times, link it to a parent object (such as a risk or a claim) and pass the relevant data (and just the relevant data) to the other parties, or make it easy for them to access what they need. “Submission” in itself needs to be re-thought. In a connected world, data can be retrieved automatically on trigger events… Worried about legacy? Retain a method for manual submission while allowing automatic retrieval for the digitally transformed – this is how you’ll drive adoption and automation, through organic evolution, as the slow movers see the efficiencies their rivals are getting. A repository designed mainly to process an arcane spreadsheet format is never going to achieve this, and shouldn’t even be part of the conversation.
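
The exchange idea can be sketched in a few lines: submissions from different sources arrive at different times, attach to a parent object (a risk or claim reference), and each party sees only the fields relevant to it. The party names and field lists here are invented for illustration:

```python
from collections import defaultdict

# Which fields each party is entitled to see (illustrative, not real).
VIEWS = {
    "tax_authority": {"policy_ref", "premium", "jurisdiction"},
    "reinsurer": {"policy_ref", "premium", "limit"},
}

class Exchange:
    def __init__(self):
        self._records: dict[str, dict] = defaultdict(dict)

    def submit(self, parent_ref: str, data: dict) -> None:
        """Attach a submission to its parent object, whenever it arrives."""
        self._records[parent_ref].update(data)

    def view_for(self, party: str, parent_ref: str) -> dict:
        """Share the relevant data (and just the relevant data) with a party."""
        allowed = VIEWS[party]
        return {k: v for k, v in self._records[parent_ref].items() if k in allowed}
```

The same `view_for` call works whether the party pulls on a trigger event or a human requests it manually – which is exactly the organic migration path described above.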

5. Consider what can be achieved beyond reporting. Beyond the obvious tax and regulatory reporting, what does the market even want to do with the information it has? What benefits can it pass to Coverholders and other stakeholders? We’ve seen progress on a faster claims payment solution; it is so dependent on the data being right that some scepticism about its impact has to remain, but it’s a step in the right direction. Automating coverage verification has huge potential benefits, especially the eradication of loss funds. What about on the capacity side? How about proactively monitoring capacity utilisation, helping successful Coverholders find additional capacity as they near premium income limits, and engaging with capacity-matching tools to match facility seekers with risk appetite?

Though I’ve been very critical of their approach, I’ve no axe to grind with Lloyd’s – my criticisms come from a place of wanting the market to thrive. Being selected for the Lloyd’s Lab last year was a hugely proud moment, and something I remind people of every chance I get (if they don’t already get the message from my Teams virtual background…). But if Lloyd’s is not to be left behind, it needs to abandon its obsession with common data standards (a crutch for bad engineers) and avoid the sunk-cost fallacy of pouring more into trying to get v5.2 and DDM to work – they don’t, and never will.

So the market needs a modern data exchange: one that can accept any data format for any transaction, link data created at different times and from different sources, support an internal standard without externalising it to business partners for whom it is not relevant, and make it easy to receive and share data. Something that can eliminate bordereaux immediately for digital players but still supports bordereaux and other formats for those whose modernisation is not yet complete.

Sounds a helluva lot like what we’re building at distriBind. Bet you didn’t see that coming.


Dave Connors

CEO & Founder
