Taking the trading test – pass or fail?
Liquidity, connectivity, real data and correct price are the four simple tests that every trading environment of the future must pass, says BGC Partners’ Jonathan Prinn.
Every month there seems to be a new trading system launched into the (re)insurance space. While these used to focus on the middle market arena, increasingly they are aimed at large and complex risks, and they all seem to make bold claims regarding efficiency, ease of use and the clients they have already “signed up”.
I suspect that the industry is large enough to allow for numerous different trading systems at this time, but it will be interesting to see how they develop. No question, the number of systems out there is good from a competition perspective – we need diversity of thought. That said, as our market hardens, gone are the days of brokers or carriers signing up to seat licences year after year unless they see liquidity – deal flow to you and me. Ultimately, despite great claims of benefits, they will all be judged on whether they are used – but is that the only metric?
I recall conversations with carriers ten years ago in which they were concerned they would be swamped by a “proliferation of icons on their desktops”. It didn’t happen then, but today it’s becoming a risk. I don’t believe it’s realistic to expect underwriters to log into numerous different systems and trade differently depending on which class of insurance or which broker they are dealing with – especially when the majority of those systems carry very low volumes.
Hence another metric to measure is connectivity. Connectivity is king. Every broker and every underwriter must have their own workflow that captures data, together with the ability to accept such data from others and send it back to them. Woe betide those that are not working on that approach: it is that ability that will enable the seamless flow of data and the to-ing and fro-ing of negotiation. Where ‘blockchain’ seemed to be the most used word in insurance in 2019, it has now been replaced by ‘API’, and I suspect API will provide more value over the short term.
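To make that concrete, here is a minimal sketch of what “accept and send such data” could look like between a broker’s workflow and a carrier’s system – the endpoint, payload shape and field names are hypothetical, not any real carrier’s API:

```python
import json
from urllib.request import Request, urlopen

# Hypothetical endpoint and payload: a broker's workflow pushing structured
# placement data to an underwriter's API (illustrative only).
ENDPOINT = "https://api.example-carrier.com/v1/placements"

placement = {
    "reference": "PLC-0001",
    "insured": "Example Manufacturing Ltd",
    "class_of_business": "property",
    "limit": {"currency": "USD", "amount": 25_000_000},
}

request = Request(
    ENDPOINT,
    data=json.dumps(placement).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# The carrier's system would respond with its own structured data in return,
# allowing the negotiation to flow back and forth without rekeying.
with urlopen(request) as response:
    print(json.loads(response.read()))
```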
Data pipeline
The problem is, of course, not the pipe that connects brokers to underwriters, but what can actually flow down that pipe – and that is data.
In my experience, all companies talk of API capability, but either don’t have the data (brokers) or the ability to use the data (carriers). Worse, many of the trading solutions out there don’t collect the data in the way our industry needs it – and what little data they do collect is focused on the deal, not the exposure. We need both – and we need it broken down into each individual variable. “Premium: $100k paid annually” is not one data element! It’s four – type / currency / number / trigger.
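As a purely illustrative sketch – the field names and structure below are my own, not any platform’s standard – that same line might be captured as four discrete, machine-readable elements rather than a single string:

```python
from dataclasses import dataclass, asdict
import json


@dataclass
class PremiumElement:
    """Illustrative only: one premium term broken into its constituent variables."""
    type: str       # what the figure represents, e.g. "premium"
    currency: str   # the currency, e.g. an ISO code such as "USD"
    amount: float   # the number itself
    trigger: str    # the payment trigger / frequency, e.g. "annual"


# "Premium: $100k paid annually" captured as four data elements, not one string
premium = PremiumElement(type="premium", currency="USD", amount=100_000, trigger="annual")

# Structured this way, it can be validated, aggregated and passed over an API
print(json.dumps(asdict(premium), indent=2))
```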
So, real data must be another evaluation point for any trading environment. But is that it? No – lastly there is cost. We all know that our industry has a cost problem. If some of these platforms do genuinely offer efficiency, then we as an industry can’t pay that away in seat licences and the like. And therein lies the biggest problem: connecting to an environment and building and implementing the APIs takes time, and really using the data takes even longer.
Seat licence cost is an old model. We need to look at liquidity flow charging, but some startups and insurtechs I see are charging 1 percent-plus for a single element of the workflow. I just don’t see how that can work. We need to look at other industries’ trading platforms and realise that premium flow charges for the full soup-to-nuts process need to be in the region of 50 basis points or less.

Liquidity, connectivity, real data and correct price. These are the four simple tests that every trading environment of the future must pass. I wonder how many would pass them today?