In its latest report, InsTech London warns that too much data can be counterproductive and pose problems for brokers
More data is not always better, warned InsTech London in its latest report on location intelligence.
The report, Location Intelligence 2021 – the Companies to Watch: Where, What and How Risky?, stated: “The first rule of insurance should be to know what is being insured.
“With better information about the characteristics of an insured item, or the behaviour of the policyholder, insurance premium can be more highly correlated to risk.
“Higher prices for riskier assets encourage behaviours by the policyholder to reduce the risk. Installing a burglar alarm, for example, reduces the likelihood of theft from a property.”
Data is key. Its availability creates opportunities for insurers and their clients, but it also brings pitfalls.
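Purely as an illustration of the pricing logic the report describes, the sketch below multiplies a base premium by a loading for each known risk characteristic, so better data about the insured item feeds directly into the price. The base premium, factor names and figures are hypothetical and not taken from the report.

```python
# Toy illustration (not from the report): a premium that scales with
# observable risk factors. All figures and factor names are hypothetical.

BASE_PREMIUM = 300.0  # assumed base annual premium in GBP

# Hypothetical multiplicative loadings/discounts per risk characteristic
RISK_FACTORS = {
    "on_flood_plain": 1.40,      # loading for flood exposure
    "poor_maintenance": 1.20,    # loading for a badly maintained property
    "burglar_alarm": 0.90,       # discount for reduced theft likelihood
}

def quote_premium(features: dict) -> float:
    """Multiply the base premium by the loading for each feature present."""
    premium = BASE_PREMIUM
    for name, present in features.items():
        if present:
            premium *= RISK_FACTORS.get(name, 1.0)
    return round(premium, 2)

# With richer data, two superficially similar properties get different prices.
print(quote_premium({"on_flood_plain": True, "burglar_alarm": False}))   # 420.0
print(quote_premium({"on_flood_plain": False, "burglar_alarm": True}))   # 270.0
```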
More data, more risk
The report continued: “The rising tide doesn’t float all boats. Access to data, and the insights it reveals, does not confer the same advantages to all parties.
“In an idealised world of full transparency and fair pricing, we would all seek to maximise data and share it widely. That’s never going to happen.”
In part, this is because brokers are not always incentivised to collect and pass on detailed data about clients.
On top of this, performing analysis on behalf of clients has a cost attached to it.
For example, over the last two decades reinsurance brokers have had to spend heavily each year running analytical teams and licensing catastrophe models to support their insurance clients, at a time when pricing and commissions have been under pressure.
And lastly, more data may highlight the greater loss potential of specific risks and subsequently result in a higher premium.
Too much data
The report listed the following reasons to explain why more data is not always beneficial:
- More detailed information shows that many properties carry higher risks than the average, or greater risks than simplified assumptions suggested. Proximity to a flood plain or poor property maintenance, for example, leaves the underwriter with a choice: charge a higher premium or decline the insurance. In a competitive market, either the insurer loses a client or the policyholder pays more for their insurance.
- UK insurance brokers are now expected to offer advice based on the information they hold about a client. The more information the broker is given, the more options and considerations they need to put to that client. If they give bad advice, or overlook information that later results in a claim not being paid, they may be liable.
- More information also takes longer to analyse and requires new tools, plus people to run them. Standards and the means of sharing data tend not to keep pace with the volume and variety of data being exchanged. When PDF attachments or irregularly formatted spreadsheets become the default way of exchanging data, the cost of extracting the new data rises, or the data is simply ignored (the sketch after this list illustrates the kind of clean-up involved).
- In the US, carriers must file rates for approval by the state. Gaining approval takes months or even years, and the process is expensive. Rating is also susceptible to political influence. This creates high barriers to entry for new pricing models.
- In the UK, Europe, Bermuda and other regimes, Solvency II or local equivalents require model validation. Validation is expensive and lengthy, creating a barrier in time and cost to the use of new data and tools. Regulators are now also asking for up-to-date views of changing climate risk, which are harder to assess than the historical data traditionally accepted for validation.
- Policyholders, in both personal and commercial lines, are put off by requests for more information. If getting a quote takes too long, or they do not know the answers to the questions asked, people will look for a less inquisitive insurer.
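As a hypothetical example of the extraction cost mentioned above, the sketch below shows the kind of clean-up code needed when risk data arrives as an irregularly formatted spreadsheet rather than in a standard format. The file name, sheet layout and column names are invented for illustration.

```python
import pandas as pd

# Hypothetical broker spreadsheet: headers vary, the first rows are notes,
# and some rows are subtotals rather than insured locations.
raw = pd.read_excel("schedule_of_values.xlsx", sheet_name=0, skiprows=2)

# Normalise whatever header variants turn up in practice.
raw.columns = [str(c).strip().lower().replace(" ", "_") for c in raw.columns]
raw = raw.rename(columns={"tiv_gbp": "total_insured_value", "post_code": "postcode"})

# Keep only rows that describe actual locations with a value attached.
clean = raw.dropna(subset=["postcode", "total_insured_value"])
print(clean.head())
```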