While insurance has been viewed as ‘behind the curve’ on AI adoption, a ‘significant’ amount of effort has been made to upskill, says technical product lead
The growing use of data science will transform the way the insurance industry analyses the risks it faces, but the complexity of those risks will demand greater data transparency.
Rachael McNaughton, technical product lead with a focus on analytics and machine learning at broker Willis Towers Watson, said the industry had made real strides in its use of artificial intelligence (AI) and machine learning over past years – but that there was more to come.
“Insurance had always been viewed as being a little behind the curve,” she explained.
“However, we have seen a significant amount of effort to upskill, with the use of data scientists.”
McNaughton explained that the industry could not simply plug in and utilise black box AI and machine learning solutions, as the specific tasks they are required to carry out require a more bespoke approach.
“It was not really a case of being behind the curve; insurance faces a number of different challenges, due to the work it carries out, that need more effort when it comes to machine learning integration,” she explained.
“The industry has invested in people with a skillset, which means they can modify machine learning to carry out the tasks that are required and are often specific to the insurance process.”
She added that the key to successful adoption of AI and machine learning was ensuring that systems delivered useful data in a way that auditors and underwriters could understand and utilise.
An ongoing process
While there has been major progress, McNaughton said system development is an ongoing process.
At present, it is the major brokers and underwriters who are leading the way with the development of the systems, she said.
“It is the larger firms that have the capabilities to develop the systems and – at present – the smaller firms do not have the scale,” she added.
“In the future the efficiencies that technology brings may provide brokers with the ability to devote more resources to AI and machine learning.
“However, they and their clients are already benefitting from insurers’ use of analytics to deliver accurately rated cover.”
McNaughton added that the industry has yet to make the progress some had hoped in areas such as telematics.
“For insurers the issues are a lot more nuanced, and as such areas such as telematics, imaging and neural networks have not developed as expected – but companies are becoming a lot more effective and are seeing the value [of these technologies].”
Additionally, McNaughton said that while AI and machine learning were starting to remove the need for insurers to perform some mundane tasks and allow staff time to focus on more productive areas of the business, insurance was the sort of sector where the human touch was unlikely to be wholly removed.
“For insurers, it is a case of adaptation rather than automation,” she explained.
“Risk is incredibly dynamic. Machine learning can analyse data and pick up things which may not be picked up by humans. It can only tell you so much.
“However, it also needs to be constrained – given the regulatory need for transparency, it must deliver clear information that humans can then use to add insight and value, and to think about complexity and factors that we are currently unable to integrate.
“The aim is to create clarity, which allows fully informed decisions to be made.”