David V Cabral

The big data fumble: how post-catastrophic events highlight how poor data sets are costing the (re)insurance industry


I started my career as a catastrophe claims handler in October 1980. Throughout my career, the industry’s message has always been the same: “we learned from previous events, updated our models and now better understand our risks”. Each subsequent catastrophic event has proven that this is not the case.

In this age of advanced technologies, it’s hard to believe many insurers still rely on manual or semi-automated data quality checks, which are expensive, labor intensive and prone to an unacceptably high margin of error. I spent time with a number of global (re)insurers after Harvey, Irma and Maria (HIM) and discovered, to no surprise, that many of them identified significant differences between their original data, geocoded to street address, and the data supplied under claim notifications. The post-claim review clearly highlighted that a significant percentage of properties had been incorrectly located. During the review, it was also apparent that (re)insurance companies’ records lacked risk-related features for several thousand buildings, resulting in unreliable damage estimation (a recurring theme throughout my career).
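
To make the point concrete, here is a minimal sketch of the kind of automated check such a post-claim review implies: pair each policy record with its claim notification and flag any property whose notified coordinates drift beyond a set tolerance. The record layout (policy_id, lat, lon) and the 250 m tolerance are illustrative assumptions, not any particular insurer’s schema.

    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between two (lat, lon) points, in kilometres."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2 * 6371.0 * asin(sqrt(h))

    def flag_geocode_mismatches(policies, claims, tolerance_km=0.25):
        """Pair policy records with claim notifications by policy_id and flag
        properties whose notified location drifts beyond the tolerance."""
        claims_by_id = {c["policy_id"]: c for c in claims}
        flagged = []
        for p in policies:
            c = claims_by_id.get(p["policy_id"])
            if c is None:
                continue  # No claim notified against this policy.
            drift = haversine_km(p["lat"], p["lon"], c["lat"], c["lon"])
            if drift > tolerance_km:
                flagged.append((p["policy_id"], round(drift, 3)))
        return flagged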

These significant, negative shifts in geographical location, at both an individual-building and a portfolio level, raise a series of key questions:

  • Would the risk have been underwritten at the in-force premium and terms and conditions?

  • How effective is your pricing model if risks are incorrectly geocoded at the time of modeling?

  • Do you really understand the impact of exposures/claims on your balance sheet?

  • Do you have appropriate reinsurance protection?

  • Is there a potential for a ratings downgrade? A financial downgrade is not derived from balance-sheet losses alone; it can also reflect a company’s inability to correctly model underwriting exposures.

  • If insurers’ data and models continue to deteriorate, is there a possibility of reinsurers denying claims due to continual data failures? A stretch, I know, but I’ve seen stranger claim denials in my career…

For those (re)insurers that do rely on geocoding engines, this shift in property location came as a complete surprise; however, a single geocoding engine does not always return the correct location, as anyone who has used map services on a mobile phone can attest!
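
One practical mitigation is to treat no single engine as authoritative. The sketch below is a hypothetical illustration of a consensus check: engines stands in for any set of geocoding services, and an address is accepted only when two independent engines agree to within a small radius; everything else is routed for manual review.

    from math import radians, sin, cos, asin, sqrt

    def _km_apart(a, b):
        """Great-circle distance in km between two (lat, lon) pairs."""
        la1, lo1, la2, lo2 = map(radians, (*a, *b))
        h = sin((la2 - la1) / 2) ** 2 + cos(la1) * cos(la2) * sin((lo2 - lo1) / 2) ** 2
        return 2 * 6371.0 * asin(sqrt(h))

    def consensus_geocode(address, engines, agreement_km=0.1):
        """Query several independent geocoding engines and accept a location
        only when at least two agree to within agreement_km.
        engines is a list of callables mapping address -> (lat, lon) or None."""
        results = [loc for eng in engines if (loc := eng(address)) is not None]
        for i, a in enumerate(results):
            for b in results[i + 1:]:
                if _km_apart(a, b) <= agreement_km:
                    # Return the midpoint of the two agreeing engines.
                    return ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)
        return None  # No consensus: route the address for manual review.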

The Impact on Claims

The negative consequences of using a single geocoding engine, or of manually adjusting geocodes post-event, also extend to claims: for example, the appointment of adjusters can be delayed by incorrect addresses. This can result in longer claim settlement times, further claim deterioration (properties cannot be rebuilt in a timely manner, etc.) and increased costs. Furthermore, reputational risk for insurers increases: no claimant wants to renew insurance with a company that substantially delays claim payments.

Stop fumbling – your data can be improved!

So how can (re)insurers close the data gap, correct geocoded street addresses and produce new insights on risk, pricing and customer engagement? By partnering with Ticinum Aerospace. The award-winning company is led by a stellar team of technicians who provide reliable, customer-oriented solutions using machine learning techniques and heterogeneous remote sensing datasets, with the final aim of cutting the costs of uncertainty. Multispectral, very high resolution satellite images provide a clear overview of buildings from a nadir observation point; these are complemented by street-level pictures of buildings, from which several risk-related building features are extracted.

Street-level pictures are also used to describe the urban environment, e.g. to identify trees, street lamps, hydrants and sidewalks.

As noted above, the images are used to retrieve the physical features of buildings but, from a risk analysis perspective, they also provide broader context from which additional, non-property-related elements can be deduced (e.g. average neighbourhood revenue, vegetation indexes). This enhances or replaces outdated models that rely on zip (post) codes to identify neighbourhood characteristics and flooding exposures.
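
To make “vegetation indexes” concrete: a standard example is the Normalized Difference Vegetation Index (NDVI), computed per pixel from the red and near-infrared bands of a multispectral image. The sketch below is a generic formulation of that index, not a description of Ticinum Aerospace’s specific pipeline.

    import numpy as np

    def ndvi(nir, red, eps=1e-9):
        """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
        nir and red are reflectance arrays from the near-infrared and red
        bands of a multispectral image; values near +1 indicate dense
        vegetation, values near 0 indicate bare soil or built-up surfaces."""
        nir = nir.astype(np.float64)
        red = red.astype(np.float64)
        return (nir - red) / (nir + red + eps)  # eps guards against 0/0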

Ticinum Aerospace’s proprietary tools also incorporate GIS data to analyse and compute the distance of each property from two permanent water types: inland (e.g. rivers, lakes, reservoirs) and ocean, helping underwriters refine the risk of flooding. This increased understanding of exposures enables underwriters to apply more appropriate terms and conditions and to improve pricing models. Through satellite imagery, buildings are identified before and after an event. No manual adjustment is required to reconcile geocoded street addresses: the company can process and match records post-event faster than any manual process can, thereby responding to claims faster and lowering claims costs.
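
The distance computation itself is straightforward once the water geometries are available. The sketch below uses the open-source shapely library purely as a stand-in (the proprietary tooling is not public) and assumes property points and water bodies are supplied in a metre-based projected coordinate system.

    from shapely.geometry import Point

    def flood_distance_features(property_point, inland_waters, ocean):
        """Distance (metres) from a property to the nearest inland water body
        (rivers, lakes, reservoirs) and to the ocean shoreline. Assumes every
        geometry is already in a projected CRS with metre units (e.g. UTM)."""
        return {
            "dist_inland_m": min(property_point.distance(w) for w in inland_waters),
            "dist_ocean_m": property_point.distance(ocean),
        }

    # Hypothetical usage, with easting/northing in a metre-based projection:
    # flood_distance_features(Point(easting, northing), rivers_and_lakes, coastline)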

Following a catastrophic event, new GIS data can identify the changed parameters of the impacted area, i.e. the scale of damage (including maximum damage as a percentage of a (re)insurer’s portfolio), differences in foundation height, ground elevation, damage percentages below ground, etc.
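
The headline figure in that list, damage as a percentage of a portfolio, reduces to a simple aggregation once per-building damage ratios have been estimated. A minimal sketch, with field names assumed purely for illustration:

    def portfolio_damage_pct(exposures):
        """Estimated event damage as a percentage of total portfolio TIV.
        exposures is a list of dicts with 'tiv' (total insured value) and
        'damage_ratio' (estimated damage as a fraction of TIV, 0..1);
        this record layout is a hypothetical example."""
        total_tiv = sum(e["tiv"] for e in exposures)
        damaged = sum(e["tiv"] * e["damage_ratio"] for e in exposures)
        return 100.0 * damaged / total_tiv if total_tiv else 0.0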

Data and new business opportunities

In a number of geographical regions, underwriting data and models are notoriously poor. For those (re)insurers looking to expand their underwriting appetite in these regions, Ticinum Aerospace can help meet the challenge by dramatically improving existing data quality to produce new insights on risk, pricing and customer engagement.

So why wait for the next data fumble? Contact the team at Ticinum Aerospace and change the way you underwrite, price risk and protect your balance sheet.

The author is currently an advisory board member for Ticinum Aerospace.
