What’s the next frontier for assessing property risk?

Feature image: a satellite capturing imagery used for property risk assessments.

Next-generation data, analytics and risk models are giving property risk assessment a much-needed facelift.

The increased frequency and intensity of natural disasters, combined with population growth in NatCat-prone regions, are escalating property and casualty (P&C) insurers’ claims and losses. A lack of precise data on potential risks and loss exposures only worsens the situation.

Fortunately, innovations in data collection and use are helping to reshape Canada’s P&C insurance industry, thanks to the rise of digitalization, cloud migration, and growth in the availability and diversity of data.

Insurers’ strategic decisions around the use and application of these emerging analytics and risk models will profoundly impact their future competitiveness and profitability.

 

Greater granularity

Traditional territory definitions and ratemaking methods are problematic because insurers must balance two competing needs: creating statistically credible territories, and representing homogeneous regions with relatively uniform exposure to loss.

Although actuaries have used traditional territory ratemaking for decades, those methods present difficulties for underwriters, actuaries and IT teams. And many historical territory definitions aren’t sufficiently correlated with the propensity for loss.

But advancements in Geographic Information Systems (GIS) and Geospatial Artificial Intelligence (GeoAI), coupled with greater accessibility to location-specific data and risk scores, offer an opportunity to overcome these failings.

Applications leveraging GIS and GeoAI allow insurers to use location-specific data and hazard risk scores. This Geospatial Hazard Rating (GHR) method is more granular and accurate: a rating is generated for each individual address by collecting historical data and events for a specific peril within a defined geographic radius.
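To make the method concrete, here is a minimal sketch in Python of how a per-address rating for one peril could be derived from events inside a defined radius. Everything specific in it (the event records, the 10 km radius, the severity weighting and the letter-grade thresholds) is a hypothetical placeholder, not HazardHub's or any vendor's actual model.

from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class PerilEvent:
    lat: float
    lon: float
    loss_amount: float  # historical insured loss for this event, in dollars

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two coordinates, in kilometres.
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def geospatial_hazard_rating(lat, lon, events, radius_km=10.0):
    # Collect historical events for one peril within the defined radius,
    # combine frequency and severity, and bucket the result into a grade.
    nearby = [e for e in events if haversine_km(lat, lon, e.lat, e.lon) <= radius_km]
    score = len(nearby) + sum(e.loss_amount for e in nearby) / 1_000_000  # illustrative weighting only
    for threshold, grade in [(1, "A"), (3, "B"), (6, "C"), (10, "D"), (15, "E")]:
        if score < threshold:
            return grade
    return "F"

# Usage: rate one Calgary-area address against three hypothetical hail events.
events = [PerilEvent(51.05, -114.07, 2_500_000),
          PerilEvent(51.08, -114.10, 800_000),
          PerilEvent(51.02, -114.03, 1_200_000)]
print(geospatial_hazard_rating(51.045, -114.057, events))  # prints "D"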

Fine-cut hazard ratings let GHR significantly speed up insurers’ property assessments and policy quoting. And, because geospatial data is highly structured, objective, and collected at a large scale, insurers can better analyze it and uncover novel risk insights that would not be feasible with traditional territorial ratings data.

 

Improving fire risk scores

A prime example of GHR’s value is its potential impact on fire risk models. Fire-related events can trigger costly insurance claims, so adding data elements and innovating the approach to fire risk models should be a priority for insurers.

Traditional fire risk scoring methods often overlook critical data that are highly correlated with actual fire risk and loss. For example, the most common applications typically rate fire protection on just four data elements: local fire department capabilities, local emergency communications systems, fire prevention codes and water access.

Newly available data sets allow insurers to calculate fire risk using metrics that better correlate with actual risk and loss, and to assess risk at the individual property level. Among the new data elements available to insurers are:

More accurate and granular property distance-to-water numbers.
Drive times from the closest fire station to the property.
Local fire station staffing data (as opposed to a general community rating).
An area’s station density (e.g., how many fire stations are within a six-minute drive of the property).

Applying address-level data helps insurers improve underwriting decisions, set accurate pricing based on actual risks, and gain a competitive marketplace advantage.
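As a rough illustration of how these elements might feed an address-level score, the Python sketch below maps each data element to a 0-to-1 risk factor and blends them with weights. The caps, weights and 0-to-100 scale are invented for illustration; a real model would calibrate them against loss history.

def fire_risk_score(distance_to_water_m: float,
                    drive_time_min: float,
                    station_staffing: int,
                    stations_within_6_min: int) -> float:
    # Illustrative address-level fire score on a 0-100 scale (higher = riskier).
    water = min(distance_to_water_m / 300.0, 1.0)          # far from a water source -> riskier
    response = min(drive_time_min / 10.0, 1.0)             # long drive from nearest station -> riskier
    staffing = 1.0 - min(station_staffing / 20.0, 1.0)     # fewer on-duty staff -> riskier
    density = 1.0 - min(stations_within_6_min / 3.0, 1.0)  # fewer stations nearby -> riskier
    return 100 * (0.30 * water + 0.35 * response + 0.15 * staffing + 0.20 * density)

# Two addresses in the same postal code can score very differently:
print(round(fire_risk_score(80, 4.0, 12, 2), 1))   # well-protected property, ~34.7
print(round(fire_risk_score(450, 12.0, 4, 0), 1))  # remote, poorly served property, ~97.0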

 

Next-generation data

Better data points enhance insurers’ understanding of properties and related risks. And leading-edge companies are using this finely cut data to make incremental improvements across the insurance lifecycle — including underwriting, distribution and claims.

For example, by integrating and cross-analyzing internal and external data, underwriters can make more informed decisions about coverage, policy renewal, and setting appropriate premium amounts. Plus, they can better identify favourable risks and screen out those with a high probability of loss.

Next-generation data helps insurers accurately price policies based on risks specific to each property. It eliminates the one-size-fits-all approach that often disregards risk differences within the same postal code.

Insurers, especially small and medium-sized players, gain a competitive advantage by leveraging this more granular data for marketing purposes. Smarter, targeted marketing strategies can yield stronger, less risky and more profitable leads.

A recent study by global consulting firm McKinsey finds that using next-generation data, analytics and risk models has helped leading insurers reduce their loss ratios by three to five points. Those same firms are seeing between 10% and 15% growth in new business premiums, and a 5% to 10% rise in customer retention.

These results suggest early adopters of property risk data and next-gen analytics will have a significant advantage as the underwriting landscape evolves.

 

John Siegman is the co-founder of HazardHub, a property risk data company acquired by Guidewire in mid-2021. He is now a senior executive at Guidewire. This article is excerpted from one appearing in the April-May 2024 print edition of Canadian Underwriter. Feature image courtesy of iStock.com/iLexx