Cloud storage the silver lining for insurers


Adopting cloud storage frees insurers to focus on customer needs and expectations, and on core value creation for policyholders, Frazer Walker Partner Ian Chisholm says.

The cloud is becoming the platform of choice for core insurance applications because it is far more cost-effective than running in-house private data centres, as most insurers did in the 1980s and 1990s.

“The costs of owning and operating these large, high-cost assets to the standard now required by regulators, and to meet customer expectations, are prohibitive for most organisations,” Mr Chisholm said.

“It’s inevitable that insurance organisations will eventually fully embrace cloud services.”

The specialist skillsets required from “a small army” of IT staff to operate data centres around the clock are beyond the reach of most insurers, brokers and other industry participants, he says, and keeping those skillsets current and at an appropriate backup level would stretch staffing and training budgets.

Services offered by public cloud providers, such as artificial intelligence, machine learning, and Internet of Things data processing, are “well beyond the reach of even the largest corporations to establish inhouse,” Mr Chisholm says.

“Insurers need to focus on their core business and value creation. Being experts in operating data centres is not part of that,” he said, adding that external suppliers are better placed to build, own and operate the services provided via the cloud.

Sydney-based management and technology consultancy Frazer Walker has held discussions at executive and board level on the risks and benefits of cloud services, and on the trade-offs between cyber risks and operational and financial risks. While some insurance organisations may hesitate to embrace cloud services, Mr Chisholm says, the economics and feature-rich environments are “just too strong a proposition”.


In-house data storage remains an option for non-core, peripheral systems that don’t require rigorous uptime, redundancy or resilience, he says. Proof-of-concept systems, non-critical spreadsheets and some development or low-level testing environments, for example, could run on local servers. Local data centres are not immune to cyber attacks, though, because they are generally connected to the internet-enabled corporate network.

“It’s always a case of test and learn first, then redevelop and move to a more rigorous, robust environment as the system becomes more important for the organisation,” he said.