Prevent legacy data sources infecting new systems, specialist says


Underestimating data migration leads to costly failures and, if it is not tackled thoroughly, threatens to “infect” new systems with legacy problems, PBT Group Australia GM Will Erskine says.

Insurers face cost and time blowouts and customer dissatisfaction when they underestimate the data migration aspect of moving to new systems, he says, yet this is frequently not given the priority it deserves.

“Address data quality before and during the data migration and transition process to prevent old problems from legacy source systems infecting the new system,” Mr Erskine said.

“An insurer cannot achieve its goals if a new system is incapable of digesting data and presenting it in an understandable format and a timely manner with minimal or no business outage.”
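For illustration, here is a minimal sketch in Python of the kind of pre-migration quality check Mr Erskine describes. The policy fields and rules are hypothetical assumptions, not PBT Group’s actual process.

from datetime import datetime

# Hypothetical legacy extract showing classic defects: a duplicate key,
# an impossible date and a missing premium.
legacy_policies = [
    {"policy_no": "P001", "inception": "2021-02-30", "premium": "1200.50"},
    {"policy_no": "P002", "inception": "2022-06-01", "premium": ""},
    {"policy_no": "P002", "inception": "2022-06-01", "premium": "980.00"},
]

def profile(records):
    """Return (policy_no, problem) pairs for records that would pollute the target."""
    issues, seen = [], set()
    for rec in records:
        if rec["policy_no"] in seen:
            issues.append((rec["policy_no"], "duplicate key"))
        seen.add(rec["policy_no"])
        try:
            datetime.strptime(rec["inception"], "%Y-%m-%d")
        except ValueError:
            issues.append((rec["policy_no"], "invalid inception date"))
        if not rec["premium"]:
            issues.append((rec["policy_no"], "missing premium"))
    return issues

for policy_no, problem in profile(legacy_policies):
    print(policy_no, problem)  # flags the bad date, the blank premium and the duplicate

Run before cutover, a profile like this surfaces legacy defects while they can still be cleansed at the source, rather than after they have been loaded into the new system.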

Data migration is common as insurers constantly upgrade from legacy systems, move data into the cloud or shift it between cloud-hosted systems, but Mr Erskine says the complex process is often the most underestimated part of a new system implementation.

“It’s seen as an afterthought, instead of being afforded top priority,” he said.

The quality of the data an insurer collects, stores, manipulates, manages, reports from and analyses daily is key to its success, he says, and access to quality, timely data is essential.

“The consequences of not giving data migration top priority in any system transformation and implementation project are dire.”

Data migration failures lead to costly delays, missing or wrongly converted data, risk of regulatory intervention, bad customer experiences (“if the underlying data is incorrect the new system is useless”), and jeopardised claims processing and policy renewals.
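A common defence against missing or wrongly converted records is a source-to-target reconciliation pass after each load. A minimal sketch, using hypothetical record structures rather than any insurer’s real schema:

import hashlib

def fingerprint(record):
    # Stable hash over the fields that must survive migration unchanged.
    key = "|".join(record[f] for f in ("policy_no", "premium"))
    return hashlib.sha256(key.encode()).hexdigest()

# Hypothetical extracts, keyed and ordered identically on both sides.
source = [{"policy_no": "P001", "premium": "1200.50"},
          {"policy_no": "P002", "premium": "980.00"}]
target = [{"policy_no": "P001", "premium": "1200.50"},
          {"policy_no": "P002", "premium": "980"}]  # silently reformatted in transit

assert len(source) == len(target), "row-count mismatch"
suspect = [s["policy_no"] for s, t in zip(source, target)
           if fingerprint(s) != fingerprint(t)]
print("records needing investigation:", suspect)  # ['P002']

Real migrations reconcile per table and per business rule, but even this simple count-and-checksum comparison catches records that arrive altered or not at all.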



PBT Group says it recently managed a complex data migration for a large specialist Australian insurer, including establishing complete data sets in a new system and archiving the old data – while maintaining ready access to it – ahead of the decommissioning of an existing mainframe.

It eliminated “error-prone” manual copying and pasting and improved productivity by automating the generation of the scripts used in the migration, which the insurer retained for ongoing use.
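The article does not describe PBT Group’s actual tooling, but a minimal sketch of the script-generation idea (deriving migration SQL from a mapping table instead of hand-copied statements) might look like this, with illustrative table and column names:

# Hypothetical source-to-target mapping driving the generated scripts.
mappings = [
    # (source table.column, target table.column)
    ("legacy.pol_num",   "core.policy_no"),
    ("legacy.incept_dt", "core.inception_date"),
    ("legacy.prem_amt",  "core.premium"),
]

def generate_insert(target_table, mapping):
    """Build one INSERT...SELECT statement from the mapping metadata."""
    src_table = mapping[0][0].split(".")[0]
    src_cols = ", ".join(m[0].split(".")[1] for m in mapping)
    tgt_cols = ", ".join(m[1].split(".")[1] for m in mapping)
    return (f"INSERT INTO {target_table} ({tgt_cols})\n"
            f"SELECT {src_cols} FROM {src_table};")

print(generate_insert("core.policy", mappings))
# INSERT INTO core.policy (policy_no, inception_date, premium)
# SELECT pol_num, incept_dt, prem_amt FROM legacy;

Because every statement comes from the same mapping metadata, a change to the mapping regenerates all the scripts consistently, which is the productivity gain over copying and pasting by hand.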

“This was a major project that was achieved on time and on budget because of the priority the insurer gave to it and its early appointment of a specialist familiar with designing end-to-end migration solutions,” Mr Erskine said.

“Data migration is more than just mapping source to target. Previous practical experience is a tremendous asset in knowing where to apply focus.”