As insurers strive to launch new strategic initiatives, especially AI-driven programs that rely heavily on data for success, the limitations of their legacy core systems become apparent. Among the greatest shortcomings of their existing core system environments is that key data is often locked away in silos, only truly useful in its originally intended function. For real transformation to take place and innovation to flourish, these walls need to be broken down to liberate data so that it can be exploited to the fullest.
That’s easier said than done, of course. Integrating new solutions with an insurer’s existing IT systems is often a major challenge, slowing the pace of modernization. There are several reasons for that; for example:
- Core systems are often years, even decades old, and were never intended to support integration with other technologies. This can slow the pace and raise the cost of integration projects.
- Core systems tend to be highly specialized, leading to the proliferation of multiple, disparate systems that are not integrated with each other. For example, many insurers support one system for customer information, a second system for policy information, and a third for claims management.
- IT budgets and personnel are stretched thin by necessary maintenance and evolution of the core systems, leaving little time and money to support innovation.
To address these challenges, insurers need to find some way to bridge the gap. Core systems are the heart and soul of the insurance business. Any proposed initiative or new technology solution that does not allow for communication between new and existing technologies is doomed to failure.
So what can insurers do to effectively achieve integration and help accelerate innovation? The good news is that there are several options. While some of them are only short-term fixes, they can buy the time needed to invest in a longer-term solution.
These different approaches are evaluated on four axes to help better illustrate the overall impact on the organization:
- IT investment: The IT cost imposed on the insurer.
- Team investment: The workload imposed on the team impacted by the new solution.
- Transformation impact: The degree to which this investment helps the insurer progress in the transformation of its IT systems.
- Input or output: Whether the approach is useful for input (transferring data into the insurer’s IT system) or output (transferring data out of the insurer’s IT system).
Let’s take a deeper dive into each integration technique to better understand the pros and the cons.
Human integration
In some cases, the simplest way to integrate with an existing system is via its existing users. This approach by its very nature imposes an ongoing cost on the team, but if the cost is small and the value added is large, it can be worthwhile.
For example, if you have a new solution that adds an important score to a claim, and there is no API available to add that score directly into the existing claims management system, you might be able to piggyback on the claims handling team. When they make their call to the customer to discuss the claim, their script can include accessing the score in the new solution and including it in the claims management system.
Naturally, this is not a long-term solution. However, it can be a great way to pilot a new solution and prove its impact. It is important that the new solution is adapted to make the cost for the team as low as possible. For example, allow a claims handler to look up the score applied to a claim by searching for the customer phone number, which they are already copying and pasting into the telephony system. It is also important to make the impacted team part of the overall innovation project, so they understand not only that what they're doing is important, but also why. This strategy also helps ensure the team feels bought into the initiative.
Human integration can be used for both input to, and output from, an insurer’s existing IT systems. However, as the relevant data—e.g. decisions about a claim or inclusion of a URL to open full results in a new tool—tend to be small, this approach is most useful for input. Further, the UI of the new tool can be easily adapted to make input convenient, for example ordering and naming the fields in the same way as in the insurer system.
This approach has no impact on the overall transformation of the insurer’s systems. Another approach has to be undertaken in parallel—probably once the positive impact is established—in order to provide a more long-term solution.
Robotic process automation (RPA)
RPA is the automated version of human integration. There are multiple providers offering software which can be taught how to carry out tasks within the user interface. Once deployed by the IT team, the relevant team can teach the software how to enter the data from the new solution into the existing IT system (or vice-versa).
The main requirement for using an RPA system is that there is an existing user interface in the insurer’s IT system that is suitable for the task. If there is no existing user interface, or the existing user interface would need to be changed, then RPA alone is not sufficient.
As it is automated, RPA can transfer much larger volumes of data, and hence can be deployed for both input and output. However, it does not advance the overall transformation process. It depends upon the existing systems, creating another integration which needs to be addressed during the transformation process.
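Conceptually, an RPA "robot" is a recorded sequence of UI actions replayed against the existing screens. The sketch below is purely illustrative and does not use any real RPA vendor's API; the claims-screen fields, the fraud score, and the record layout are all hypothetical, and a plain dictionary stands in for the legacy user interface.

```python
from dataclasses import dataclass

@dataclass
class Step:
    field: str       # name of the field on the existing screen
    source_key: str  # where the value comes from in the new solution's output

def run_robot(steps, new_solution_record, ui_form):
    """Replay the recorded steps, copying data from the new solution's
    output into the (simulated) existing user interface."""
    for step in steps:
        # In a real RPA tool this would be a click plus keystrokes;
        # here we simply write into the simulated form.
        ui_form[step.field] = new_solution_record[step.source_key]
    return ui_form

# Hypothetical task: pushing a fraud score into a claims screen.
steps = [Step(field="claim_ref", source_key="claim_id"),
         Step(field="fraud_score", source_key="score")]
record = {"claim_id": "CLM-1042", "score": "0.87"}
filled = run_robot(steps, record, ui_form={})
print(filled)
```

The key point the sketch captures is the section's caveat: the robot can only fill fields that already exist on a screen, so if no suitable user interface exists, RPA alone is not sufficient.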
Batch Transfer/Extract-Transform-Load (ETL)
Batch transfer involves regularly exporting the required data from the insurer’s IT system and sending it to the new solution. This can be done on a regular basis to create a data flow between the insurer’s IT system and a new solution. This implies that the data may not be completely up to date at any given time, but this may not necessarily pose a problem for a given solution.
Most insurer IT systems are capable of dumping data out to a file, and the IT team can then script the automatic transfer of the files to the new solution for processing. Some insurers have dedicated ETL systems that enable them to set up this process with a simple configuration.
In general, this approach works best if the new solution does not require a particular format, but instead takes on the work of transforming the data from whatever format the insurer provides. However, if there is a sophisticated ETL solution in place, and the insurer has the IT resources to do some configuration, they may be able to conform to a standard data model, reducing the integration cost for the new solution.
This approach is most applicable to output, as writing information back into the system is much more complicated (though it may be possible with some systems).
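As a concrete illustration of the transform step, the sketch below turns a nightly CSV dump from a core system into JSON lines for a new solution to ingest. Both schemas, the column names, and the field mapping are hypothetical assumptions, not any particular vendor's format.

```python
import csv
import io
import json

# Map the insurer's (hypothetical) export columns onto the new
# solution's (equally hypothetical) field names.
FIELD_MAP = {"POLICY_NO": "policy_id",
             "CLAIM_AMT": "claim_amount",
             "LOSS_DT": "loss_date"}

def transform(export_file, out_file):
    """Read a core-system CSV dump and write JSON lines in the
    format the new solution expects. Returns the record count."""
    count = 0
    for row in csv.DictReader(export_file):
        record = {FIELD_MAP[k]: v for k, v in row.items() if k in FIELD_MAP}
        record["claim_amount"] = float(record["claim_amount"])  # normalize type
        out_file.write(json.dumps(record) + "\n")
        count += 1
    return count

# In practice these would be files moved by a scheduled script;
# in-memory buffers keep the sketch self-contained.
dump = io.StringIO("POLICY_NO,CLAIM_AMT,LOSS_DT\nP-001,1250.50,2023-04-01\n")
out = io.StringIO()
n = transform(dump, out)
print(n, out.getvalue())
```

A scheduled job (e.g. a nightly cron task) running this kind of script is typically all the "data flow" a batch integration needs, which is why the approach is comparatively cheap for the IT team.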
Single-use APIs
Some insurers have adopted a strategy of extending their existing systems with point API solutions to enable specific projects. The IT team generally has access to a web services framework, and uses it to provide an API which addresses the specific needs of the new solution with which they wish to integrate.
This is an effective strategy to enable innovation on top of a large existing system. It ensures that the IT effort is optimized towards supporting specific innovation projects, aligning the business objectives and the IT investment.
The limitation of this approach is that it can’t address the overall shortcomings of the underlying IT systems. If the underlying systems take a long time to process data, or can’t execute complex queries, or can’t store large files, then the APIs will be bound to reflect this.
This approach can have some positive impact on the overall transformation process. It acts as a prototype for the insurer’s future system, enabling the insurer to understand what kind of APIs will be useful and how they should work. This is critical knowledge for a future system overhaul. It can also ensure continuity when the eventual transition takes place, as solutions can continue to use these APIs even as the systems behind them are replaced.
Overlay platform
This is the scaled-up version of the single-use API approach. Rather than investing in one or two APIs, the insurer implements a platform that sits on top of their existing IT systems, replicating the data and providing comprehensive APIs for new solutions to use.
This kind of platform can be a powerful tool in driving transformation. Over time, either the overlay platform can become the system of record, or the underlying systems can be replaced with newer equivalents. Meanwhile, new solutions can be implemented using modern API approaches and they are agnostic to the changes in the underlying system.
This is a powerful solution, but requires a significant investment and a sophisticated approach to syncing the data between the overlay platform and the underlying IT systems. The overall feasibility of implementing this kind of platform depends upon finding and building a technical solution to this problem, which will be specific to the insurer’s existing IT systems.
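The data-syncing challenge can be sketched as a repeated one-way reconciliation from each legacy system into the overlay's unified store. The sketch below is a deliberately minimal model of one sync cycle; the record layouts, the integer "modified" timestamps, and the idea of keying everything on a record id are simplifying assumptions, and a real implementation would also need conflict handling and write-back.

```python
def sync(legacy_records, overlay_store):
    """Copy new or newer records from a legacy system into the overlay
    platform's unified store, so that APIs can be served from the
    overlay alone. Returns the ids that were updated this cycle."""
    updated = []
    for rec in legacy_records:
        current = overlay_store.get(rec["id"])
        if current is None or rec["modified"] > current["modified"]:
            overlay_store[rec["id"]] = rec
            updated.append(rec["id"])
    return updated

# Hypothetical feeds from two separate legacy systems.
overlay = {}
policies = [{"id": "P-001", "modified": 5, "holder": "A. Smith"}]
claims = [{"id": "C-100", "modified": 7, "policy": "P-001"}]
first = sync(policies, overlay)
second = sync(claims, overlay)
# A later cycle with unchanged records should copy nothing new.
third = sync(policies, overlay)
print(first, second, third)
```

Once data from the disparate systems lands in one store like this, new solutions can query the overlay's APIs without knowing which legacy system originally held each record, which is what makes later replacement of those systems transparent to them.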
Full system replacement
Here, the insurer moves to an entirely new system. This can address all of the limitations of the existing systems, if done right, but it is normally a years-long process, and in the worst case, it can block all innovation while it is being carried out.
Finding the right balance
Unsurprisingly, there is no one-size-fits-all solution for either insurers or InsurTech solutions. It is useful to highlight some approaches that do work well in practice:
- For stand-alone solutions that help a specific team be more effective, setting up a batch transfer process for output and using human integration for input (if required) can be an effective and low-cost way to integrate. For example:
- Fraud detection solutions can analyze claims in order to provide alerts on suspicious claims to the SIU team. Once the SIU team has triaged the alerts, they can access the claims management system to mark those that are under investigation in order to prevent payment until the investigation is resolved.
- Business intelligence solutions can analyze the progress of claims through the claims management system and provide feedback to the head of claims. The head of claims can then make process changes as desired.
- Risk analysis solutions can provide risk modeling tools to underwriting based on analysis of historical results, and provide advice for underwriters to apply in calculating pricing.
- Targeted APIs are an effective way to handle integrations that require real time input or output. It is important to choose the point at which the integration should be done to minimize the API work that is needed. For example:
- Claims automation solutions can offer a greatly improved user experience for claim declaration (e.g., by giving immediate feedback to the user that a required document is missing). Allowing the solution to collect all the details of the claim and then write the result to the claims management system may be more efficient than integrating for each task (document validation, fraud detection, coverage assessment, etc.).
- Document analysis solutions can automate the extraction of information from documents. It may be more efficient to build a single, generic API for analyzing documents than to build a specific one for invoices, another for accident reports, and so forth.
- An overlay platform can be a powerful solution when applied to a specific problem. For example, customer communication tends to suffer when there is a multitude of systems in place, as the person the customer is speaking to may not know, or be able to access, the information they need. This problem can be addressed by building or buying a customer relationship management (CRM) platform that unifies the data for the customer-facing teams.
Increasingly, the insurance industry is embracing new technology and new approaches to drive innovation and transformation. Yet, many of these strategic initiatives are predicated on the need to access and make sense of the vast amounts of data siloed away in existing core systems. While there are several ways to break down the walls and make data accessible to the new technologies that will make use of it, not all approaches are created equal. Success requires understanding the requirements of the project and implementing an integration strategy that best meets those unique needs.