Harnessing the Power of AI in Insurance

For insurers to harness the power of AI, they need to embrace not just a new technology, but a new philosophy—call it IntegrationOps or DevIntOps.


There are few trends that stand to revolutionize the insurance industry as much as the widespread adoption of artificial intelligence (AI). Rising InsurTech superstars demonstrate how AI technology will redefine the insurance landscape. Lemonade, for example, uses AI for everything from underwriting to claims processing and customer care. Root Insurance is one of the best-known auto insurance providers to collect and algorithmically analyze driver behavioral data to price premiums. The benefits of AI to insurers include competitive advantages in pricing and underwriting, improved claims management, data-driven development of new insurance products, and the ability to scalably deliver a superior customer experience. According to research from McKinsey: “AI and its related technologies will have a seismic impact on all aspects of the insurance industry.”

But not all players in the insurance ecosystem stand to benefit uniformly from the advances in data collection and analytics. Specifically, insurance providers with legacy IT systems (and data stored in monolithic core applications) will struggle to reap the benefits of the AI revolution.

Here’s why—and what they can do about it.

Insurers need a unified data foundation for AI

Because of the sector’s diversity, there are few silver bullets or “one-size-fits-all” approaches in insurance. An insurer providing life insurance faces a very different set of actuarial and distribution challenges than a provider of homeowners or renters insurance.

Nevertheless, every insurer faces a common set of data prerequisites when designing and implementing AI-driven solutions. Insurers operating monolithic core applications tend to come up against two massive challenges:

  • Centralizing data for analytics. Building algorithms—whether to optimize distribution, underwriting, claims management, or even customer interaction—requires the ability to collect and centralize all relevant data. For example, an auto insurer that wants to implement risk-based pricing must connect historical data on policies and claims to streaming real-time IoT vehicle data. This is relatively straightforward for cloud-native InsurTechs, but staggeringly difficult in practice for insurers whose core data may be stored on a mainframe and in core applications like SMART400. Even modern insurance applications (like DXC GraphTalk) are often not designed for easy data exchange. As a result, an insurer that needs to get disparate data to, say, an Azure data lake will find itself navigating a mess of large-grain APIs with internal dependencies.
  • Closing the loop on data capture. Insurers that store core customer, policy, and claims data in mainframe applications must build the pipes back to those systems. For example, an AI-powered customer web chat that results in the capture of key personal details (e.g., a customer’s vehicle details) must be able to write structured data back to the underlying data store. The notion of integrating external information back into core systems isn’t new; enterprises have long taken an ad hoc approach to it. For example, to accommodate a specific need, they may have regularly pulled in foreign exchange rates, ZIP codes, or interest-rate and actuarial tables via tape, disc, or FTP. But the volume and velocity of external data that must now be integrated are pushing these ad hoc approaches to their breaking point.
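
The risk-based pricing example in the first bullet can be sketched in a few lines once the data actually sits in one place. Everything here is hypothetical: the field names, the sample data, and the adjustment formula are purely illustrative, not an actuarial model.

```python
# Historical policy/claims data, as it might look after extraction from
# a legacy core system (all values are made up for illustration).
historical = {
    "VIN123": {"claims": 2, "base_premium": 900.00},
}

def price_quote(vehicle_id: str, telematics_events: list[dict],
                history: dict) -> float:
    """Join historical claims data with streaming IoT telematics and
    apply a toy risk adjustment: +10% per past claim, plus up to +5%
    for the share of harsh-braking events observed."""
    record = history.get(vehicle_id, {"claims": 0, "base_premium": 1000.00})
    harsh = sum(1 for e in telematics_events if e["type"] == "harsh_brake")
    share = harsh / max(len(telematics_events), 1)
    factor = 1 + 0.10 * record["claims"] + 0.05 * share
    return round(record["base_premium"] * factor, 2)

events = [{"type": "harsh_brake"}, {"type": "smooth"}]
quote = price_quote("VIN123", events, historical)  # 900 * 1.225 = 1102.5
```

The hard part for a legacy insurer is not this join; it is getting both data sets into a single environment where code like this can run at all.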

Traditional approaches don’t deliver

Traditional approaches to creating data pipelines for enabling AI in insurance involve middleware (e.g., an ESB or SOA) to access data from underlying applications. In practice this approach presents a number of challenges:

  1. Complexity and technical debt as more middleware layers become intertwined, requiring more specialized skillsets
  2. Negative impact on API speed and performance as data must pass through multiple complex ESB layers
  3. Legacy development expertise required for implementation and maintenance

The last point is an important one. Digital-services developers often lack the context to access and parse data from, say, a COBOL copybook in order to build APIs. And the experience deficit goes both ways: mainframe developers rarely have expertise in creating digital services on top of, say, CICS transactions. The net-net is that development with data from mainframe applications tends to be slow and manual.
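
To make the copybook point concrete, here is a minimal sketch of what parsing a fixed-width mainframe record involves. The copybook, field names, and widths are all hypothetical.

```python
# A simplified COBOL copybook describing the record (hypothetical):
#
#   01 POLICY-REC.
#      05 POLICY-ID    PIC X(8).
#      05 HOLDER-NAME  PIC X(20).
#      05 PREMIUM      PIC 9(6)V99.

FIELDS = [("policy_id", 8), ("holder_name", 20), ("premium", 8)]

def parse_record(raw: str) -> dict:
    """Slice a fixed-width record into named fields per the copybook."""
    out, pos = {}, 0
    for name, width in FIELDS:
        out[name] = raw[pos:pos + width].strip()
        pos += width
    # PIC 9(6)V99 has an *implied* decimal point: the last two digits
    # are cents, with no "." stored in the data itself.
    out["premium"] = int(out["premium"]) / 100
    return out

record = "PX100001" + "JANE DOE".ljust(20) + "00012550"
parse_record(record)
# → {'policy_id': 'PX100001', 'holder_name': 'JANE DOE', 'premium': 125.5}
```

Real copybooks add packed-decimal (COMP-3) fields, EBCDIC encoding, and nested groups, which is precisely why this knowledge tends to live with mainframe specialists rather than digital-services teams.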

AI for legacy insurers demands a shift in technology—and mindset

Ultimately, the only path for insurers heavily dependent on legacy systems to exploit AI is to radically democratize access to their monolithic core systems. This requires both a technology shift and a revolution in mindset and practices.

On the technology side, insurers must embrace digital integration technology that’s specifically designed to:

  • Connect directly with the underlying mainframe or legacy architecture, bypassing the middleware and using standard tools and languages for deployment.
  • Automatically generate clean, standard microservices-based APIs or serverless functions so that data can be centralized into a data lake for modeling and enrichment.
  • Make the underlying application a “first-class” citizen by feeding AI-powered data back into a mainframe environment.
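
The second requirement, generating clean APIs over legacy data, might look something like the following serverless-style handler. The legacy lookup is stubbed out, and all names, fields, and values are hypothetical.

```python
import json

def get_legacy_policy(policy_id: str) -> dict:
    """Stub for a call into the legacy core (in practice this might be a
    CICS transaction or direct data access). Values are made up."""
    return {"policy_id": policy_id, "status": "ACTIVE", "premium": 125.50}

def handler(event: dict) -> dict:
    """Serverless-style function exposing the legacy record as a clean
    JSON API: the kind of artifact generated integration tooling aims for."""
    policy = get_legacy_policy(event["pathParameters"]["policy_id"])
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(policy),
    }

response = handler({"pathParameters": {"policy_id": "P-100"}})
```

The third requirement is the mirror image: a similar generated function that accepts JSON and writes it back into the core system’s data store, so the mainframe stays the system of record.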

These technology requirements are relatively straightforward. But they’re not enough in and of themselves. They must be accompanied by a cultural shift.

Over the past few decades, there has been a groundswell of support for practices that enable more autonomous, decentralized development throughout the enterprise. DevOps, for example, embraces the automation and standardization required for software development teams to contribute independently. DevSecOps marks the same revolution on the security side: giving individual software teams the tools, frameworks, and ultimately the responsibility to uphold enterprise security.

The sweeping democratization of IT has touched virtually every aspect of the enterprise—but not integration.

Integration, particularly with monolithic core applications, remains the final frontier. It is still the purview of specialized teams, largely overseen by centralized IT organizations executing against an 18- to 24-month roadmap. This places a chokehold on the ability of R&D teams to access critical data from core systems and use it to deliver business innovation and better customer experiences.

For insurers to harness the power of AI, they need to embrace not just a new technology, but a new philosophy. Call it IntegrationOps or DevIntOps. Whatever the name, it’s an organizational mindset that acknowledges that agile, automated integration with core systems is the only way to fuel the next wave of great AI insurance breakthroughs.


Zeev Avidan // As Chief Product Officer and VP of Product, Zeev Avidan defines the roadmap of OpenLegacy’s Microservice-based API Integration & Management Software, aligns its features with market needs, and brings the software to market. Avidan ensures that the OpenLegacy product delivers the right features to meet customers’ growing needs. Over more than 15 years, he has held leadership positions delivering information technology solutions within enterprise IT departments and at consulting firms, most recently at the leading credit card company Isracard and at Hilan Tech. He also served as a senior consultant in the Israel Defense Forces, dealing with advanced computing systems.
