A Glance Back over the Last Two Decades of Insurance Technology

Novarica’s Matthew Josefowicz remarks on some of the greatest changes that have taken place in insurance IT since the year 2000.


As we stand at the pivot not only of a year but of a decade, IIR took the opportunity to talk to some of the insurance technology industry’s most prominent analysts. For this article, I had the great pleasure of speaking with Matthew Josefowicz, President and CEO of research and advisory firm Novarica (Boston). Josefowicz has been observing the industry for roughly 20 years, so the conversation presented an opportunity to explore not only the turn of the decade, but how technology has changed in the new millennium, and how some of those changes have shaped the role of technology in the insurance enterprise.

For example, at the turn of the millennium, insurance, along with every other industry reliant on information systems, faced the prospect of the dreaded Y2K issue: systems built to recognize years by only their last two digits could not, without remediation, handle the rollover to the year 2000. “Y2K remediation was a big spend in 1999, creating the idea that technology was a sinkhole of expense,” Josefowicz notes.

If Y2K was the main event as 2000 loomed, the Internet was a sideshow. Arguably, people started going online in large numbers around 1995, but participation remained largely confined to universities and tech enthusiasts. “In 2000, the idea that anybody would run anything but a mainframe was laughable, and direct-to-consumer for anything but non-standard auto would have been considered absurd,” Josefowicz says.

That began to change rapidly in the early 2000s, as “e-commerce” became a concern for businesses, including household-name insurers. Also, as programming evolved, the limitations of mainframe systems were increasingly felt. “Mainframe went from being the quintessential core mission-critical system to something understood to be inflexible by nature, and thus an impediment to business flexibility,” Josefowicz elaborates. “The core system vendor market began to offer alternatives.”


As the vendor market matured, carriers began to ask the “buy versus build” question, and the best-of-breed strategy emerged, whereby insurers could select the best-suited components from various vendors. By the end of the first decade of the new millennium, insurers saw the advantage of the “integrated suite” of components from a single vendor.

The first decade of the 2000s laid the groundwork, with probably the most significant event being the launch of the iPhone in June 2007, which set the stage for the broader smartphone revolution in consumer behavior. “The expectations set by the smartphone were the biggest thing of the last decade, setting the stage for the rise of Amazon, mobile banking and social media,” says Josefowicz. “It completely reset consumer expectations.”

Over the last decade, the web morphed into digital and became mission-critical, according to Josefowicz. Insurers began adopting Agile development as an aspiration for how to develop new systems, if not the established goal in every case. Infrastructure underwent a significant shift, with cloud adoption reaching a kind of tipping point. As a highly regulated industry, insurance had shied away from the use of cloud for anything touching sensitive personal information. However, Josefowicz notes that that has changed as carriers concluded that cloud providers were better than insurers at information security. “Insurers have gone from a posture of, ‘We’ll never use cloud’ to ‘Of course, we’ll use cloud!’,” he says. “It’s gone from being a security and control concern to a flexibility and agility asset.”

Static IT Spending

One might be tempted, when recounting these secular changes in technology architecture and infrastructure, to think that the insurance industry’s view of technology has totally changed. However, Josefowicz observes, the way insurance company leadership thinks about technology may not have changed as much as one might think. “Insurers are still spending only 3 to 5 percent of premium on technology, though they now have business processes totally dependent on digital communications, data management and analytics,” he says. “The IT spending ratio hasn’t changed appreciably during the last decade and a half.”

Should it have? During that time, insurers have been steadfastly focused on “doing more with less.” Is it possible that the gains of process automation have enabled spending levels to remain more or less constant? Josefowicz regards that as an improbable scenario. “I guess it’s possible that the falling cost of technology widgets and the rising number of required technology widgets have moved in lockstep; however, it’s more likely that insurers have said to themselves, ‘We spent 3.5 percent on IT last year and must do the same next year,’” he says. “In other words, it’s more likely a case of management inertia than optimized capabilities spending in many cases.”

The Art of the Impossible

It’s not that insurers haven’t been able to do more with less—certainly they have. However, Josefowicz notes, while the first wave of innovation is about doing things more efficiently, the second wave is about doing totally new things. “It’s about things that never would have even occurred to us before, such as using drones to take a picture of a roof during a natural catastrophe,” he says. “It’s about things that would have been prohibitively expensive that suddenly become possible.”


Anthony R. O’Donnell // Anthony O'Donnell is Executive Editor of Insurance Innovation Reporter. For nearly two decades, he has been an observer and commentator on the use of information technology in the insurance industry, following industry trends and writing about the use of IT across all sectors of the insurance industry. He can be reached at [email protected] or (503) 936-2803.
