(XL focuses on large, bespoke commercial risk. Photo credit: Tuxyso.)
Mike Garceau, who was promoted to chief financial officer of XL Group’s North America P&C business this month, arrived at the company in 2010 as chief operating officer, taking on responsibility for the operational functions supporting XL Insurance’s North America P&C business units, including Property, E&S, Environmental, Excess Casualty, International Casualty, Global Risk Management, North America Programs and the company’s XL GAPS loss prevention consulting group. Garceau was charged with ensuring that operational functions were aligned to XL Insurance’s business strategy, a responsibility that includes oversight of expense, performance and IT management, as well as leadership of the budget and planning process and implementation of strategic projects. One of those projects was XL Insurance’s recent Global Underwriting Platform, built on technology from FirstBest Systems and Accenture Duck Creek. Garceau discusses the initiative and shares his views on how emerging technology is shaping P&C underwriting.
Insurance Innovation Reporter: How important is emerging technology for P&C underwriting in general and large commercial risk in particular?
Mike Garceau: Technology has played different roles over time and will continue to evolve in ways that depend on what business you’re in. We’ve all seen how technology has changed personal auto, which is very different from the large, bespoke commercial business that we do. But one thing that’s just as relevant to a large risk-focused insurer like us is the use of technology to free up underwriters to focus on high-value parts of the process – whether analyzing risks, meeting with customers, marketing or other activities – rather than being tied up in an inefficient platform.
IIR: Relieving underwriters of lower-value administrative tasks is no doubt a welcome development, but underwriters have resisted other aspects of technological change – what we used to call the “art versus science” question. How do you see underwriters adapting to the era of big data?
MG: The best underwriters have always been good at using standard and non-standard information for a particular risk or basket of risks, but the next wave of technology is enabling those underwriters to open up the box even further. Better underwriting platforms have been capturing data, such as loss history, more efficiently, and the sources of information and insight they could potentially tap are expanding significantly. For example, the Internet of Things is driving a proliferation of sensors, which can now be found in everything from buildings to modes of transportation to produce. Technology still needs to evolve to harvest that data and derive insights from it, and I think that’s where we are today. The best underwriters will continue to be those who can bring the art and science together, and tomorrow’s underwriters will have a much wider palette of information sources to choose from.
IIR: What are some of the ways that palette is expanding today?
MG: Many companies are trying to do better with the information they already have. In the past, you might focus on your book of bound accounts and glean insights from loss history. With more efficient platforms we’re able to capture data from all accounts that are submitted to us – we’ve expanded the pool of data to include everything and not just the business we bind. We’ve found that there are lots of interesting insights you can derive by applying a big data approach to the data you already have.
IIR: Insurers have long had the rap of being data rich and information poor.
MG: Our approach is that we have to be cutting-edge in our ability to recognize non-traditional indicators that are predictors of risk, while at the same time making the best use of traditional data. Everyone will work on accounts they bind, but in an environment of manual processes, they won’t find time to explore business they’ve decided not to bind. Our global underwriting platform allows us to efficiently gather information from every submitted account.
IIR: What was your strategy in implementing your underwriting platform?
MG: We took a best-of-breed approach, using technology from FirstBest Systems and Accenture Duck Creek. We leveraged FirstBest’s technology to deliver an underwriting desktop with integration to Accenture’s Duck Creek policy administration capabilities for rating and policy issuance. The global underwriting platform also integrates with our homegrown .NET pricing solution as well as additional technology tools within XL’s framework. One of our next releases will include integration with Xuber’s Genius policy administration solution, Location Management System (NIIT) and RMS.
The initial release of the system, supporting general liability, was completed in 2012, and we have continued to deliver additional products and enhancements to the platform. Most recently we implemented the platform for our loss-sensitive products, and the next release is scheduled for Global Property. We plan to expand the delivery of select capabilities of the global underwriting platform to a broader set of products in 2015. By the end of 2015, our target is to have 90 percent of our North America P&C premium supported by the platform.
Upon capturing the data from each of the aforementioned systems, we have established processes to correlate the data in our data warehouse infrastructure, where we apply analytics tools such as SAS Visual Analytics to gain valuable insights into our risk exposures. The correlated data gives our actuarial and underwriting teams the opportunity to optimize our underwriting processes and refine our risk appetite, based on historical trends captured in our new technical infrastructure.
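The warehouse-side correlation step Garceau describes – landing data from each source system, joining it on a common account key, and exposing an aggregated view for analytics – can be sketched as follows. This is a minimal illustration with a hypothetical schema and toy figures, not XL’s actual warehouse design; the table and column names (`submissions`, `losses`, `tiv` for total insured value) are invented for the example, using SQLite as a stand-in warehouse.

```python
# Minimal sketch (hypothetical schema): join submission records with
# loss records on an account key, then aggregate by industry so an
# analytics tool can compare exposure against paid losses.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE submissions (account_id INTEGER, industry TEXT, tiv REAL);
CREATE TABLE losses (account_id INTEGER, paid REAL);
INSERT INTO submissions VALUES (1, 'manufacturing', 5.0), (2, 'retail', 2.0),
                               (3, 'manufacturing', 8.0);
INSERT INTO losses VALUES (1, 0.4), (3, 1.1), (3, 0.2);
""")

# Pre-aggregate losses per account before joining, so accounts with
# multiple losses don't double-count their insured value.
rows = conn.execute("""
    SELECT s.industry,
           SUM(s.tiv)                      AS total_tiv,
           COALESCE(SUM(l.total_paid), 0)  AS total_paid
    FROM submissions s
    LEFT JOIN (SELECT account_id, SUM(paid) AS total_paid
               FROM losses GROUP BY account_id) l
      ON l.account_id = s.account_id
    GROUP BY s.industry
    ORDER BY s.industry
""").fetchall()

for industry, tiv, paid in rows:
    print(industry, tiv, round(paid, 2))
```

The LEFT JOIN matters here: it keeps submitted-but-loss-free accounts in the view, which is exactly the pool of non-bound and clean business the interview emphasizes.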
IIR: What were your ambitions for the resulting platform?
MG: Our fundamental objective has been to create a highly efficient operating environment that allows us to process business and to capture the information that comes to us in a traditional way – from submitted accounts all the way through to bound accounts. You can’t do that if your environment is clunky. Bear in mind that we bind much less business than is submitted to us. You can imagine that if you don’t have an efficient environment, it can get ugly quickly.
IIR: How does one get from the creation of an efficient environment to reaping the benefits of big data?
MG: Well, I would observe first that big insights matter more than big data. It’s great to have a massive pool of data, but deriving insights isn’t as easy as one might think. That’s the challenge: how do you bring data together to get at what’s actionable – and I think that’s what everyone is trying to figure out today.
We have teams of people looking at the traditional data that already comes in, and we encourage them to think outside the box. The underwriters will identify the predictors of risk they lean on, and then our statistical scientists will evaluate whether those predictors prove out. We’ll look at, say, five predictors they rely on today and then search for others that are more outside the box. We’ll confirm or reject certain indicators, and sometimes combinations of indicators, by looking through our own loss history. That’s one way we’re trying to convert big data into big insights.
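The confirm-or-reject step described above can be sketched as a simple screening pass: for each candidate indicator, compare historical loss frequency with and without the indicator present. This is an illustrative toy, not XL’s methodology; the indicator names, the `had_loss` flag, and the sample records are all invented, and a real evaluation would use actuarial techniques on far more data.

```python
# Illustrative sketch (hypothetical indicators and toy data): screen
# candidate risk predictors by comparing loss frequency on accounts
# where the indicator is present vs. absent, as a first filter before
# deeper statistical review.
from collections import defaultdict

# Toy bound-account loss history.
accounts = [
    {"sprinklered": 1, "coastal": 1, "had_loss": 1},
    {"sprinklered": 1, "coastal": 0, "had_loss": 0},
    {"sprinklered": 0, "coastal": 1, "had_loss": 1},
    {"sprinklered": 0, "coastal": 0, "had_loss": 1},
    {"sprinklered": 1, "coastal": 1, "had_loss": 0},
    {"sprinklered": 0, "coastal": 1, "had_loss": 1},
]

def loss_frequency_lift(accounts, indicator):
    """Ratio of loss frequency when the indicator is present vs. absent.

    A lift well above 1.0 suggests the indicator predicts higher risk;
    well below 1.0, lower risk; near 1.0, no signal worth pursuing.
    """
    counts = defaultdict(lambda: [0, 0])  # flag value -> [losses, total]
    for a in accounts:
        counts[a[indicator]][0] += a["had_loss"]
        counts[a[indicator]][1] += 1
    freq = {flag: losses / total for flag, (losses, total) in counts.items()}
    return freq[1] / freq[0] if freq.get(0) else float("inf")

for ind in ("sprinklered", "coastal"):
    print(ind, round(loss_frequency_lift(accounts, ind), 2))
```

On the toy data, “sprinklered” shows a lift well below 1.0 (losses are less frequent with sprinklers) while “coastal” shows a lift above 1.0, which is the kind of preliminary signal the statistical team would then test rigorously.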
IIR: Where do emerging external sources of data come in?
MG: The most interesting, cutting-edge activity is related to non-traditional data sources, and the challenge is having an efficient environment and the right skills to mine the data. The emerging frontier is the potential sources of data that the industry has yet to correlate to loss. The Internet of Things could yield a variety of insights from sources such as sensors used in buildings, for example, or machinery such as pumps and valves. The question is how the industry will tap into that to do a better job of risk selection.
It’s not a straightforward proposition. Statistical models are great in a steady-state environment, looking at a set of risks and identifying probability. However, models aren’t good at detecting inflection points. For example, some of the warning signs relating to the credit crisis that were discovered retrospectively would not have shown up on statistical models. On the other hand, some of the emerging exposures could have been foreseen through good judgment.
IIR: That brings us back to the science-versus-art issue.
MG: Yes, picking from the palette of available information in the right way still requires a big dose of art. The science allows the palette to be more expansive. There’s still a substantial amount of judgment related to the large commercial risks that XL is focusing on.
IIR: And how is that insight manifested in the way XL has invested in underwriting-related technology?
MG: As we’ve discussed, it shaped the development of an efficient base platform that gives us all the fundamental capabilities. As we analyze our own information, we want to make sure that the platform can deliver that to the underwriter in a usable way. Typically, an underwriter will find an interesting predictor of risk and then all of a sudden a spreadsheet gets created. We’re trying to use technology to deliver those insights to the desktop – it’s a question of usability: the more that we can deliver at the desktop when it’s needed, the more they can assimilate that into their underwriting. And as we tie into less traditional data sources, we’ll look at how we can best incorporate that, delivering it in the tools that they’re already using. We’ve designed the technology to enable that.