Ground-Level Data: the Missing Piece in Insurers’ Weather Analysis

While the industry’s standard weather data collection tools have enabled weather predictions to increase in accuracy over the years, they still lack one critical dimension: accurate ground-level data.


Last year was a flagship year for weather disasters, with costs for wildfires, hurricanes, winter storms, flooding and other severe events tallying a record-breaking $306 billion. Hamstrung by inaccurate weather forecasts, many insurers reactively paid out damage claims without knowing the whole story of when and how severely a local weather event struck. This reactivity stems from the industry's current standard weather data collection tools: radars, satellites, weather balloons and stations. And while these tools have made weather predictions more accurate over the years, they still lack one critical dimension: accurate ground-level data.

During a slow, uneventful week of blue skies and average temperatures, you may not hear much about the weather. But when a storm is approaching and threatening winds, hail or severe snowy whiteout conditions, you’ll invariably hear forecasters talk about the performance of the American model or the European model, depending upon how difficult the weather forecast is to make.

These models are the results of a supercomputer's complex calculations that forecasters use to predict weather. And while the models have become dramatically more accurate over the decades, they still fall short of the precision needed to forecast severe storms reliably.

Before you accept or dismiss that claim at face value, let's take a quick look at how weather prediction currently happens. Then we can examine what needs to change to make short-term forecasts good enough to know when to evacuate populations, seek shelter, fortify levees or take any other action necessary to protect property and human life.

Right now, enterprise institutions and agencies working off of current weather algorithms need more granular weather data so that insurance segments can act rather than react. Making decisions about the impacts of storms requires an understanding of exactly how damage occurred.

The Science of Seeing the Future

While local forecasters are a maligned bunch in the public opinion sphere, forecasting the weather is a science that has undergone many generations of improvement. Forecasters first began predicting the weather in the early 1900s. In the 1950s, computers were introduced into the mix in hopes that computational power would improve accuracy, but initial data-based predictions were untrustworthy. By the late 1950s, however, M.I.T. mathematician and meteorologist Edward Lorenz noticed an issue that would lead to changes: one prediction could be thrown wildly off course from another with even the smallest changes in data—the rounding off of a decimal place, for example. This issue still speaks to the core of the problem with today’s algorithmic attempts to predict the weather—imperfect data sets.
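Lorenz's observation can be reproduced in miniature. The sketch below is illustrative only: it integrates Lorenz's own convection equations with a simple Euler stepper (not a real forecast model) and runs the same deterministic system twice from starting states that differ only in the sixth decimal place. The two runs still end up far apart.

```python
# Toy demonstration of sensitivity to initial conditions: two runs of
# the same deterministic system, whose starting states differ by a
# tiny rounding-sized amount, diverge dramatically over time.

def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz system by one explicit Euler step."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

def run(state, steps=3000):
    """Integrate the system for a fixed number of steps."""
    for _ in range(steps):
        state = lorenz_step(*state)
    return state

exact = (1.0, 1.0, 1.000001)   # "true" initial conditions
rounded = (1.0, 1.0, 1.0)      # same state, rounded off

a = run(exact)
b = run(rounded)
gap = max(abs(p - q) for p, q in zip(a, b))
print(f"divergence after 3000 steps: {gap:.2f}")
```

The point is not the specific numbers but the behavior: a perturbation far smaller than any instrument error grows until the two "forecasts" bear no resemblance to each other, which is why imperfect input data caps forecast skill.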

Today, meteorologists don’t rely upon a single algorithm to predict the weather, but rather an amalgamation of several different algorithms filtered through the lens of their own knowledge and experience. As a result, popular weather models have slightly different tendencies when producing predictions. While this combination of algorithms with human intuition has led to an increase in accuracy, weather predictions still aren’t good enough. And the solution, like the problem, lies in the data.
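As a rough illustration of that amalgamation, a point forecast can be blended from several model outputs, with weights reflecting each model's known tendencies. The model names, values and weights below are invented for illustration; real operational blending is far more sophisticated and layers in human judgment.

```python
# Hypothetical sketch of blending point forecasts from several models.

def blend_forecasts(forecasts, weights):
    """Weighted average of per-model point forecasts."""
    total = sum(weights.values())
    return sum(forecasts[m] * w for m, w in weights.items()) / total

# Illustrative 24-hour high-temperature forecasts (degrees F):
forecasts = {"american": 72.0, "european": 68.0, "regional": 70.0}

# A forecaster might weight the European model more heavily in a
# hard-to-call setup, based on its historical performance:
weights = {"american": 0.25, "european": 0.5, "regional": 0.25}

print(f"blended forecast: {blend_forecasts(forecasts, weights):.1f} F")
```

Better ground-level observations improve such a blend twice over: they sharpen each model's initial conditions, and they give forecasters a track record against which to weight the models.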

Without the Past, the Future is Uncertain

As Nate Silver explains in the weather excerpt from his book, “The Signal and the Noise,” forecast uncertainty is so large because the “problem with weather is that our knowledge of its initial conditions is highly imperfect.” Much of our weather data is gathered by weather balloons, radar and satellite imagery, which can be plagued by large observation gaps. And, unfortunately, this problem is not getting any better. Recently proposed budget cuts to the National Weather Service could further inhibit weather data collection. In Alaska, for example, weather balloon launches are regularly being missed, and even when launches are consistent, gaps remain.

All of these shortcomings contribute to one central problem: current data sets are lacking. What we need now are ways to fill the observation gaps by collecting more complete and accurate weather information.

Instead of relying on decades-old weather technology, we need new methods of collecting weather data, which can then, in turn, be fed into these weather models to create precise short-term forecasts that can be relied upon to safeguard human life and property. In addition, better forecasts and a better understanding of storms allow businesses to be proactive rather than reactive in the face of severe weather events.

Getting Data from the Ground Up

While weather observations from above, such as radar and satellite, have propelled weather predictions on their journey to help save lives, accurate data can only be achieved by directly observing weather on the ground.

This is the missing link in current meteorological efforts—what we might call “ground-truth” weather data.

Ground-truth weather data provides real-time, hyper-local readings of conditions such as hail, wind speed and direction, rainfall, temperature and humidity. And while crowdsourced weather station networks, such as Weather Underground, work toward this end, uniformity in both collection methods and equipment is necessary to gather truly reliable data. Some crowdsourced weather stations are sited incorrectly: next to a home, near an air conditioner, or, often, never making it out of the garage. All of these siting problems degrade the quality of the entire data set. If your data set contains compromised measurements, you effectively have data that rounds to a different decimal place.

What is needed is a network of weather-data collection devices that fill in the gaps left by traditional equipment: a network that delivers high-quality measurements, backed by an entity committed to installing and maintaining those devices. With forward-looking weather models that apply advanced physics and chemistry to this higher-quality data, not only will forecasts become increasingly accurate, but a variety of industries will benefit. Farmers will better understand weather impacts on crops and make optimal decisions to maximize yield. Insurance companies will know which of their policyholders have damage from weather conditions like hail, so they can handle the entire claim and repair process on behalf of their customers. Understanding weather accurately requires change, and whether from private sources or the public sector, that change is already starting to happen.


Alex Kubicek

Alex Kubicek is the Founder and CEO of Understory. He started the company after receiving a master's degree in Atmospheric Science. His lifelong interest in weather drove him to create award-winning curriculum for Weather and Climate 101 at UW-Madison, receive an "Outstanding M.S. Thesis" award for his work on cloud microphysics and hail formation, and revolutionize the weather space with next-generation weather stations. Kubicek has led Understory through an accelerator program, a hardware seed stage, and an institutional financing of $2M led by True Ventures.
