At Analyze Re, we are often asked how reinsurance can benefit from faster analytics. We generally like to draw parallels with the many industries that have successfully leveraged real-time analytics to achieve performance and efficiency gains. In this post, we examine how improved analytics have worked for the capital markets and how the reinsurance industry can benefit from a similar approach.
Use of Technology Within the Capital Markets
During the late 1990s, many hedge funds explored the use of technology to improve their analytical capabilities with the goal of improving profits. In the most extreme case, advances in real-time technology enabled the evolution of algorithmic and high-frequency trading (HFT), a disruptive development that permanently changed the way the market worked.
The reasoning behind the advance of analytics technology was simple: any fractional reduction in the time it took to consume and process data increased a company's competitive edge relative to its peers. In other words, given a predetermined strategy, a company that could identify opportunities in the marketplace and execute on them faster had an advantage over its competition.
And while this might seem far removed from the insurance world, it is important to recognize that many of the same principles can also benefit the insurance industry.
In fact, many of the risk measures we use in insurance today stem from counterparts originally created in the capital markets. For example, our use of Probable Maximum Loss (PML) as a metric for downside risk can be traced back to the pioneering of Value at Risk (VaR). VaR was created to serve one simple need: to understand, on a daily basis, the downside risk of the portfolios managed at J.P. Morgan.
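To make the parallel concrete, both measures are quantiles of a distribution of outcomes. Below is a minimal sketch, using randomly generated data purely for illustration, of how a 99% one-day VaR can be read off a simulated P&L distribution, and how a 250-year PML can be read off a simulated annual loss distribution in exactly the same way.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical daily P&L samples for a traded portfolio (in $M).
daily_pnl = rng.normal(loc=0.5, scale=10.0, size=100_000)

# 99% one-day VaR: the loss exceeded on only 1% of days.
var_99 = -np.percentile(daily_pnl, 1)

# Hypothetical simulated annual losses for an insurance portfolio (in $M).
annual_losses = rng.lognormal(mean=2.0, sigma=1.5, size=100_000)

# 250-year PML: the annual loss exceeded with probability 1/250,
# i.e. the 99.6th percentile of the annual loss distribution.
pml_250 = np.percentile(annual_losses, 100 * (1 - 1 / 250))

print(f"99% one-day VaR: ${var_99:.1f}M")
print(f"250-year PML:    ${pml_250:.1f}M")
```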
The success of VaR led to the spin-off of the RiskMetrics Group from J.P. Morgan to serve the demand for implementing this methodology across the industry.
Reinsurance Analytics Technology
Recognition of the need for more sophisticated analytics has been a big part of recent discussions in the industry, as highlighted in the Xuber Reinsurance Survey.
Thanks to advances in technology, a great deal of sophistication has already been introduced into the insurance space. Most of the advancement in analytics has come from improvements to catastrophe models, which better forecast the risk to an insurance portfolio from a hazard and vulnerability perspective.
Catastrophe models generate a great deal of useful data, and this data is a natural source for measuring downside risk in the pricing of reinsurance. So much so, in fact, that model output has become a standard basis for like-for-like comparison between primary insurers, their reinsurance brokers, and reinsurance providers.
Now, while some cite the limitations of models, the reality is that as the science and technology behind them improve, so does the value of the analytics they provide. Models do a significantly better job of forecasting the impact of catastrophic events today than they ever have before. So if there is a need for more sophistication, what should the focus be?
Innovation of Big Data Analytics within Reinsurance
There are many challenges in performing analytics across whole reinsurance portfolios, the biggest of which is the volume of data. More recently, this volume component has become widely known as "Big Data". This is why data preparation is important: it helps ensure the data is of high quality and consistent.
If we dissect the reinsurance industry's use of this Big Data, we see the following general trends: once individual risks have been priced against the data, companies want to retain it for future reference. There is an ambition to eventually extract more value from this accumulated data, but the approach remains unclear. Only a few toolkits are available today to help, and scale is the biggest bottleneck.
Digging deeper, one common requirement today is to roll up and report on the risk factors for entire portfolios, both for internal risk monitoring and for subsequent reporting to regulators and rating agencies such as A.M. Best. Most companies are able to assess and report their portfolio risk in terms of PML based on catastrophe model output, but this is generally a retrospective and time-consuming exercise.
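To sketch what such a roll-up involves, suppose each contract's catastrophe model output is a year loss table: one simulated annual loss per trial, with trials shared across contracts. The portfolio loss is then the trial-by-trial sum, and the PML is a percentile of that sum. All contract names and figures below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(7)
n_trials = 50_000  # simulated years, shared across all contracts

# Hypothetical year loss tables: one simulated annual loss ($M) per trial.
contracts = {
    "US Wind XL":  rng.lognormal(mean=1.0, sigma=1.8, size=n_trials),
    "EU Flood XL": rng.lognormal(mean=0.5, sigma=1.5, size=n_trials),
    "JP Quake QS": rng.lognormal(mean=0.8, sigma=2.0, size=n_trials),
}

def portfolio_pml(loss_tables, return_period):
    # Roll up per-contract losses trial by trial, then take the
    # percentile corresponding to the requested return period.
    total = np.sum(list(loss_tables.values()), axis=0)
    return np.percentile(total, 100 * (1 - 1 / return_period))

for rp in (100, 250):
    print(f"{rp}-year portfolio PML: ${portfolio_pml(contracts, rp):.1f}M")
```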
What Can Real-Time Analytics do for the Reinsurance Industry?
So what if, like a capital markets firm, a reinsurance company could roll up its portfolio instantly? Would it not then be possible to quickly understand the impact of different scenarios as changes in the market happen, and to tune the underwriting plan accordingly?
Furthermore, given the real-time ability to evaluate existing business against a corporate strategy, could this type of scenario analysis be used to suggest changes to the portfolio that satisfy a change in company strategy? Could it also be used to understand the impact of renewal scenarios on the strategic plan, and to make the strategy more robust and executable in advance of renewals?
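As a minimal sketch of that kind of scenario analysis: recompute the portfolio PML with a candidate deal added and report the marginal change. Run fast enough, this loop can be repeated for every renewal scenario as it arises. Again, all data here is randomly generated and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
n_trials = 50_000

# Hypothetical year loss tables ($M per simulated year).
portfolio = rng.lognormal(mean=1.2, sigma=1.8, size=n_trials)  # current book
candidate = rng.lognormal(mean=0.6, sigma=1.7, size=n_trials)  # deal under review

def pml(losses, return_period):
    return np.percentile(losses, 100 * (1 - 1 / return_period))

# Marginal 250-year PML impact of writing the candidate deal: losses are
# added trial by trial, so diversification is reflected automatically.
base = pml(portfolio, 250)
combined = pml(portfolio + candidate, 250)
print(f"Base 250-year PML: ${base:.1f}M")
print(f"With candidate:    ${combined:.1f}M")
print(f"Marginal impact:   ${combined - base:+.1f}M")
```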
At Analyze Re we have seen the disruptive nature of real-time processing and how it fundamentally changes the way portfolios are constructed (and, by extension, how underwriting strategy is implemented). Just as it did in the capital markets, we believe real-time analytics will lead to a permanent efficiency shift within the reinsurance industry.