In announcing what it describes as the “insurance industry’s first cyber risk modeling tool,” Willis Re’s new product may be the first shot in a battle among reinsurers and modeling firms to measure and underwrite the growing field of cyber risk.
The new model – PRISM-Re – will analyze an insurer's portfolio-wide susceptibility to data breach events, using a 'common shock' methodology to capture the possibility of contagion behavior, according to a company statement issued yesterday. Based on the latest exposure data, the model estimates the "frequency of data breaches and the potential severity of insured losses arising from those events."
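The statement does not disclose how PRISM-Re implements its 'common shock' methodology, but the general idea can be sketched: a single shared factor scales every insured's breach probability in a simulated year, so breach events cluster across the portfolio instead of occurring independently. All parameters below are illustrative assumptions, not Willis Re's actual formulation:

```python
import random

def simulate_portfolio_breaches(base_probs, rng, shock_sigma=0.5):
    """Draw one year of breach indicators for a portfolio.

    base_probs  -- each insured's standalone annual breach probability
                   (assumed inputs for illustration)
    shock_sigma -- volatility of the shared lognormal shock factor
    """
    # One shock draw per simulated year, applied to every insured:
    # this shared multiplier is what induces contagion-like clustering.
    shock = rng.lognormvariate(0.0, shock_sigma)
    return [rng.random() < min(p * shock, 1.0) for p in base_probs]

rng = random.Random(7)
year = simulate_portfolio_breaches([0.02, 0.03, 0.05], rng)
```

In a year where the shock draw is high, every insured in the portfolio is more likely to suffer a breach, which is the correlation behavior a portfolio-level reinsurance model needs to capture.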
Revenue is used as an exposure measure in the new model, says Alice Underwood, executive vice president and Head of Analytics at Willis Re North America, adding that the model pulls information from the insurer's portfolio of cyber policies. "If the insurer does not have revenue data for each insured, we will go and pull it for them."
Working with a database of 16,000 cyber loss events provided by Risk Based Security (RBS), the Willis model then performs a three-stage analysis, according to Underwood.
First, it determines the likelihood of an event, measured as an annual rate based on industry sector and company revenue, using parameters determined by regression analysis. Then the number and type of records affected are determined for each simulated event. The third stage calculates the costs to the insurer under the particular policy.
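The three stages above amount to a classic frequency-severity simulation, which can be sketched as follows. The sector base rates, revenue scaling, record-count distribution, and policy terms are all illustrative assumptions; the article does not disclose PRISM-Re's actual regression parameters:

```python
import math
import random

# Assumed annual breach frequencies by sector (illustrative only)
BASE_RATES = {"healthcare": 0.040, "retail": 0.025, "finance": 0.030}

def annual_breach_rate(sector, revenue_musd):
    """Stage 1: annual event rate from industry sector and revenue
    (assumed log scaling, standing in for the fitted regression)."""
    return BASE_RATES[sector] * (1.0 + 0.1 * math.log10(max(revenue_musd, 1.0)))

def records_affected(rng):
    """Stage 2: records affected per simulated event (assumed lognormal)."""
    return int(rng.lognormvariate(9.0, 2.0))

def insured_loss(records, cost_per_record=0.20, retention=250_000.0,
                 limit=5_000_000.0):
    """Stage 3: the insurer's cost under the policy terms
    (assumed per-record cost, retention, and limit)."""
    gross = records * cost_per_record
    return min(max(gross - retention, 0.0), limit)

def poisson(lam, rng):
    """Poisson event count via Knuth's multiplication method."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        k += 1
        p *= rng.random()
        if p <= threshold:
            return k - 1

def simulate_year(sector, revenue_musd, rng):
    """One simulated policy year: draw event count, then a loss per event."""
    rate = annual_breach_rate(sector, revenue_musd)
    return sum(insured_loss(records_affected(rng))
               for _ in range(poisson(rate, rng)))
```

Run across many simulated years, a loop like this yields a loss distribution per policy; in the actual model, the parameters at each stage would be fit to the RBS event database.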
“Being a first generation model it will of course develop over time,” Underwood says. “It can also play with other third party models, so the client can see how cyber risk interacts with other perils.”
Many of the largest catastrophe modeling firms, including AIR Worldwide and Risk Management Solutions (RMS), are on the record as developing cyber risk models.
Any model attempting to measure cyber risk will need to accept myriad data points across several inputs in order to succeed, says James Cebula, manager of the Cybersecurity Risk Management Team within the CERT Division at the Software Engineering Institute at Carnegie Mellon University.
And while the mantra of cyber risk is “detect, protect and respond,” it is still early days in measuring the consequences of an attack.
"For a business modeling cyber risk, you need to start at the beginning and reach out to all the databases and determine what information assets are tied to revenue-generating business functions in order to determine the possible consequences of an attack," Cebula says. "That's not easy to capture in any model, and that's why our research is focusing on the consequences. That's the endgame that everyone is trying to get to."
The Willis model will attempt to keep up with the changing nature of cyber risk, especially when it comes to threat vectors.
“The RBS database is very robust and is constantly being updated,” Underwood explains. “Those threat dynamics are in a constant state of change.”
With any model of cyber risk, insurers will also need to understand the limitations of the output, says CERT's Cebula, adding that while the severity of an attack may be measurable, frequency is more difficult to estimate since data is limited.
"The hard, repeatable data, that information is too hard to come by and hard to derive since it varies," he explains. "Insurers are used to dealing with actuarial tables, and there is some confidence in the predictability of identifiable losses within a range. In the cyber risk insurance space, that sort of determination is still a work in progress."