Interconnected insurance
September 22 2017, by Nick Ferguson
Dragging insurers into the digital age is a work in progress — and one that is being facilitated by technology providers such as Equinix, a data centre specialist.
In the financial services sector it has been particularly active in working with banks, investors and stock exchanges to provide the kind of extremely low-latency co-location that allows algorithms to trade stocks milliseconds faster than their competitors.
Solving the challenges faced by high-frequency traders is no mean feat, but it is ultimately a technological matter suited to a company that specialises in building interconnected networks. The insurance industry poses a very different set of challenges.
Four years ago, the company noticed that its roster of insurance clients was smaller than might be expected. Puzzled, it hired James Maudslay, who had been working with insurance clients in the telecoms industry, to help it understand how to sell its services to the insurance industry.
“The original messaging wasn’t working and needed updating,” says Maudslay, who is now senior manager of field development in Equinix’s enterprise vertical marketing division.
The standard financial services pitch about low latency, close-proximity hosting and the other issues so important to traders simply wasn't valued in the same way by insurance companies.
“We turned things around by showing Equinix how to have relevant conversations with insurers that actually would be meaningful to them.”
Insurers care very little about shaving milliseconds off the speed at which their data travels; they care much more about the security of that data, as well as about complying with strict regulations on data privacy and, increasingly, data sovereignty.
“There’s no question about that at all,” says Maudslay. “Security is always going to be an issue for them because they’re fully conscious of the fact there’s a powerful regulator that will take swift and vigorous action if you lose data or compromise it.”
This concern about security and regulatory sanctions has created a tendency for insurers to keep a tight hold of their data, in racks of file servers in their own server rooms where they can see it. But, with the right messaging crafted by people like Maudslay, insurers are slowly starting to accept that external providers might be better at data security than they are.
“I think most of us have believed that’s been true for quite a long time, but the fact they’re coming round to this now is a major shift.”
Part of the problem is a lack of clarity from regulators, for example on the use of cloud services. In Australia, where well-defined regulations are in place, financial services companies are confident about what they can and cannot do, and this has made it much simpler to take advantage of cloud services. Even in developed markets such as the US and Europe, there is still little certainty around the use of the cloud.
While sensitive customer data creates a regulatory risk for insurers, it is the analytics they perform on such data that provides their real competitive advantage as businesses — and because there is less of a concern from regulators surrounding this type of information, insurers have been more willing to push the boundaries and embrace innovative technological solutions.
One example that Equinix has been involved with in the insurance industry was the development of an open framework for hosting shared catastrophe modelling services, known as the Oasis Loss Modelling Framework. The system allows multiple underwriters to access the Hurloss US hurricane model provided by Applied Research Associates. Instead of each of them having to run an expensive, processor-intensive model that is used only infrequently, they can now use shared equipment operated by a service provider.
“It was interesting that the technical issue took less time to resolve than the emotional problems of getting them to agree to put their data on this particular system — because it’s their secret sauce,” says Maudslay. “So it was an excellent result when that came to pass earlier this year, and now about 10 of them are using it and we would expect to see that expand. Insurers are realising that hosting and operating systems is something they shouldn’t be doing, which is a development that bodes well for the digitisation process they’re undergoing.”
One of the biggest obstacles to digitisation is the problem posed by legacy computer systems. Almost all insurers currently operating in the market are either very old or, worse, the result of multiple mergers between very old companies. Consider, for example, that Singapore Life, established in June this year, was the first new life insurer to enter the local Singapore market in 47 years.
This is not surprising. Forming an insurance company is complicated and incredibly expensive. But what it means is that these old insurance companies all have computing systems that were first put in place decades ago on mainframe computers that are now ageing. It is not just insurers that face this problem, of course. It can be seen within the banking industry when an ATM network goes down, or in the airline industry when a booking system collapses. But it is an acute problem for insurers because of the longevity of policies contained in these creaking old systems.
“They’re becoming increasingly difficult to maintain so regulators are getting concerned about it, but they’re also very difficult to change — very dangerous and very risky to change,” says Maudslay. “And insurers know that if you undertake a major mainframe replacement programme, the regulator will be sitting there waiting for you to make a mistake — and this is a view that’s often expressed to us. Most of them would love to get rid of these mainframes, but the reality is they can’t.”
What they are doing instead is building satellite systems outside the mainframes to handle a particular function, such as claims or customer relationship management, which can be handed off to data centre companies that can provide best-practice security, connectivity and so on.
“At some point, there will be sufficient satellite systems that the mainframe itself will have become, in theory, a glorified address book that can then be ported far more safely to a new and more contemporary system. But that is still a long way off.”
However, it is a task that needs to be done before the programmers who understand these old systems are, quite literally, all dead.
The impetus to make these upgrades is relatively strong in Asia, and particularly in Asean, due to young populations that are increasingly keen to interact with their insurance providers almost entirely through mobile devices, bypassing traditional models. They are also interested in new forms of insurance, such as on-demand policies that are more flexible.
As these types of insurance transactions are increasingly conducted over mobile devices, whether by agents or customers directly, the reality is that providing security at this “digital edge”, as Maudslay calls it, is going to be a challenge — and that will accelerate the transition to reliance on specialist external cloud providers who can genuinely deliver security at the edge.
Change is coming, but it is a slow process that leaves the door open for disruptors. Even though the barriers to entry are high, it remains to be seen if the industry can adapt fast enough.