Five possible ways to solve the ‘autopilot problem’

If you Google ‘How do you solve the autopilot problem’ you’ll get a tsunami of news articles on last year’s fatal accident involving Tesla’s self-driving car.

The victim, Joshua Brown, a former Navy SEAL, was allegedly watching a Harry Potter movie at the time of the accident. His car collided with a truck while travelling at somewhere between 60 and 100 miles an hour; estimates vary.

As American motoring writer Alex Roy points out in his excellent analysis of the crash, a web search for “Tesla accident death” yields 200,000+ stories with ‘Driverless’ or ‘Self-Driving’ in the headline.

The piece goes into detail about what happened and why. As he points out, when you set the car to autopilot it comes with a strong warning: “The system is in beta. Be prepared to take over at any time. Pay attention.”

On the other hand, if you Google ‘How do you solve the autopilot problem in the insurance industry’ you’ll be bombarded with material about the insurance issues surrounding self-driving cars.

High up in the results you’ll also find MS Amlin’s White Paper, which isn’t about self-driving cars at all. It’s about how to make sure that underwriters don’t become over-reliant on models when making their decisions about risk.

Maybe their models should also come with a warning: “It’s important to apply experience, knowledge and common sense to the results delivered by the model.”

As Anders Sandberg, an Oxford academic who worked on the project, points out: “If you’re insuring a 28-storey skyscraper in a hurricane zone, it might be a good idea to check if it’s not made out of bamboo and being built two miles out at sea.

“If underwriters become over-reliant on models it might also result in a degradation of their skills because they aren’t using them anymore.”

Skill degradation is one of the main features of the ‘autopilot problem’ as outlined in our previous article on the topic.

A lack of situational awareness left the pilots of Air France Flight 447, from Rio to Paris, unable to stop the aircraft crashing into the Atlantic, killing all 228 people on board.

Over-reliance on what proved to be a flawed financial model led to 25,000 people losing their jobs at Lehman Brothers, the 158-year-old US bank that was forced to file for bankruptcy. The collapse sparked turmoil in the financial markets that is still being felt today.

It is the latter event that has been the catalyst for MS Amlin’s working party, which has brought together leading academics from the Oxford Martin School and senior figures in the insurance industry to take a look at the problem.

One of the main drivers has been the need to make sure that the problems that have beset banking aren’t replicated in the insurance sector.

So how do you solve the problem of over-reliance on modelling? Well, the team came up with five main approaches that may help get closer to a solution.

Retrain the pilot
As the paper says: “Retraining the pilot is the most obvious solution. Since the ‘pilot’ performed at a higher level without the ‘autopilot’, it seems reasonable to assume the previous performance can be recaptured in some way, with the correct training, retaining old skills, re-developing situational awareness and so on.”

But there are some issues, such as cognitive biases, that cannot be overcome with training. Other approaches include increasing the accountability of the ‘pilots’ – but that depends on how time pressures are taken into account by those measuring performance.

Another approach might be to get traders or insurance underwriters to make decisions based on their skills alone, without using models. This would also make it possible to compare human performance with model performance.

Change the autopilot
Instead of retraining the pilot, another idea is to change the autopilot – by decreasing the level of automation, for example. The authors of the report point to evidence that the best autopilots are “the ones that support processes of information integration and analysis on the part of the pilots”. Another option is to have the autopilot display its confidence levels, but this is a complex issue: the model would need an accurate calibration of its own uncertainty – and of course calculating that the model is wrong cannot be done from within the model.

Change the pilot’s role
The Oxford Martin academics believe that this route is the most promising for the insurance industry. In the same way that airline pilots have become managers of their plane and take a more active role in certain situations, such as take-off and landing, it’s thought that adjusting the underwriters’ role could be a way forward. As the report says: “Underwriters and other insurance staff are starting not just to make use of their models but to analyse and benchmark them as well. This could involve simple ‘sanity’ checks … they could seek to find tools beyond the autopilot to measure or bound the variables there is uncertainty about.”

Accept the cost of the autopilot problem
“It may be worth keeping the autopilot, even if the autopilot problem can’t be fixed,” say the report’s authors. In financial markets and insurance, sophisticated mathematical models have allowed the trading of previously untradeable securities and the insuring of previously uninsurable risks. There is no sign that automation in driving (with GPS systems) and piloting will be reversed. So the insurance industry will have to accept the costs of modelling – costs which need to be properly assessed and managed, which in itself might mitigate some of the autopilot issues.

Reduce or remove the autopilot
For many insurers this doesn’t seem a realistic option, particularly as regulators expect models to be used. Reducing their use may be an option, if the necessary human skills can be developed to replace them. Another option is to run a range of different models, using human judgement to choose between them – although here the lack of model diversity becomes another issue.

The paper concludes that each situation is different and therefore there are no solutions that apply to every variant of the autopilot problem. It suggests that insurers undertake different types of intervention to try and improve performance in specific areas. “Continuous feedback and monitoring of the intervention is essential, to assess the degree of improvement and what form they take.

“Automation has contributed and will contribute much more to most fields of human endeavour, so attempts to eliminate or reduce the autopilot problem will ensure that it reaches its full positive potential.”
