This morning I was renewing my car insurance while reading the latest news on the “Google Car” – Google is using more advanced AI to allow cars to “think” more like humans – see CNET’s article. Anyway, I was thinking, would I still need insurance if I bought a Google Car? Well, it seems that the government has already given this some thought.
SDCs opening up car use to more people
According to the Department for Transport’s (DfT) ’Pathway to Driverless Cars’ report, one of the chief benefits of the driverless car is the way it could provide access to personal transport for people presently unable or unwilling to drive.
People such as the disabled, the elderly and those who simply haven’t learnt to drive could gain access to personal transport. For example, 31% of women don’t hold a full driving licence, so a self-driving car could make a big difference, particularly for transporting children.
Legal shake-up required
There will need to be a large scale legal shake-up as regards insurance and licensing. For example, who will be to blame in the event of an accident – the car or the driver? Or maybe Google’s John Krafcik?
It’s widely predicted that self-driving cars will reduce accidents, as 90% of them are caused by driver error. This is already being borne out in practice, as some driverless features, such as automatic emergency braking, are reducing low-speed collisions by 20%.
At least in the early stages of self-driving technology, the driver would be more like a pilot in terms of occasionally undertaking some driving functions and optionally overriding automated settings. While this is the case, conventional car insurance would be required.
When fully self-driving cars are in use, it’s likely the courts will have to decide who is liable in the event of an accident. Is it the manufacturer? The software developer (much of the self-driving technology is software and computer-based)? This could cause controversy and perhaps a degree of the ‘blame game’ being played.
For example, might the manufacturer try to blame the maker of a specific component if a malfunction was found to have caused or contributed to an accident? Might the user (or ‘driver’) be blamed if, say, they haven’t kept the car’s software updated?
The Government also say in the ‘Pathway to Driverless Cars’ DfT report that they’ll look to provide “greater certainty around criminal and civil liability in the event of an automated vehicle being in a collision”.
Perhaps the driverless car will be insured and judged on its own risk quotient rather than the driver and their driving record. Perhaps the government will levy an amount on each driverless car sold to pay for a uniform insurance policy covering it.
The ABI (Association of British Insurers) hints at a ‘transfer of risk’ from the driver to the car itself (see “Driverless cars or autonomous vehicles”), and is working with the Government, vehicle manufacturers, the legal community and regulators to tailor a solution reflecting the impending driverless car landscape. It is even possible that a type of taxi insurance will need to be taken out by the owner on behalf of the car itself!
While a degree of manual input from the driver is required to drive a car, a driving licence will be needed. If any kind of ‘pilot’ activity applies, then the car isn’t fully self-driving. Once it is, a conventional driving licence won’t be required.
It may not be as simple as this. In the same way cyclists and even pedestrians are encouraged to learn some ‘road awareness,’ it’s likely users of driverless cars will either be encouraged or required to possess some road user knowledge.
How this translates into a legal requirement (if one is deemed necessary) remains to be seen. Perhaps some type of theory test, similar to that in the current driving test, will be used?
The technology for driverless cars is being adopted in stages, and while a degree of manual operation is involved, full driving licences and ‘conventional’ car insurance will still apply.
An important aspect of learning to drive is developing spatial awareness – modern theory tests require you to watch interactive videos and identify hazards as they unfold. If these skills are not taught to driver-passengers, there is still the risk of accidents occurring. At the moment, the technology may not be as good as a human who is able to take in everything that is happening on side roads, junctions and pavements.
Will a Google Car spot the child riding their bicycle on a front drive, who may suddenly turn into the road? And how well will it identify a motorcyclist approaching a junction? It is possible that human error will simply be replaced by computing error, and when that happens, who will be held responsible?
Photo: By Mark Doliner [CC BY 2.0 (http://creativecommons.org/licenses/by/2.0)], via Wikimedia Commons