Driverless personal vehicles have been discussed in the media for quite some time, but the implications of commercial taxi use, especially at higher speeds, raise the stakes significantly on the question of insurance liability. I don't believe cars should be driverless, only driver-assisted.
First, there is the obvious question of who would feel comfortable letting an unmanned machine carry them or a loved one across town or to the airport. Never mind that the hazards of navigating city traffic change by the hour. When you include common variables such as inclement weather, accidents, fires, and falling trees, along with less common hazards like sinkholes, slides, power failures, flooded roads, civil unrest, or mechanical breakdown, the probability of an eventual accident is sobering, to say nothing of new worries such as computer viruses or worms. Not to mention (although I will) that one could easily speculate that autonomous, driverless taxis could someday become the ideal tool of the future terrorist.
My concern as a fellow driver and occasional pedestrian is what would happen when a computer-driven taxi strikes a pedestrian, cyclist, or animal, or becomes involved in a collision, regardless of who is at fault. Assuming no one (or no one of ability) is riding in the back, will it somehow render aid or call for help, or will it continue on its way? If an accident occurs in a cell phone and satellite dead zone, will the machine be smart enough to move out of the dead zone to make an emergency call, as a human driver would? How does such a car operate in those conditions? What happens if a driverless autonomous taxi hits a deer or another large animal? Will the machine leave the carcass in place as a hazard for other drivers, or will it stop and leave itself there as an even bigger danger for drivers to collide with?
I know that autonomous cars are a work in progress, as they have already had their share of accidents. People are not perfect drivers either, but a human driver assumes ultimate responsibility when something goes wrong and, in most cases, has the added ability to safely remove hazards from the path of other drivers. So we know that autonomous cars can only work as well as their programming and infrastructure allow, and neither is anywhere near perfect. Just consider the sad state of the voice recognition systems in current cars, which are awful most of the time.
Would a terrible, infamous autonomous car accident simply be chalked up to new technology and remedied with a slightly less lethal firmware update? Will the human victim(s) be blamed in the end, with the new technology getting a pass? I'm pretty sure I don't want to be the victim who prompts that operating system upgrade.
No right-minded insurance company would cover this potential rolling catastrophe. I would expect that even Lloyd's would be cost-prohibitive for this risk. In the end, if such vehicles are approved for use, the states may allow the companies to self-insure their autonomous taxis; time will tell. In fact, Volvo has said that it will need to accept liability for its autonomous vehicles. We may ultimately conclude that this technology is best suited to low-speed, low-traffic, controlled city environments.
Let's hope the term 'computer crash' never comes to include auto victims.