Who should we blame if an autonomous vehicle causes injury or loss of human life?
By Steve Smith
I have had a keen interest in driverless or autonomous vehicles for some time. My interest stemmed from some work I did around 2011 on an earlier form of the technology. In the past week an Uber driverless vehicle collided with a pedestrian while on a test run. The Uber vehicle was carrying a safety driver, but the vehicle was in self-driving mode when the accident happened. Uber has temporarily halted further testing of their self-driving vehicles pending an investigation.
Dllu, Uber autonomous vehicle prototype testing in San Francisco, CC BY-SA 4.0
Many have been quick to condemn self-driving and autonomous vehicles as ill-conceived and lacking the awareness and capabilities of human drivers. Some have hinted that the safety driver may be at fault for failing to prevent the accident, and all of this was in the media before the full facts of the incident were known.
Jeremy Vine discussed this incident on his Radio 2 lunchtime programme on Tuesday. His introduction to the topic, “We find out why a pedestrian was killed by a driverless car in Arizona – maybe the clue is in the question”, already implied the technology was at fault. Is this incrimination (implied or explicit) justified? I don’t think so. Would we not still be living in caves, hunting with stones and wooden spears, had we not learned and moved on when something failed to go to plan? We did not stop building ships when the Titanic sank. On the contrary, the maritime authorities learned from the disaster and effected a radical and positive change in passenger liner safety. So, before we write off autonomous and self-driving vehicles, let’s take a quick look at the ‘bigger picture’.
According to CNN, more than 32,000 people died on roads in the US in 2013 – around 88 deaths on the road per day. In the same year, the UK government statistics recorded 1,713 deaths in reported road traffic accidents – roughly five fatalities per day.
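The per-day figures follow directly from the annual totals. As a quick sanity check (using the article’s cited numbers, not independent data):

```python
# Rough per-day fatality rates derived from the annual totals cited above.
us_deaths_2013 = 32_000   # CNN figure for US road deaths in 2013
uk_deaths_2013 = 1_713    # UK government figure for the same year

us_per_day = round(us_deaths_2013 / 365)  # approximately 88
uk_per_day = round(uk_deaths_2013 / 365)  # approximately 5

print(us_per_day, uk_per_day)
```

Both quoted rates check out against the annual totals.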
Proponents of autonomous vehicles claim they will be significantly safer than vehicles driven by humans, while those opposing autonomous vehicles say they are incapable of processing the level of information a human driver processes, such as nearby schools and children playing near the road. A human is likely to realise a child will follow a ball into the road, whereas an autonomous vehicle may not. Is this a fair assessment? There is a school in the road where I live. There are school warning signs, a 20 miles per hour speed limit and speed ramps. But that doesn’t stop the idiots who think the warnings do not apply to them from racing up and down the road, coming close to ripping the sump out of their ‘motors’ on the ramps.
I was involved in a road traffic accident a couple of years ago. I was in the inside lane of the M25 at around 2330, and there were virtually no other vehicles on the road: nothing in front, and only one visible set of headlights approaching from behind in the centre lane. Out of nowhere, the vehicle coming up from the rear slammed into the side of my vehicle. When the driver got out and approached us on the hard shoulder, he said his mobile had dropped onto the floor from the passenger seat, and that this distracted him to the point that when he looked over he veered into the side of our vehicle.
The Uber autonomous vehicle is equipped with sophisticated navigation and safety systems, including LiDAR to create a 3D image of its surroundings, an array of cameras and GPS. It can detect vehicles, people and even red traffic lights. With all of this technology it is not going to be distracted. It will not take its ‘eyes’ off the road to change a CD, swat a fly, take a drink, answer a phone, respond to an email or send a text message.
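The point about distraction can be made concrete. A hypothetical sketch (not Uber’s actual software; all names and the threshold are illustrative assumptions): the perception loop evaluates every sensor frame at a fixed rate, so there is no moment at which its ‘attention’ is elsewhere.

```python
# Illustrative sketch of a perception-and-decision step, NOT a real AV stack.
# Every sensor frame is processed; the loop cannot 'look away'.
from dataclasses import dataclass

@dataclass
class Obstacle:
    distance_m: float   # distance ahead, e.g. from LiDAR/camera fusion
    kind: str           # "vehicle", "pedestrian", "red_light", ...

BRAKING_DISTANCE_M = 30.0  # hypothetical threshold, not a real parameter

def decide(obstacles: list[Obstacle]) -> str:
    """Return 'brake' if any detected obstacle is within braking distance."""
    for ob in obstacles:
        if ob.distance_m <= BRAKING_DISTANCE_M:
            return "brake"
    return "continue"

# Runs on every frame, unlike a human glancing down at a dropped phone.
frame = [Obstacle(120.0, "vehicle"), Obstacle(25.0, "pedestrian")]
print(decide(frame))  # brake
```

The sketch ignores everything a real system must handle (sensor noise, object tracking, speed, trajectory prediction), but it illustrates the structural difference: the check happens unconditionally on every frame.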
This raises another point that opponents constantly make: the question of ‘who to kill?’, or perhaps the less emotive ‘who to sue?’, in the event of an accident. This is a very serious matter that must be resolved to ensure the legal system is prepared for cases such as that of the poor woman in Arizona. But is it so clear cut now for accidents involving vehicles driven by humans?
In the accident I was involved in, despite the other driver admitting at the scene that he had been distracted, when it came to the claim he accused me of veering into him. The insurance company resorted very quickly to the ‘knock for knock’ fallback, even though there were three people in my vehicle and only the driver in the other. The insurance company refused to accept witness statements from family members, and I lost my no claims bonus. I accept that my claim was trivial compared with the fatal accident in Arizona, but it could easily have been serious given the speeds involved. Clearly, the current insurance claim process is far from ideal.
I believe autonomous vehicles will become an important part of life in the future, not just on our roads, but on rivers, oceans and in the sky. Autonomous vehicles will be liberating for elderly and physically impaired people, giving them the freedom to choose how and where they travel. The introduction of ‘Car as a Service’ could have a major impact on pollution levels as well as reducing parking congestion in our cities and towns. The benefits are considerable, but we must allow the technology to be developed just as we did for our current vehicles. Can you imagine the M25 with someone walking in front of every vehicle with a red flag?
The UK government is reviewing the law to prepare for the introduction of autonomous vehicles on our roads. Autonomous vehicles are at the centre of the government’s post-Brexit industrial strategy, and the chancellor, Philip Hammond, has even promised they will be on our roads within the next three years. The review will address the legal aspects, including where responsibility lies when the vehicle is being controlled by a human, a computer or a combination of the two.
The Uber autonomous vehicle will not be distracted in a human sense, but it could be ‘distracted’ by a design flaw such as a software bug. Uber had no choice but to halt any further testing until this fatal accident has been thoroughly investigated. When the facts are known the parties involved must learn, share their findings and continue with development. In my opinion halting progress is not an option.