- Chris Vallance
- BBC technical affairs correspondent
A new government-commissioned research report in the UK suggests that, despite the rapid development of artificial intelligence (AI) technology, the day when self-driving cars fully replace human-driven cars has not yet arrived, because defining what makes them safe has no simple answer.
The report from the Centre for Data Ethics and Innovation warns that it is not enough that self-driving cars are safer than regular cars.
The report notes that public tolerance for driverless car crashes is low, even though driverless cars are, on average, safer than regular cars.
The report comes as the UK government works on plans for self-driving cars. The Department for Transport had earlier said that some cars and vans with self-driving capabilities might be allowed to travel on motorways as early as next year.
The UK government’s plans include a “safety target” for the vehicles: that these cars should be as safe as those driven by capable drivers.
That would set the standards to be met before self-driving vehicles would be allowed to hit the road — and automakers might face sanctions if self-driving vehicles don’t meet those standards, the report said.
The Centre for Data Ethics and Innovation is a government expert body responsible for trustworthy innovation in data and artificial intelligence. The agency said the safety of self-driving cars should not be a question that science alone can answer.
The report said the public may find such collisions hard to tolerate: even though autonomous vehicles are on average safer than human-driven cars, the public may see autonomous vehicle crashes as the result of "irresponsible technology companies or lax regulation".
The report warns that it appears that the public expects self-driving cars to be as safe as trains or airliners, which would require driverless cars to be on average 100 times safer than human-driven cars.
Jack Stilgoe, a professor at University College London (UCL), is an adviser to the Centre for Data Ethics and Innovation. "What we want to do is say there is no easy answer to this question," he said, suggesting that setting safety standards should be a democratic decision.
The Centre for Data Ethics and Innovation said it was important to consider how to spread risk among different groups. Even with improvements in overall safety, “some populations may see substantial safety improvements, while others see no improvement or even new risks.”
Driver bias
The report recommends that other risks need to be assessed as new technologies are applied.
One is potential bias in the algorithms that control the vehicle.
The report warns that some groups of people, such as wheelchair users, may be underrepresented in the data used to train the algorithms that control the vehicle, which could create bias.
The report also states that self-driving vehicles should be clearly marked so that “people have the right to know what kind of drivers are sharing the road with them.”
In its report, the Centre for Data Ethics and Innovation cited an academic survey that said 86 percent of the public agreed with this view.
How to test driverless vehicles on public roads is a serious ethical issue, Professor Stilgoe said, because other road users may in effect become test participants whether they want to or not.
“It’s important to understand that consent is a moral principle,” he said.
Such technology might lead to pressure to change roads and road rules to accommodate autonomous vehicles.
Professor Stilgoe said this required the debate to be conducted transparently.
"The danger," he said, "is that the world drifts into a new phase without realising it, with changes made to accommodate one form of transport while the benefits are not widely shared."
A roadmap for self-driving
The UK government on Friday (August 19) published a policy paper in preparation for planned legislation to begin allowing self-driving vehicles to use UK roads.
The government has said the new law will be passed as soon as parliamentary time allows.
Legislation is expected to hold automakers accountable for the actions of self-driving vehicles, something the Law Commission recommended earlier this year.
The UK unveiled proposed updated road regulations in April that would allow motorists to watch entertainment videos while their vehicles are driving autonomously.
The proposal would initially allow this only when vehicles are travelling at low speeds on motorways, such as in congestion.
The UK government has reiterated that vehicles capable of driving themselves on motorways will be on the market by next year.
The government has also reiterated its aim to roll out autonomous driving technology on a wider scale by 2025, as well as investing £100m in funding for industry and related research.
Thatcham Research, an automotive research centre funded by the motor insurance industry, welcomed the ambition of the government's guidance, but warned of the need for "thorough clarity on the legal responsibilities of motorists" and transparency about how the technology is brought to market, such as how sellers describe the system when handing over the car keys and how the self-driving system itself communicates with the driver.