One machine had its tyres slashed while stopped in traffic. Another was threatened by a man brandishing a gun. The man involved in that incident was later charged with aggravated assault and disorderly conduct, and the police report of his arrest noted that he “stated that he despises and hates those [Waymo] cars.”
One Jeep Wrangler was reported trying to run Waymo vehicles off the road on several occasions, with its owner, Erik O’Polka, telling the New York Times: “There are other places they can test. They said they need real-world examples, but I don’t want to be their real-world mistake.”
Undoubtedly, these are isolated incidents: Waymo says its vehicles log more than 25,000 miles of autonomous driving a day in Arizona. But they hint at an underlying mistrust of self-driving cars, perhaps linked to some of the high-profile incidents such machines have been involved in.
That includes the first recorded case of a pedestrian, Elaine Herzberg, dying after being struck by an autonomous vehicle, an Uber-run Volvo XC90 in Tempe, Arizona in March 2018.
That incident rightly prompted much discussion about the safety of autonomous cars, even though police reports focused on human factors: the ‘back-up’ driver was distracted watching a video on her phone; Herzberg was crossing an unlit road at night without paying attention; and Uber had disabled the XC90’s emergency braking system.
Make no mistake: it is vital that every possible lesson is taken from that incident and applied to make autonomous cars as safe as possible. But the same applies to any accident. And consider that in 2017 (the most recent year for which data is available), Tempe police recorded 8686 accidents – including 24 fatal incidents – involving conventional cars in the city.
There is always resistance to change, and a tendency not to trust new technology – and there is no question that the arrival of autonomous cars raises a host of ethical, moral and legal questions. The UK government is wise to recognise that such issues need to be explored and discussed right now, and to ensure that people understand the potential benefits – and, yes, limitations – of autonomous cars.
So do autonomous cars make you nervous? I can understand if they do, but I’d also question whether they should. As I've written before, I’m minded to trust an autonomous car featuring a high-powered computer processor programmed by some of the sharpest minds in the industry to do better than some of the crazily bad driving you can see on UK roads every day.
One example appeared on the same day the government updated the Code of Practice: a driver in Devon was arrested on suspicion of drug-driving after crashing his car. He told police that the crash occurred because he had swerved to avoid an octopus. Yes, an octopus. Police said they found no evidence of an octopus on the road.
You’d hope that’s not a problem an autonomous car will ever have…