There's ugly symbolism in the deadly accident that took place in Tempe, Arizona, last Sunday evening. A self-driving Volvo operated by Uber, the world's second-most valuable startup, ran over Elaine Herzberg, 49, as the apparently homeless woman pushed a bicycle loaded with plastic bags into the street. Rich company kills poor person, robot kills human.

Even though Tempe police are not inclined at this point to blame the Uber vehicle — Herzberg apparently stepped into the road suddenly from the shadows — optics such as these are likely to set back the autonomous vehicle industry. And that's not a bad thing, regardless of whose fault it was in Tempe. This is still a world populated and run by humans, and we humans should be given more time to decide whether we want machines to take over our roads. The issue is as much ethical as technological.

Elon Musk, the chief executive of Tesla, has articulated the pitch autonomous car developers make to regulators better than anyone. "If, in writing some article that's negative, you effectively dissuade people from using an autonomous vehicle, you're killing people," he told reporters in 2016.