Fully autonomous American killing machines still aren't allowed to go around murdering people willy-nilly. There are rules -- or policies, at least. And the robots will follow those policies.
Yes, the Defense Department is still building murderous robots. But those murderous robots must adhere to the department's "ethical standards."
Arizona prosecutors' decision not to criminally charge Uber in the March self-driving-car death of Elaine Herzberg -- a homeless woman in Tempe who became the first person killed by a self-driving car -- signals that tech companies won't be punished for taking egregious risks with untested technology, even when the worst happens.
Uber has already settled a civil case with Herzberg's family, and the National Transportation Safety Board has yet to release the full findings of its ongoing investigation. Local authorities say the car's "backup driver," an ex-con, may still be charged with vehicular manslaughter: she was watching "The Voice" on her phone when the car hit Herzberg, and she did not hit the brakes until after the collision.
But Uber isn't blameless: a preliminary NTSB report found that Uber had deactivated the car's emergency braking system. That decision comes down to money. Self-driving cars can be programmed to brake whenever the computer system encounters an object it can't identify -- what tech jargon calls an "edge case" -- but programming the car that way makes the ride jerky and nauseating. Uber was in a rush to launch its self-driving taxi service that summer, so it had programmed the car to take chances.
Indeed, the car's detection system had noticed Herzberg six seconds before the collision. But it did not brake because it read her shape -- she was pushing a bicycle with bags -- as benign.
So of course they're going to throw their lowest-level Uber "non-employee independent contractor" under the bus. But it's Travis Kalanick who should be in jail.