Uber has literally gotten away with murder.

"After a very thorough review of all the evidence presented, this Office has determined that there is no basis for criminal liability for the Uber corporation arising from this matter."

Arizona prosecutors' decision not to criminally charge Uber for the March self-driving-car death of Elaine Herzberg signals that tech companies won't be punished for taking egregious risks with their untested technology, even when the worst happens -- in this case, the death of Herzberg, a homeless woman in Tempe who became the first person killed by a self-driving car.

Uber has already settled a civil case with Herzberg's family, and the National Transportation Safety Board has yet to release the full findings of its ongoing investigation. Local authorities say that the car's "backup driver," an ex-con, may still be charged with vehicular manslaughter: she was watching "The Voice" on her phone when the car hit Herzberg, and did not hit the brakes until after the collision.

But Uber isn't blameless; a preliminary report by NTSB found that Uber had deactivated the car's emergency braking system. And that decision comes down to money. Self-driving cars can be programmed to brake whenever there is an object that the computer system can't identify, which in tech jargon is called an "edge case." But programming the car that way can make the journey jerky and nauseating. Uber was in a rush to start its self-driving taxi service that summer, so it had programmed the car to take chances.

Indeed, the car's detection system had noticed Herzberg six seconds before the collision. But it did not brake because it read her shape -- she was pushing a bicycle with bags -- as benign.
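The trade-off described above -- brake on anything unidentified and the ride gets jerky, brake only on confidently classified hazards and an "edge case" gets run over -- can be sketched as a toy decision rule. All names here are hypothetical for illustration; this is not Uber's actual code or policy logic:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # classifier's best guess: "pedestrian", "vehicle", "unknown", ...
    confidence: float   # classifier confidence, 0.0 to 1.0

# Hypothetical category set; real perception stacks have many more classes.
HAZARD_LABELS = {"pedestrian", "cyclist", "vehicle"}

def should_brake_permissive(d: Detection, threshold: float = 0.9) -> bool:
    """Brake only when the object is confidently classified as a known hazard.
    Anything unrecognized (an "edge case") is treated as benign, trading
    safety for a smoother ride."""
    return d.label in HAZARD_LABELS and d.confidence >= threshold

def should_brake_conservative(d: Detection) -> bool:
    """Brake for any detected solid object in the path, classified or not."""
    return True

# A person pushing a bicycle loaded with bags confuses the classifier:
ambiguous = Detection(label="unknown", confidence=0.4)
print(should_brake_permissive(ambiguous))    # False: the permissive car does not brake
print(should_brake_conservative(ambiguous))  # True: the conservative car stops
```

Under the permissive rule, the smoothness of the ride is purchased by ignoring exactly the detections that don't fit a known category.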

So of course they're going to throw their lowest-level "non-employee independent contractor" under the bus. But it's Travis Kalanick who should be in jail.

Previously, previously.


19 Responses:

  1. harryh says:

    Elaine Herzberg died on March 18, 2018. Kalanick hasn't worked at Uber since June 20, 2017. It's a bit odd to suggest that he should be in jail due to the actions of a company he didn't work for at the time of the accident.

    He is on the board of directors I suppose. Maybe he and Arianna Huffington should be sharing a cell?

    • jwz says:

      He owns 10% of the company and personally controls 3 of the 17 board seats.

    • MattyJ says:

      As with any business, it takes on the personality of the founder/CEO/showrunner. Uber the company has assholery deeply ingrained into its DNA by Kalanick. To imply he 'doesn't work there' any more is disingenuous at best.

      If corporations are people, then Uber is Kalanick. Seriously, fuck that guy.

      • The names you are looking for are Eric Meyhofer, head of Uber's Advanced Technology Group, and Peter Castelli, head of ATG's Test Operations. Peter Castelli in particular. Technically speaking, it was his vehicle, and his employee who killed that woman. The test operators were not contractors, they were employees paid hourly. The job title has been discontinued.

    • Michael says:

      It is very likely, though, that the event was a direct outcome of decisions made while he was CEO; and it is probable that the decisions were made according to policies enacted by him.

  2. Zygo says:

    Indeed, the car's detection system had noticed Herzberg six seconds before the collision. But it did not brake because it read her shape -- she was pushing a bicycle with bags -- as benign.

    I've seen this text repeated several times in reporting on this incident. I'm hoping it's just one journalist getting it wrong and several other journalists parroting it, but I don't think we're in that timeline.

    If it's true, someone thought it's perfectly OK for a motor vehicle to drive itself through detectable solid objects, provided that the software can't prove the objects belong to a small set of proscribed categories.

    In other words... Pedestrians: avoid. Cyclists: avoid. Eagles: avoid (they're endangered!). Sharks: avoid (yeah there's not many sharks in the streets but the shark model came with the neural net software so why not just throw it in there?). A fruit vendor's cart rolling into the middle of the street: SPLAT! An empty garbage can on a windy day: CLANG! A tree branch that fell into the road: POW! Live power lines: ZAP!

    This error should have been economically self-correcting on the basis of bodywork repair expenses alone.

    • The problem is false positives. If a vehicle decides there's something in the way and slams on the brakes, but there's nothing there, that's a "negative rider experience" (actual Uber ATG terminology). If your system is generating too many false positives, better to ignore them and rely on operator intervention in the case of true positives than to risk bad yelp reviews.

    • DWidel says:

      Here's the thing about autonomous cars. If they stopped for things that aren't moving they'd never go anywhere. Billboards, street signs, trees: none of them move. So they just look for things that are moving, but at a different speed than themselves. Those are the other vehicles. Trying to sort out the other stuff is really, really hard. They aren't going to solve it anytime soon. In the meantime, don't park your firetruck on the highway, or attempt to cross the street.

    • kwk says:

      "Object detected. Classification: squishable."

  3. Luckily here in Galt's Gulch we don't have any poor people getting in the way of progress like this.

    Our cars are also made of Rearden Steel so even if something like this did happen (it won't, nothing like this ever happens in Galt's Gulch) she'd just fly right off the hood without even a scratch to the car. There wouldn't be any aggravation trying to get the looter or her heirs to reimburse you for the damages she did to your property.

  4. Michael says:

    It's not like human drivers are ever charged either. I mean, the standard of awfulness to be charged criminally for killing someone with a car is crazy high. Has Arizona ever charged a sober person for killing someone with a car? Inshallah

    • Michael says:

      The problem is cars.

      • jwz says:

        A problem is cars. There can be more than one problem.

        • Michael says:

          People are also a problem. People driving cars. And Uber. People driving cars for fucking Uber. Uber driving cars without people into people. Disintermediation!

  5. Michael says:

    It is an even worse type of negligence than you describe -- apparently they also deactivated the Volvo's built-in automatic emergency braking system, which has a track record of being deployed in the wild, enabled by default. So they deactivated a safety system already known to be a net positive.

    • Yes, the Volvo safety system was deactivated. The Volvo detection system is a simple one, and it was decided that its simplicity was interfering with the complex system that Uber was working on. It's hard to teach your machine how to figure out what to stop for if another system has already stopped the vehicle. The official ATG party line was "the test operator is responsible for the safe operation of the vehicle." It was incredibly easy to get fired from that job. The job was boring and stressful, and there was a lot of turnover. Eventually, the people with the most experience training new operators started to leave for jobs at other robotics companies. An accident was inevitable, and this was accepted. The only question was which company would be the first to kill somebody.

      • Michael says:

        Of course, that reasoning about teaching the car to stop is what should be punished in the first place -- if the Volvo system decides to auto-brake, the car was already being driven in a risky way (and such events are actually a very useful training dataset: a consistently classified collection of dangerous situations). But obviously this type of negligence is not what the officials are willing to punish.

      • BHN says:

        ...and since there aren't significant consequences to the company for killing someone, they had a fiduciary responsibility to their owners to just take the risk that Uber would be the first to kill someone with their 'autonomous' car.

        Otherwise they might have been expected to at least equal the performance of the Volvo system they turned off before letting their system out into the wild.
