Uber launched a fleet of its much anticipated self-driving cars in San Francisco on Wednesday, and by late morning the effort already hit a bad-driver milestone: running a red light. [...]
Annie Gaus, a freelance writer and producer in San Francisco, tweeted Wednesday morning that she "Just passed a 'self-driving' Uber that lurched into the intersection on Van Ness [Avenue], on a red, nearly hitting my Lyft." [...] "It was close enough that both myself and the driver reacted and were like, 'Shit,'" she said. "It stopped suddenly and stayed like that, as you see in the photo."
With Uber's self-driving cars now on the streets of San Francisco, the enforcement of traffic violations is in the hands of The City's Police Traffic Company, which was unaware Wednesday morning that the vehicles began roaming city streets that day. [...]
"I was unaware the cars have been released in the wild," said San Francisco Police Traffic Company Sgt. Will Murray. "Isn't that like the headless horsemen?"
"They are required to have someone seated in the front driver's portion of the vehicle," said Murray, who added that "if they were committing flagrant violations, if they were not obeying the laws," then traffic officers would pull them over and ticket them.
He did not say if that had yet occurred or how one goes about ticketing a car driven by a computer.
Uber Blames Its Drivers As More Reports Of Self-Driving Cars Running Red Lights Surface
Uber's action is illegal, California DMV Deputy Director Brian G. Soublet wrote in a letter to Uber late Wednesday, which was also sent to press. Soublet added that the ride-hail behemoth was required to obtain an autonomous vehicle testing permit before operating self-driving vehicles on city streets.
"If Uber does not confirm immediately that it will stop its launch and seek a testing permit, DMV will initiate legal action," the DMV wrote, "including, but not limited to, seeking injunctive relief."
Suggesting that this was more than first-day jitters, KRON 4 got its hands on a set of photos that the channel says show an autonomous Uber driving through a red light on Harrison at 4th Street. The pictures were taken on Sunday morning, which means that the car was likely being used for testing or mapping purposes and did not carry a paying passenger. Still, it would suggest that the software piloting the autonomous vehicles had problems as recently as three days before the much-publicized launch of the autonomous ride-hail service. That is, unless these incidents are all the result of human error -- a.k.a. Uber drivers.
"These incidents were due to human error," an Uber spokesperson told the Guardian about both the Van Ness incident and the 3rd Street incident. "This is why we believe so much in making the roads safer by building self-driving Ubers. The drivers involved have been suspended while we continue to investigate."
Isn't that neat? It's the humans, not the unpermitted software, who are at fault, according to Uber. Unfortunately, that argument likely won't sway the DMV.
So let's see...
The self-driving software is bad enough that the cars run red lights and make dangerous turns... but they have humans in the driver's seat! Who are also so terrible at their jobs that they can't prevent the car from running red lights, and must be fired.
I guess none of us are as incompetent as all of us? The software is so bad that it makes human drivers even worse?
The usual argument for self-driving cars is that they will be safer for everyone than human-piloted cars. If that hypothesis turns out to be true, then I'm all for them! One can even imagine a shiny Starfleet future where self-driving cars lead to the end of personal car ownership and dramatic emissions reduction. Enter the shimmering arc!
Uber, of course, does not give the slightest fractional shits about whether self-driving cars are safer or cleaner: they are interested in them because they are cheaper. Allow me to remind you of this bit from Fight Club:
I'm a recall coordinator. My job is to apply the formula. It's a story problem.
A new car built by my company leaves somewhere traveling at 60 miles per hour. The rear differential locks up. The car crashes and burns with everyone trapped inside. Now: do we initiate a recall?
Take the number of vehicles in the field, (A), and multiply it by the probable rate of failure, (B), then multiply the result by the average out-of-court settlement, (C). A times B times C equals X... If X is less than the cost of a recall, we don't do one.
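The narrator's cost-benefit rule is simple enough to sketch in a few lines. This is just an illustration of the arithmetic as described above; the function and variable names are my own:

```python
def initiate_recall(fleet_size, failure_rate, avg_settlement, recall_cost):
    """Apply the formula: X = A * B * C, the expected cost of doing nothing.

    A = number of vehicles in the field
    B = probable rate of failure
    C = average out-of-court settlement

    Recall only if X exceeds the cost of the recall.
    """
    expected_settlements = fleet_size * failure_rate * avg_settlement
    return expected_settlements > recall_cost

# Hypothetical numbers: a million cars, a 1-in-10,000 failure rate,
# $500k average settlements, versus a $40M recall.
# X = 1,000,000 * 0.0001 * 500,000 = $50M > $40M, so: recall.
print(initiate_recall(1_000_000, 1 / 10_000, 500_000, 40_000_000))  # True
```

Note what the formula optimizes for: not safety, but cost. Which is the point of quoting it here.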
And now we get to the part where the Uber software, operating as designed, is now literally trying to murder me:
Before the surprise launch of Uber's autonomous vehicles on San Francisco streets this week, I rode in one. I can tell you firsthand: Those vehicles are not yet ready for our streets.
I was at one of the demonstrations covered in the SF Examiner, along with others who Uber hoped to impress with their new technology. None of us were told that just two days later, Uber would be releasing this technology on our streets on a large scale. I did tell Uber some things about the shortcomings of that technology, however.
In the ride I took through the streets of SoMa on Monday, the autonomous vehicle in "self-driving" mode, as well as the one in front of it, took an unsafe right-hook-style turn through a bike lane. Twice. This kind of turn, which we described in a 2013 blog post, is known to be one of the primary causes of collisions between cars and people who bike that result in serious injury or fatality. It's also an unsafe practice that we address in all of the safety curriculum we offer to professional drivers, including the videos we consulted on for Uber as recently as this fall.
I told staff from Uber's policy and engineering teams about the safety hazards of their autonomous vehicle technology. They told me they would work on it. Then, two days later, they unleashed that technology on San Francisco's streets. Your streets.
Since yesterday, we have been told that "safety drivers" in these vehicles have been instructed to disengage from self-driving mode when approaching right turns on a street with a bike lane and that engineers are continuing to work on the problem. In the meantime, Uber is continuing to operate autonomous vehicles for passenger service in San Francisco.
There's no other way to put it: Launching autonomous vehicle technology before it's regulated and safe for our streets is unacceptable. If you support safe streets, please sign the petition to tell Uber to address this dangerous and illegal turning behavior immediately.
The people who wrote this software do not understand the traffic laws, and programmed it with a set of rules that they figured was close enough. And then they released it onto public streets.
"Disrupt transportation! Move fast, release early, and crush innocent people under two tons of fast-moving steel!"
I really can't express how unsettling it was today, riding my bicycle in traffic in the rain -- a time when San Francisco drivers are notoriously even less competent and more erratic than under normal conditions -- and wondering what fresh new hell of unpredictability I might encounter from poorly-behaving software in an alpha-test that I most assuredly did not click "Agree" on.