
The visitors don't just come at night. They come all day, right to the end of 15th Avenue, where there's nothing else to do but make some kind of multi-point turn and head out the way they came in. Not long after that car is gone, there will be another, which will make the same turn and leave, before another car shows up and does the exact same thing. And while there are some pauses, it never really stops.
"There are some days where it can be up to 50," King says of the Waymo count. "It's literally every five minutes. And we're all working from home, so this is what we hear."
"We have talked to the drivers, who don't have much to say other than the car is programmed and they're just doing their job," King says.
"There are fleets of them driving through the neighborhood regularly," says Lewin. "And it's been going on for six, eight weeks, maybe more."
Much of the commentary I saw on this was people acting like it was some weird mystery why the cars would glitch out like this. And an anonymous Waymo PR flack shat out some meaningless verbal diarrhea that amounted to "what we are doing is not technically illegal, so you can't stop us."
But it's obvious what's happening here. Someone found a bug where these cars weren't negotiating u-turns properly, maybe even on this very street. So they added this street to the test suite, and they're going to run it again, and again, and again, until they get it right. Which is a completely normal way of debugging things when it's code on your own computer -- but is sociopathic when it's a two-ton killing machine on public streets, non-consensually drafting real live humans into your process of debugging your buggy-ass software.
But here's something that might really bake your noodle. What if they aren't actually debugging code? What if this is not an edit-compile-deploy-test cycle at all? What if instead they're just training the network? What if their process is simply, "if we drive cars down this dead-end road ten thousand times, and they don't crash, we're just going to bake all of those runs into the network and call that whole 'u-turn' problem solved"?
We're all going to fucking die, is what I'm getting at.
Previously, previously, previously, previously, previously, previously.
I've said this before, perhaps even here: all the people worried about cars evaluating various Trolley Problems are really, really optimistic about the level of consideration and thought being put into self-driving cars.
Yeah, all the folks badgering me about how I should just bike everywhere can add this shit on top of "BTW that take is ableist as fuck" to my "mm, yeah, nope" answer.
"It's easier to imagine the end of the world than to imagine the end of capitalism."
So we're going to get what we deserve.
Always were:
Hey man, I bought a Smart car two weeks ago....
(Now if I can get solar panels on the garage next year, the earth will definitely be saved.)
In re ableism, this and particularly the RTs of RTs in the thread.
I used to work at Google and talked to the folks who worked at Waymo before it was called that. I asked them if they cared about the trolley problem. The answer was invariably no: they aren't trying to design cars that evaluate emergency situations as moral problems with a goal of maximizing utility. Their goal is to deploy a version-controlled, provenance-tracked system that has (statistically) fewer adverse events than human drivers per mile across a representative sample of common driving scenarios. Oh, and if they had to choose, they'd save people in the car before people outside the car.
I.e., when they get cars that drive themselves about as well as an average person in terms of killing non-occupants, their job is finished, and Waymo lawyers/lobbyists begin discussions with the DoT and lawmakers about expanding the scope of self-driving.
Last I checked, the Waymo fleet had driven enough miles and killed few enough people that we can expect them to transition to lobbyist mode in a couple more years.
Well, that's pretty damning.
The trolley problem is not a principle of autonomous control system design; it's a thought experiment meant to illustrate the foibles of ethical reasoning. Can you imagine someone actually implementing a system that tries to compare the utility of hitting two different crowds? That would be much more unethical.
We have a park right near our city center. It used to be full of learner drivers until a law or regulation was passed that prohibited learner drivers. It's not that difficult of a solution. Wherever self-driving car testing becomes a nuisance, ban it. This stuff happens, templates for remedies already exist.
What about 16th Street? 17th? Etc.? Maybe because on a map it looks like it should be a through street...? In which case I might be semi-impressed that it figures out it can't just plow past the "Do not enter" sign.
The PR idiot claims this is part of the Slow Streets initiative and is somehow related to Lake Street, but this has been a one-way exit from the Presidio for at least 9-10 years that I know of. It's not a recent change to the 'dynamic San Francisco road rules.'
Surprised the Karens in that neighborhood aren't more up in arms about it.
Someone should just put up a homemade "Do Not Enter" sign (with a disclaimer underneath that says "Only valid for robot cars"), facing "inwards", at the other end of the street. Then they'd have a robot car trap. Although they might just endlessly do U-turns until they run out of juice, and the street would just be littered with dead robot cars. Still not ideal.
The same sign facing out, to prevent cars entering that bit of 15th St., is probably a better solution.
I think if this were near where I live, the neighbors would just wait for the car to be on the problematic street and place traffic cones to trap it there. Do that a few times and they'll probably start avoiding the trap.
Remember that the robot does come equipped with a minimum wage monkey-in-a-can as a helper for these situations. (Not an actual Trunk Monkey, sadly.)
I was thinking of this, instead:
Which, I am surprised to discover, Amazon does not sell. They do sell caltrops, though.
Sadly, the places that sell those will only ship to cops.
Looks pretty easy to make. Flat bar, a metal-cutting bandsaw, a drill press, a few bolts, nails, and epoxy, and you have your own personal spike strip. Not that I'd condone anyone putting one of these into use to trap a murderbot.
You're on the right track.
But the best road spikes are hollow, so when they inevitably break off they hopefully become embedded in the tire and ensure rapid deflation by providing a sure path through which air may flow.
And that said: Hollow metal caltrops are not hard to make at home with fairly basic tools like a saw, a grinder, and a welder.
Alibaba sell them in varying levels of quality between $70 and $230 plus a hefty shipping charge from China. Lots of reasons not to buy there, but it's an option for the very enthusiastic.
I knew I should have done something with areadenialdevices.com
Nice to see the suggestion for a heavy-duty tire repair kit included in the search results for caltrops. I wonder who buys these.
If I lived in SF, I don't know that I could resist the temptation to pull up behind it and also start making the three-point turns, but I'm a bit bad at those, so we all may be there a while. Unfortunate.
The word Caltrops always sounds like it should be a state agency.
California Transportation Operations Center. Or some such.
Soon the basic multi-point turn won't be enough, and they'll want to add augmentation to the training run. Let's see how bananas this road can get!
There could be another factor involved: the road databases sometimes don't allow for left turns.
There's a doctor's office I go to in Reston, VA (outside DC). I drive in from the west, and it is an easy left-turn in from that main road.
And most of the map systems, including Apple, Mapquest, and Google, won't route me through a left turn there. ALL of them tell me to pass it and U-turn, or to stay on another road so I can turn left onto the road in question and then turn right into the office's parking lot.
These systems are all working off of vector map data supplied by the states and the counties, and that data often has mistakes in it, like saying there is a divider in the road so you can't turn left.
So it may not be the fault of the car software so much as the supplier of the map data, which is telling the cars that the place they're supposed to go can't be reached by the obvious (to those of us with eyes and brains) route.
That said, this is still the kind of data that software is going to use to kill us all, so there we are. Not disagreeing with the last line at all.
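To make the map-data theory above concrete, here's a toy sketch of how that failure plays out in a router. This is nothing like Waymo's actual stack; the little road network, the node names, and the bogus "no left turn" restriction are all invented for illustration. The point is just that a single wrong turn-restriction flag in the vector data is enough to make a planner send every single trip past the destination for a U-turn:

```python
# Toy illustration, NOT anything from Waymo's actual stack: how one bogus
# "no left turn" flag in vector map data forces a router into a
# pass-the-destination-and-U-turn route. All names here are invented.
from collections import deque

# Directed road segments: W->X is the main road approaching the office at X,
# X->E continues past it to the next intersection E, O is the office driveway.
EDGES = [
    ("W", "X"),  # eastbound, approaching the office
    ("X", "E"),  # eastbound, past the office
    ("E", "X"),  # westbound, after U-turning at E
    ("X", "W"),  # westbound, heading away
    ("X", "O"),  # the driveway into the office lot
]

def successors(edge):
    """All segments you could drive onto from the end of this segment."""
    _, node = edge
    return [e for e in EDGES if e[0] == node]

def route(start, goal, banned_turns):
    """Breadth-first search over road segments, honoring turn restrictions."""
    prev = {start: None}
    queue = deque([start])
    while queue:
        cur = queue.popleft()
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = prev[cur]
            return path[::-1]
        for nxt in successors(cur):
            if (cur, nxt) in banned_turns or nxt in prev:
                continue
            prev[nxt] = cur
            queue.append(nxt)
    return None

# With clean map data: turn straight into the driveway.
print(route(("W", "X"), ("X", "O"), banned_turns=set()))
#   [('W', 'X'), ('X', 'O')]

# With the (wrong) "no left turn at X" flag: drive past, U-turn at E, come back.
print(route(("W", "X"), ("X", "O"),
            banned_turns={(("W", "X"), ("X", "O"))}))
#   [('W', 'X'), ('X', 'E'), ('E', 'X'), ('X', 'O')]
```

Same trip, same streets; the only difference is one bit in the turn-restriction table, which is roughly the scale of input error it would take to produce the behavior described in the post.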