Tesla adds Road-Rage Mode to their murderbots

Why Tesla Can Program Its Cars to Break Road Safety Laws:

Putting a Tesla in "assertive" mode will effectively direct the car to tailgate other motorists, perform unsafe passing maneuvers, and roll through certain stops ("average" mode isn't much safer). All those behaviors are illegal in most U.S. states. [...]

Regulators, though, have so far had a hard time reining in AV companies that fail to prevent their customers from flouting roadway rules -- never mind ones like Tesla, which actively enable their customers who would use automation in service of their personal convenience and speed, rather than to enhance collective safety.

That's in part because, by and large, U.S. law tends to favor penalizing individual drivers for breaking the law, rather than penalizing car manufacturers whose vehicle designs make breaking those laws easy. No automobile company has ever been prosecuted for installing an engine that can propel a car more than 100 miles an hour, for instance, even though such speeds aren't legal on any road in America; nor have companies been held accountable when their customers use cruise control to speed, even though technology to automatically stop all speeding has existed for decades. [...]

"The way it's worked, historically, is that [the National Highway Traffic Safety Administration] is in charge of vehicle safety, and individual states have been in charge of the safety of human drivers," he said. "But it gets complicated when the car starts assuming the responsibility of the human driver... Stopping at a stop sign is not a federal law; it's a state law. Sure, NHTSA can say your car design is unsafe because it's breaking a lot of state laws, but they can't enforce those laws themselves."

Many states are pursuing legislation that would hold AV companies accountable for deploying vehicles that can violate roadway rules with the touch of a button -- though others, under pressure from industry lobbyists, are passing bills aimed at encouraging AV testing on public roads and shielding manufacturers from legal action when their safety software fails.

Previously, previously, previously, previously, previously, previously, previously.


16 Responses:

  1. J. Peterson says:

    Coming in Q3: Hit & Run Mode

  2. Dude says:

    Corruption at Tesla? The hell you say!

    Why, next you'll say that at least part of the company's corruption can be attributed to Musk-oil getting personal business lessons from Jeffrey Epstein, whom he met via mutual acquaintance Ghislaine Maxwell.

    How dare you even think such a thing!

  3. Michael says:

    Years ago I had an argument with someone who claimed self-driving cars would be safer, because they would be programmed that way. My position was that this will last exactly up to the point where someone will mod their car to be more aggressive.

    I admit, even back then I did not expect that they would come from the factory that way, but then I still didn’t have a strong opinion about Musk. So, not really surprised, not even disappointed. I have zero positive expectations these days for anything “Valley”.

  4. Doctor Memory says:

    If nothing else, the first wrongful death lawsuit over this should be one for the ages.

  5. Every Tesla employee from Elon on down should be indicted for conspiracy to drive recklessly.

    • Dude says:

      Thanks to Musk-oil's union-busting, Tesla doesn't have "employees" per se; just a bunch of faces he must think are Replicants, and he can't understand why they don't just return to his COVID-riddled factory to continue their work, like they were programmed to do.

  6. Thomas Lord says:

    We need some platform standards for cars that can be controlled via software well enough for Tesla-level self-driving. Standards like when IBM opened up the PC architecture. And some "open source" frameworks so you can hack your own car to drive as you like. The "magic cauldron" effect but for deathbots. See how that goes.

    It can't be worse.

    It'll make it a lot easier to ban cars entirely in a lot of places.

  7. Zygo says:

    Police departments today could purchase vehicle GPS data from car companies (or mobile phone GPS data from cell carriers for that matter), run it through some software, and send drivers a bill for all their speeding fines at the end of every month.

    Some insurance companies already offer this service in reverse to their own customers, offering a discounted premium to customers who can convincingly fake a GPS data log, or drive a little slower, whichever is easier.

    Insurance companies are probably the real gatekeepers here, at least in states with mandatory auto insurance. "Tesla model X5 murderbot? Monthly premiums start at $100, but increase to $5000 if you touch the 'mode' button. We buy a list of drivers who touched the button every month from Tesla."

    Just stay the hell away from roads near where rich people live, work, or shop, and you'll probably be OK.

    • Aidan Gauland says:

      Just stay the hell away from roads near where rich people live, work, or shop, and you'll probably be OK.

      I mean, that's generally good advice, anyway.

  8. thielges says:

    Yet another reason why the criteria certifying auto drive software should never be relative to human driver behavior. Actual human drivers are a terrible role model.

    Tesla is trying to lower the bar. So long as their software can outperform the average aggro, self-centered, distracted clown, all is well.

    Be best, autopilot.

  9. Rudolf Polzer says:

    Reminds me of an article from years ago where some self driving car manufacturer had programmed their cars to speed.

    Reasoning was that a car going a little above the speed limit is less likely to get rear ended than a car that goes exactly the limit.

    Which is a fair point - should AVs be programmed to minimize total accidents, or at-fault accidents only?

    However, Tesla clearly does neither and deserves to be kicked out of the industry. Maybe kicked off the planet. It is because of Tesla's aggressive experimenting that we cannot buy real AVs yet.

    Maybe weed junkies should not be car company CEOs. After all, they aren't allowed to drive, so why should they then be allowed to make vehicles...

    • Michael says:

      Maybe weed junkies should not be car company CEOs.

      I doubt very much he’s a weed junkie, that would mellow him out way too much. There’s some other stuff involved.

  10. HH says:

    Me: Say, all car companies, not just Tesla: you guys know how fast the car is going, probably what the speed limit is and where the stop signs are, when you're too close to other cars etc. Why don't you just enforce the rules?

    Car companies/regulators/(presumably) rolling coal enthusiasts:

  11. dpf says:

    at least you can pay for it in doge (somehow elevating it to less of a joke or something).


  12. Dude says:

    ...and now, this is happening:

    (via Engadget): "Tesla driver in fatal California crash first to face felony charges involving Autopilot"

    The article says it's the first such charge to be brought in a case like this.
