FAFO WaPo tech writer

If this link works, it will take you to a WaPo article in which the author tests out Tesla's recall update software and is shocked that his Model Y won't stop itself at stop signs, even though it shows the upcoming stop sign on the screen. He later discovers on a website that he would have had to pay extra for that feature.

The comments are amusing. Anti-Muxxers have a lot of the same sentiments as antivaxxers. “Demon cars! Death machines! Human guinea pigs! Untested! Experimental!” The author proved that even Tesla can’t fix stupid.

WaPo gift article link

JGorn (Admin)
1 month ago

Thanks for the gift article.

As an old software engineer who spent years focused on safety critical software, I’m offended by driver assist software. Not the general concept, of course, but the fact that it is developed haphazardly without the kind of standards that would apply in avionics. I cut my teeth in avionics, specifically the flight management system for the Boeing 737, and the standards required for that kind of system are extreme. But in many ways automotive systems are more complex. Not the systems themselves, so much as the environment in which they must operate. Not only are environmental conditions chaotic and often unpredictable, but drivers are morons. The fact that the government hasn’t stepped up and enforced a high level of certification standards commensurate with the life-endangering complexity of the target problem is infuriating. People will die, and we just don’t seem to care as long as it’s on the roadways instead of a flaming fireball from the sky.

Sure, it’ll work most of the time. Well over 90%. Maybe even 98%. But that’s not good enough. Safety critical systems should be a minimum of “five nines” reliable. That’s 99.999%. Nine nines is a better goal.
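To make those percentages concrete, here's a quick back-of-the-envelope calculation (illustrative numbers only, not measured failure data) showing how many failures each reliability level implies per million events, say, intersections handled:

```python
# Back-of-the-envelope arithmetic: expected failures per one million
# "decisions" (e.g., intersections handled) at a given success rate.
# These reliability figures are illustrative, not measured data.

def failures_per_million(reliability: float) -> float:
    """Expected failures per 1,000,000 events at the given success rate."""
    return (1.0 - reliability) * 1_000_000

for label, r in [
    ("98%", 0.98),
    ("five nines (99.999%)", 0.99999),
    ("nine nines (99.9999999%)", 0.999999999),
]:
    print(f"{label}: about {failures_per_million(r):g} failures per million events")
```

At 98%, that's roughly 20,000 failures per million intersections; five nines brings it down to about 10, and nine nines to about one failure per billion.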

My car is a Hyundai Ioniq, which has the same kind of driver-assist features as a Tesla: in the right conditions it will basically drive itself unassisted. I use it because it makes dealing with stop-and-go traffic much more convenient. But I stay in full control and pay attention at all times, and I absolutely would not trust my life to it. Knowing that cars near me are being auto-driven while morons play games on their phones is infuriating. I can't tell you how many times the software has failed because it got confused by an object or boundary condition for which it had not been adequately tested.

Musk and his ilk absolutely do not give a shit about you and your safety. You are just a source of money to be exploited, and otherwise you are a nuisance to him. He would be happy to risk sending you on a suicide mission to Mars, or smearing your guts in an unworkable underground vacuum tube, so long as he lines his pockets with other people's money. He's equally cavalier about over-promising and under-delivering with his vehicles. That's fine as far as his buyers go – we all get to decide what risks we want to take. The problem is that driver-assist and self-driving software puts everyone at risk, not just the people who sign up for it.

If you really want to do it right, you would implement it only where the infrastructure is built to support it, and then enforce rigorous testing and maintenance protocols. Such an environment simply does not exist, nor is it even under discussion.

Original content copyright © The FAFO Chronicles or individual member authors. Other content containing copyright material owned by identified sources is either used by permission, or is believed to fall under the "fair use" doctrine of the US Copyright Office. If you believe your copyright has been infringed, contact the site administrator and such content will be promptly removed.