Electrek: "Tesla FSD Beta tried to kill me last night"
(electrek.co)
I'm sure you're just going to downvote this and move on without reading, but I'm going to post it anyway for posterity.
First, a little about me. I am a software engineer by trade with expertise in cloud and AI technologies. I have been an FSD beta tester since late 2020 with tens of thousands of incident-free miles logged on it.
I'm familiar with all of these incidents. It's great that they're in chronological order; that will be important later.
I need to set some context and history because it confuses many people when they refer to the capabilities of autopilot and FSD. Autopilot and FSD (Full Self-Driving) are not the same thing. FSD is a $12,000 option on top of any Tesla, and no Tesla built prior to 2016 has the hardware capability to run FSD.
The second historical point is that FSD did not have any public release until mid-2022, with some waves of earlier releases going to the safest drivers starting in mid-2021. Prior to that it was exclusive to Tesla employees and a select few trusted beta testers in specific areas. Any of the issues in this article prior to mid-2021 are completely irrelevant to the topic.
Tesla's Autopilot system is an LKAS (lane keep assist system), the same class of feature offered by Honda (Honda Sensing), Nissan (ProPilot Assist), Subaru, Cadillac, etc. Its capabilities are limited to keeping you in your lane (via a front-facing camera) and maintaining distance to the car in front of you (via radar, or cameras in later models). It does not automatically change lanes. It does not navigate for you. It does not make turns or take exits. It does not understand most road signs. Until 2020 it did not even know what a red light was. It is a glorified cruise control and has always been presented as such. Tesla has never advertised it as any sort of "hands-off" system where the driver does not need to pay attention.

FSD does not allow the driver to take their attention off the road either: it requires hands on the wheel with constant torque, as well as eyes on the road (via an interior camera), in order to work. If you are caught not paying attention enough times, the system will disengage, and with enough violations it will kick you out of the beta program entirely.
OK, now that being said, let's dig in:
November 24, 2022: FSD malfunction causes 8-car pile-up on Bay Bridge
April 22, 2022: Model Y in "summon mode" tries to drive through a $2 million jet
February 8, 2022: FSD nearly takes out bicyclist as occupants brag about system's safety
December 6, 2021: Tesla accused of faking 2016 Full Self Driving video
March 17, 2021: Tesla on Autopilot slams into stationary Michigan cop car
June 1, 2020: Tesla Model 3 on Autopilot crashes into overturned truck
March 1, 2019: NHTSA, NTSB investigating trio of fatal Tesla crashes
May 7, 2016: First known fatality involving Tesla's Autopilot system
So, there we go. FSD has been out to the public for a few years now across a massive fleet of vehicles, collectively driving millions upon millions of miles, and this is the best we've got in terms of a list showing how "dangerous" it is? That is pretty remarkable.
Excited to see your response.
Interesting, you wrote an entire dissertation on why you think this is all a false flag about Full Self Driving, but it seems to be mostly anecdotal or what you think is happening. Being a “software by trade” isn’t enough to face the facts that something fishy is 100% going on with Tesla’s autopilot system.
“The last time NHTSA released information on fatalities connected to Autopilot, in June 2022, it only tied three deaths to the technology. Less than a year later, the most recent numbers suggest 17 fatalities, with 11 of them happening since May 2022. The Post notes that the increase in the number of crashes happened alongside a rapid expansion of Tesla’s "Full Self-Driving" software from around 12,000 vehicles to almost 400,000 in about a year”
https://www.caranddriver.com/news/a44185487/report-tesla-autopilot-crashes-since-2019/#
You claim the timeline is important here and this is all post-2022.
Tbh the other side is also anecdotal. There's no stats here.
What's fishy about it? You realize roughly 40,000 people die every year in car accidents in the US, meaning about 110 die every single day, and you're referencing 17 fatalities spread out over a few years as some big crisis. This tech (from any manufacturer) isn't going to prevent 100% of accidents, and there's not much you can do when drivers willingly drive their car into the side of a semi, just as they did before this technology existed.
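For what it's worth, the arithmetic in that comparison can be sanity-checked in a few lines. This is just a scale illustration using the figures quoted in this thread (the ~40,000/year US traffic-death figure and the 17 Autopilot-linked fatalities from the Car and Driver quote); the assumed time span is a rough guess, not an official statistic.

```python
# Rough scale comparison between overall US traffic deaths and the
# Autopilot-linked fatality count quoted above. All figures come from
# the thread itself, not from an authoritative dataset.

us_deaths_per_year = 40_000               # approximate annual US traffic fatalities
deaths_per_day = us_deaths_per_year / 365
print(f"~{deaths_per_day:.0f} US traffic deaths per day")   # ~110

autopilot_fatalities = 17                 # NHTSA-linked count quoted above
span_years = 4                            # assumed: roughly 2019 through mid-2023
per_year = autopilot_fatalities / span_years
share = per_year / us_deaths_per_year

print(f"~{per_year:.1f} Autopilot-linked deaths per year")
print(f"That is {share:.4%} of annual US traffic deaths")
```

None of this settles whether the system is safe per mile driven (that would need exposure data, i.e. miles driven with the system active), but it does put the raw counts on the same scale.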
I won't argue that AP, FSD, or any other system doesn't have its issues, but most of these responses are overblown sensationalism.
I am not a "Software by trade"; that was a typo. Believe it or not, I wrote that entire thing on mobile.
Correlation does not equal causation. Tesla sold far more vehicles in the past two years than ever before. Also, in 2019, 2020, and part of 2021, not a lot of people were driving due to the pandemic.
And, yes, a lot of what I wrote about the first incident was anecdotal, or what I think is happening. Importantly, it is what I think is happening as someone with years and tens of thousands of miles of experience using FSD beta. I do not have the facts, and, just as importantly, neither do you. I am interested to see what comes out of that court case, but from where I sit I do not think FSD was involved at all.
Please let me know where I have misrepresented facts, I will either correct them or cite sources.
Again, Teslas come with a factory-installed 360° dashcam that records all the time. Where are all of the videos of these FSD-related incidents?