- cross-posted to:
- technology@lemmy.ml
Autopilot beta? People are willing to test betas for cars? Are you insane? Insurance is going to have a field day.
What bothers me is that I have to share the road with people running some braindead Elon Musk software?
Have you seen how humans drive? It's not a very high bar to do better.
But betas? Seriously?
deleted by creator
I was taught to always drive defensively. You never know when someone’s going to get distracted, get stupid, have a stroke… add glitchy robots to the list, it doesn’t make a whole lot of difference.
And yet FSD is still worse than the one time I got in the car with an exchange student who had never driven a car before coming to the US and thought her learners permit was the same as a driver’s license.
From what I read, Autopilot (AP) just keeps you in your lane, while Full Self Driving (FSD) just switches lanes into oncoming traffic.
Funny how George Hotz of Comma.ai predicted this exact same issue years ago: “if I were Elon Musk I would not have shipped that lane change”.
This issue likely arises because the car's sensors cannot look “far enough ahead” in the lane it changes to. That can lead to crashes from behind due to much faster cars and, in this case, lane confusion, as the car cannot see oncoming traffic.
Even better, several people have died using it or killed someone else. It also has a long history of driving underneath semi truck trailers. Only Europe was smart enough to ban this garbage.
FSD has never driven under a truck; that was Autopilot, which is an LKAS system. The incident happened a year prior to “Navigate on Autopilot”, so the car in question was never even able to change lanes on its own. The driver deliberately instructed the car to drive into the trailer.
FSD beta is currently available in most of Europe and has been for several months.
FSD has never driven under a truck
Yes it has. Well, into the back of one so fast that it went under at least.
which is an LKAS system.
So is FSD. 🤣 It's level 2, bud. You're really REALLY confused for someone pretending to own one.
The driver deliberately instructed the car to drive into the trailer.
Are you saying Josh Brown killed himself? Because if you are, that would be a new repulsive low even for you Elon simps.
The craziest part of the article is just how much effort the author put into collecting data, filing feedback, and really really hoping that Tesla could pull the videos (they can), then going on to actively try, and succeed, at recreating the problem at high speed next to another car.
Not Auto Pilot (AP). There’s a difference between FSD and AP. AP will just keep you between the lane lines and pace the car in front of you. It can also change lanes when told to. There’s also Enhanced Auto Pilot (EAP). EAP was supposed to bridge the gap between AP and FSD. It would go “on ramp to off ramp”. So it could switch lanes as needed and get to exit ramps. FSD is the mode where you shouldn’t need to touch it outside of answering the nag (the frequent nag to “apply force to the steering wheel” to tell it you are still alive and paying attention)*.
* At least I think that's the same for FSD. I'm only on AP with AP1 hardware. Never had an issue that I'd blame on a “bug” or the software doing something “wrong”.
It’s the beta part that scares me the most, the type of assistance isn’t really relevant. People shouldn’t be driving around in betas. These aren’t phones.
deleted by creator
“Some of you may die, but it’s a risk I’m willing to take.” - Lord Farquaad and Musk
Musquaad
I mean… They opted into a beta. Beta means this may happen.
Even if those dipshits “opted in,” the rest of us sharing the road sure as Hell didn’t!
This isn’t just some email web app that may have a few bugs, it’s putting lives at risk on the road. They shouldn’t be able to just label it a beta, overpromise its capabilities, and neglect any responsibility.
I guarantee that the other drivers on that road didn’t opt for a “beta”.
I just can’t understand how regulators all over the world allow these things on the road. How the fuck do you allow the release of potentially deadly (for everyone involved, not just for the user) software en masse for the public to beta test for you… This is not Diablo IV…
Beta only means “buggy piece of shit” to people who use software, and mostly to gamers at that. In industries where prototypes can kill people, a “beta” product is one that is safe for the intended use. For example, if you invented a new way to do internal scans of people, before you could even test it on humans you would have done extensive testing on animals to learn what works, what doesn't, and what gives them cancer, and you would have done the modelling to have a strong understanding of whether it is safe for humans.
Nobody would tolerate a scanner that gave people cancer, oops
“Run, run, run, as fast as you can. You can’t catch me. I’m the gingerbread man!”
I’m not especially sympathetic to the Tesla drivers this might kill.
I’m worried about everyone else.
I consider the suicide attempts a feature. I’ll test for you, Tesla.
It shouldn't even have been released for normal people to use in daily life, on real roads full of other cars. This poses a big risk to life if you ask me. I hope countries start banning this feature soon, otherwise many more deaths will happen, and Elon will somehow get away with them. What's so hard about driving a real car manually? Did you all become fatass lazy people who don't even have the willpower to drive a car? Ridiculous. ML is experimental, and for a machine it's amazing, but it isn't as good as a human YET, and that causes life-threatening accidents. FSD is literally still in beta, and people are driving at full speed on roads with this beta software.
It can't be used in the EU. It would need to pass a review; Elon has claimed they are close to getting it through, but Elon says a lot of things.
Self-driving cars are actually only legal in a few countries. And those countries have tests.
It's only the United States that just lets anyone do whatever on earth it is that they want, even if it's insanely dangerous.
Everywhere else any car company that’s espousing self-driving tech would actually have to prove that it is safe, and only a few companies have managed to do this and even then the cars are limited to predefined areas where they are sure they’re not going to come across difficult situations.
In its current state it has basically no chance IMO.
If they'd concentrated on making AP/highway driving smarter first they might have got that through… there are already rules for that… but cities? I'd love to see the autonomous car that could drive through London or Manchester.
Humans did not evolve to drive cars. ML did. It drives consistently with no distractions. It is never tired or drunk, and it never experiences road rage. It has superhuman reaction time and can see in a full 360 degrees. It is not about being a lazy fatass; it is about safety. Hundreds of people in the US were killed in car accidents just today, and none of them were from self-driving cars.
Also, please provide an example of a life-threatening accident caused by FSD.
The article listed two life-threatening near-accidents that were prevented only because the person behind the wheel took over and kicked out FSD. Read the article and then comment.
Teslas have 360 degree dashcams that are recording all the time. Why didn’t they upload the video? I promise you they have it.
Such a video would go viral pretty easily. It would light a fire under Tesla engineering to fix such a dangerous and life-threatening situation. Where is it? Why is there never any footage attached to these articles? Why can't I find a video ANYWHERE of such a thing? Why can nobody in this thread bashing the tech over and over produce any justification for their fear?
If I were Tesla and wanted to cover up the dangers of FSD trying to kill people, I wouldn't give everyone a constantly running dashcam. The footage would really make them look bad.
Could it possibly be, just maybe, that the video disagrees with the “journalist's” opinion that it was performing dangerously? Could it be that an article that says “Tesla FSD performs admirably, swerves to avoid obstacle that would have caused a blowout” might not get nearly as many clicks and ad revenue? Maybe?
FSD is aware of where barriers and medians are. If it needs to swerve to avoid an obstacle it will go in whatever direction is safest. Sometimes that means towards a barrier. Sometimes the driver panicking and disengaging and taking over interrupts the maneuver and causes danger that wasn’t otherwise present. We will never know what actually happened because there is no evidence. Evidence that I promise you exists but for whatever reason was omitted.
If a cop said something outrageous and dangerous happened to them and they say they are completely clear of fault and wrongdoing, would it not be reasonable to want to see the bodycam footage? If for whatever reason the police department says “we don't have it” or “it's corrupted” or whatever other excuse, would that not raise eyebrows? The same situation applies here.
There are plenty of YouTube channels out there like dirtytesla, whole mars catalog, AI Driver, Chuck Cook, and many others that show and even livestream FSD. None of them have been in an accident, even on very early releases of the beta software. These people are comfortable with the beta and often don't take over control of the vehicle under any circumstances, even in their torture-test scenarios.
Is it at all possible, just maybe, that FSD isn’t as dangerous as you might think? Fear is often a result of ignorance.
I am extremely open to changing my mind here just show me some convincing evidence. Every tesla is recording all the time so it should be really easy to find some, no?
I'm sure I'm just a Tesla shill or fanboy, whatever. The truth is I'm just looking for facts. I would like to know why people feel this way and are so afraid of new technology despite overwhelming evidence that it is saving lives.
Wow that’s sure a lot of text for someone that didn’t read the article.
The author states that despite having storage plugged in, he was not given the option to save a recording.
Hilarious telling them to read the article first when you couldn’t even be bothered to read their question before replying.
I read it just fine. He asked for an example of a life-threatening accident caused by Full Self Driving. I noted that two examples were listed in the article. The ONLY difference was that the driver prevented the accidents by being aware. FSD was going to cause accidents without intervention. I guess in your world people are supposed to do nothing to avoid a major accident. Hilarious that you love FSD so much that you're willing to defend a billionaire who wouldn't piss on you if you were on fire. Billionaires are not your friends. FSD is a BETA feature that doesn't work properly. Take your love somewhere else and away from my comment, because you read it, didn't understand it, and fired off a reply claiming I didn't do something I did. The next time you want to have a discussion, come prepared, or don't come at all!
Ah, the “only difference” in your two examples of life-threatening accidents occurring is that no accident occurred in either example? That's quite the difference if you ask me… This isn't a level 4 or 5 system, so driver intervention is required. These systems can't improve without real-world testing; meanwhile a hundred people die on the road every single day. I guess you'd prefer more people die on the road from drunk or distracted drivers than have manufacturers roll out solutions that aren't absolutely 100% perfect, even if they're more perfect than human drivers most of the time.
Your obsession with Musk is clouding your judgment. I made no mention of him, nor do I like or defend him. This tech wasn't built by Musk, so who gives a shit about him in this discussion?
I am not obsessed with Musk in any form, but the fact of the matter is that when you have FSD systems that fail to do the thing they are supposed to do, then maybe it's not the best idea to roll them out to the entire world. Maybe it's better to continue with more limited testing. You act as if all drunk and distracted driving will stop when FSD is used, and that simply isn't the case. Many people still use gasoline-powered cars, and people still drink and drive even though it's dangerous to do so. Furthermore, FSD will lead to more distracted driving, because people will assume that self-driving means the car will take care of everything and there is no need to be vigilant.
The plain truth is that while FSD may be the future, rolling it out despite knowing it isn't ready is not the solution; it's irresponsible and will cause harm. The near-accidents that you aren't concerned with would most likely have killed the driver and probably other people too. Our difference of opinion here is that you believe it's okay if people die as long as the testing shows there is a chance they won't die in the future, and I think if anyone dies it's too much. The feature clearly isn't ready for prime time and needs more limited real-world testing, but the fact of the matter is that testing doesn't bring in money.
Your inability to even consider that a worldwide rollout might not be the best idea right now, since the testing shows the car isn't ready, shows that you really aren't arguing in good faith. You have chosen the position that FSD is good and ready even when confronted with articles like the one above showing it isn't. I would wager that a lot of people want the era of FSD; they just want it when it works. Keep the rollout more limited and do further testing. When mistakes happen, take the time to figure out why and how they can be prevented in the future. You argue testing is needed, but are in favor of a full rollout now even though we need lots more limited real-world testing. Both can't be true. Time to think about what you really want, because I don't think you know… And accusing any person who doesn't want a complete rollout of FSD today of having a bias against Musk shows that.
Self driving is not there, and it may never get there, but you are right. We can save so many lives if we get this right.
I don't know if Musk is responsible enough to be the one to get us there, though.
Lol who would trust their life to Elon Musk? 🤣
idiots and losers
Well, this article is written by FredTesla, who used to mod the TeslaMotors subreddit. Not only did he drink the Kool-Aid, he brewed the damn stuff.
This is like that show “Upload”; the guy literally gets killed by a car
That was a really good show.
You should finish watching that first episode before making such bold statements.
I mean, I think it's still a valid point. The car in the show was sabotaged, and that is definitely something that might be a thing once all cars self-drive, especially once they remove controls like steering wheels.
There hasn't been a Tesla FSD hack yet, but it would take spoofing a software update (and spoofing the authentication and certs, etc.)… The attacker would need access to a pretty massive supercomputer to make their own custom self-driving software, and today getting the certs and everything right is next to impossible… but even then it's only next to impossible, not impossible.
It may be difficult to spoof a certificate today, but tomorrow is a whole new day. To wit, OpenSSL has a pretty long history of serious vulnerabilities, despite being the best SSL library out there.
It is absolutely only a matter of time until the Tesla OTA functionality is compromised. There are too many moving parts for it not to be.
“Attack surface” is the term you want. Big software means big attack surface. So keep code lean for security as well as efficiency.
There are still a lot of other layers that would need to be compromised past the cert for such an attack to even be possible. Even so, I suspect when such an attack does happen it will probably be for stealing cars: your car would just wake up in the middle of the night and drive itself somewhere to be cut up for parts. A safety issue is less likely, since it's so easy to take over control of the car.
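For anyone curious what “getting the certs right” actually involves, here's a minimal sketch of the kind of signature check an OTA updater does before flashing anything. This is illustrative Python, not Tesla's actual pipeline; the key, filenames, and function names are all made up:

```python
# Hypothetical sketch of OTA update verification, NOT Tesla's real pipeline.
# An attacker without the manufacturer's private key can't produce a valid
# signature, no matter how good their custom self-driving build is.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

# Illustrative public key baked into the vehicle at manufacture time
# (this is just the RFC 8032 test vector, not a real vehicle key).
VEHICLE_ROOT_PUBKEY = bytes.fromhex(
    "d75a980182b10ab7d54bfed3c964073a0ee172f3daa62325af021a68f707511a"
)

def update_is_authentic(update_blob: bytes, signature: bytes) -> bool:
    """Return True only if the blob was signed by the manufacturer's key."""
    pubkey = Ed25519PublicKey.from_public_bytes(VEHICLE_ROOT_PUBKEY)
    try:
        pubkey.verify(signature, update_blob)
        return True
    except InvalidSignature:
        return False

# Refuse to flash anything that fails verification (filenames are made up).
with open("update.bin", "rb") as blob, open("update.sig", "rb") as sig:
    if not update_is_authentic(blob.read(), sig.read()):
        raise SystemExit("rejecting unsigned or tampered update")
```

And even this only covers one layer; a real system would also chain certificates, pin versions to prevent rollback, and verify again at boot.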
Don’t even need sabotage. You already share the road with cars that someone repaired under a tree with the cheapest parts they could find.
Oh look, more anti-right-to-repair sentiment.
No, cars repaired by people other than the manufacturer won't kill you.
I've been repairing cars for over 15 years. There's a massive spectrum of quality on almost any aftermarket replacement part. Literally the same part can range from $50 to $400 and the only difference is quality and durability.
Sometimes the cheap part is fine, sometimes they cause weird problems. Especially electrical parts.
Yeah, sure, but that is unlikely to kill you or someone else, and DIY repair is almost always good for the consumer.
Well hold on there, he survived the crash, and would probably have been ok. It was the upload that killed him.
Yeah, my bad 🤣 I meant the car technically endangered him to not live longer 😔
Frankly, it speaks incredibly poorly of the NHTSA that this kind of behavior is allowed. “Beta testing” a machine learning driving assistance feature on active highways at 70+ miles an hour is a recipe for disaster. Calling it Full Self-Driving while also not having guardrails on its behavior is false advertising as well as just plain dangerous.
NHTSA hasn’t had a permanent director in years.
I think NHTSA had a director for like… 2 or 3 months in 2022. But before that, it was blocked in the Senate. And before-before that, it was Elaine Chao’s Department of Transportation and she was incredibly anti-regulation.
Step 1 is that the citizens need to recognize what has happened to the federal apparatus. We’ve gutted our own government and safety regulators. Not just NHTSA, but also SEC, FTC, etc. etc. The anti-regulators / libertarians have the momentum with regards to laws in the past decade, and this is the natural result.
I like my Tesla but there’s no way I’ll be switching that thing on. They’re even calling it beta, what the fuck do people think that means?
FFS. He was testing a beta update at 73 miles per hour. Is he really expecting sympathy?
Maybe it shouldn't be released for real-world use with such major bugs then. Don't give me the crap that iTs DiFfErEnT because Tesla is a “technology company” either. It's a car; safety features on it should work damn near 100% of the time before release.
What crap am I giving? I’m just saying it’s a stupid idea to beta test self driving technology on the highway.
It's the same reason we don't take drugs that haven't been tested yet. You know, not just in lab rats.
Google treats its users as beta testers all the time. Difference is a phone won’t kill me when it crashes and reboots.
I thought all FSD updates were beta updates? Did I miss the announcement of FSD going GA and being stable?
If that’s the case, then yeah I probably wouldn’t test run a new update on the highway first. But I also have no idea if this issue happens at lower speeds as well.
Isn’t that the issue? He’s using something that’s still in beta on the highway.
Yes, 100%. Anyone is a fool to use Tesla “FSD Beta” pretty much anywhere. But Tesla markets it as totally safe to use anywhere and everywhere (but especially highways), so there's a point where you have to stop calling everyone that owns a Tesla a fool and acknowledge that the common denominator is Tesla and not just the owner's foolishness.
I didn’t call everyone that owns a Tesla a fool. I questioned whether someone who decides to risk their life to test a feature still in beta deserves sympathy.
Electrek has a long history of anti-Tesla clickbait. Take this with a grain of salt.
Teslas are factory-equipped with a 360-degree dashcam, yet we never see any footage of these alleged incidents.
Are you kidding me? YouTube is full of Tesla FSD/Autopilot doing batshit-crazy things.
So can you provide a link to an accident caused by FSD?
Musk just did a 20 minute video that ended with it trying to drive into traffic.
This one? Where does it drive into traffic? https://youtu.be/aqsiWCLJ1ms?si=D9hbZtbC-XwtxjpX
The video ended when he made an “intervention” at a red light. I’m not watching whatever link that is because I’m not a masochist.
Here's the specific timestamp of the incident you mentioned in case you wanted to actually see it: https://youtu.be/aqsiWCLJ1ms?t=1190 The car wanted to move through the intersection on a green left-turn arrow. I've seen a lot of human drivers do the same. In any case, it's fixed now and was never part of any public release.
The video didn't end there; it was near the middle. What you're referring to is a regression specifically with the HW3 Model S that failed to recognize one of the red lights. Now I'm sure that sounds like a huge deal, but here's the thing…
This was a demo of a very early alpha release of FSD 12 (current public release 11.4.7) representing a completely new and more efficient method of utilizing the neural network for driving and has already been fixed. It is not released to anyone outside of a select few Tesla employees. Other than that it performed flawlessly for over 40 minutes in a live demo.
Other than that it performed flawlessly for over 40 minutes in a live demo.
I get that this is an alpha, but the problem with full self-driving is that this failure pattern is way worse than what users can tolerate. If ChatGPT gave you perfect information for 40 minutes (it doesn't) and then huge lies once, we'd still use it everywhere, because you can validate the lies.
With FSD, that threshold means a lot of people would have terrible accidents. No amount of perfect driving outside of that window would make you feel very happy.
It has to perform flawlessly 99.999999% of the time. The number of 9s matters. Otherwise, you are paying some moron to kill you and perhaps other people.
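To put rough numbers on why the 9s matter, here's a back-of-envelope sketch. The fleet size and annual mileage are assumptions pulled from figures thrown around in this thread, not measured data:

```python
# Back-of-envelope: expected critical-failure miles per year for a fleet,
# as a function of per-mile reliability. All inputs are assumptions.
FLEET_VEHICLES = 400_000             # rough FSD fleet size cited in this thread
MILES_PER_VEHICLE_PER_YEAR = 10_000  # assumed average annual mileage

def expected_failures_per_year(reliability_per_mile: float) -> float:
    """Expected number of fleet-wide miles per year with a critical failure."""
    total_miles = FLEET_VEHICLES * MILES_PER_VEHICLE_PER_YEAR
    return total_miles * (1.0 - reliability_per_mile)

for nines in range(4, 10):
    reliability = 1.0 - 10.0 ** -nines
    print(f"{nines} nines: {expected_failures_per_year(reliability):>12,.1f} "
          "failure-miles per year across the fleet")
```

Under those assumptions, four 9s means 400,000 bad miles a year fleet-wide; eight 9s means 40. Every extra 9 is a 10x difference in how often the fleet tries to kill someone.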
Your posts here show you’re not interested in reality, but I’ll leave a link anyway
https://www.motortrend.com/news/tesla-fsd-autopilot-crashes-investigations/
Excited to see your response about how this is all user error.
I’m sure you’re just going to downvote this and move on without reading but I’m going to post it anyway for posterity.
First, a little about me. I am a software engineer by trade with expertise in cloud and AI technologies. I have been an FSD beta tester since late 2020 with tens of thousands of incident-free miles logged on it.
I'm familiar with all of these incidents. It's great that they're in chronological order; that will be important later.
I need to set some context and history because it confuses many people when they refer to the capabilities of autopilot and FSD. Autopilot and FSD (Full Self-Driving) are not the same thing. FSD is a $12,000 option on top of any Tesla, and no Tesla built prior to 2016 has the hardware capability to run FSD.
The second historical point is that FSD did not have any public release until mid-2022, with some waves of earlier releases going to the safest drivers starting in mid-2021. Prior to that it was exclusive to Tesla employees and a select few trusted beta testers in specific areas. Any of the issues in this article prior to mid-2021 are completely irrelevant to the topic.
Tesla's Autopilot system is an LKAS (lane keep assist system). This is the same as is offered by Honda (Honda Sensing), Nissan (ProPILOT Assist), Subaru, Cadillac, etc. Its capabilities are limited to keeping you in your lane (via a front-facing camera) and maintaining distance to the car in front of you (via radar, or cameras in later models). It does not automatically change lanes. It does not navigate for you. It does not make turns or take exits. It does not understand most road signs. Until 2020 it did not even know what a red light was. It is a glorified cruise control and has always been presented as such. Tesla has never advertised this as any sort of “hands-off” system where the driver does not need to pay attention. They do not allow the driver to take their attention off the road in FSD either, requiring hands on the wheel and constant torque, as well as eyes on the road (via an interior camera), for it to work. If you are caught not paying attention enough times, the system will disengage, and with enough violations it will even kick you out of the program.
OK, now that being said, let's dig in:
November 24, 2022: FSD malfunction causes 8-car pile-up on Bay Bridge
- I’m from the area and have driven this exact spot hundreds of times on FSD and have never experienced anything even remotely close to what is shown here
- “Allegedly” with FSD engaged
- Tesla FSD “phantom” braking does not behave like this, and never has in the past. Teslas have 360 degree vision and are aware of traffic in front of and behind them.
- Notice at the beginning of the video that this car was in the process of a lane change; this introduces a couple of possibilities as to what happened here, namely:
- Teslas do have a feature under Autopilot/FSD where, after multiple warnings for the driver to pay attention and no engagement, the car will slow down, pull over to the shoulder, and stop. This particular part of the Bay Bridge does not have a shoulder, so it stopped where it was. This seems unlikely, since neural networks are very capable of identifying what a shoulder is and that it's in an active lane of traffic, and even with Tesla's massive fleet of vehicles on FSD there are no other recorded instances of this happening anywhere else.
- This particular spot on the Bay Bridge eastbound has a very sudden and sharp exit to Yerba Buena Island. What I think happened is that the driver was aiming for this exit, saw that they were about to miss it, and tapped the brake and put on the turn signal, not realizing that they had just disengaged FSD. The car then engaged regen braking and came to a full stop.
- When a Tesla comes to a full stop automatically (an emergency stop), it puts the hazards on automatically. This has been a feature since the v1 Autopilot days. This car's hazards do not come on after the stop.
- What seems especially weird to me is that the driver continued to let the car sit there at a full stop while traffic piled up behind them. In FSD you are always in control of your own car and all it would have taken is tapping the accelerator pedal to get moving again. FSD will always relinquish control over the car to you if you tap the brakes or grab and move the steering wheel hard enough. Unless there was some mechanical issue that brought the car to a stop and prevented it from moving, in which case this is not the fault of the FSD software.
- Looking at how quickly (or rather how slowly) the car slowed down, this seems very clearly to be the car using regen braking, not emergency braking. I'm almost positive this means that FSD was disengaged completely.
- We don't have all the facts on this case yet, and I'll be anxious to see how it plays out in court, but there are definitely many red flags on this one that have me questioning what actually happened. I doubt FSD had anything to do with it.
- If my earlier point is true this is actually an instance of an accident being caused because the driver disengaged self-driving. The car would have been much safer if the driver wasn’t even there.
April 22, 2022: Model Y in “summon mode” tries to drive through a $2 million jet
- This one is a favorite among the Tesla hate community. Understandably so.
- Smart summon has zero to do with FSD or even Autopilot. It is a party trick to be used under very specific, supervised conditions.
- Smart summon relies exclusively on the front camera and ultrasonic sensors
- While smart summon is engaged, the user still has full control over their car via the phone app. If the car does anything unexpected, you only need to release your finger from the button and the car stops immediately. The “driver” did not do this and was not supervising the car; the car did not see the jet because it was entirely above the ultrasonic sensors, and as I'm sure you can understand, the object recognition isn't exactly trained on parked airplanes.
- The app and the car remind the driver each and every time it is engaged that they need to be within a certain range and within eyesight of the car to use it. If you remote-control your car into an obstacle and it causes an accident, it's your fault, period.
- Tesla is working on a new version of smart summon which will make this feature more useful in the future.
February 8, 2022: FSD nearly takes out bicyclist as occupants brag about system’s safety
- I suggest actually watching the video here. The headline is highly at odds with what is actually in the video, but the vid is just over an hour long so I bet most people don't bother watching it.
- “It wouldn’t have hit them, it definitely wouldn’t have hit them. Do we need to cut that?” “No, you can keep it in”
- If you look at what was happening on the car's display, it detected someone entering the crosswalk and stepping out into traffic on the left side. The car hit the brakes, sounded an alert, and swerved to the right. There was a bicycle in front of where the car swerved, but at no point was it about to “nearly take out a bicyclist”. It definitely overreacted here out of caution, but at no point was anyone in danger.
- Relatively speaking this is a very old version of FSD software, just after the first wave of semi-public release.
December 6, 2021: Tesla accused of faking 2016 Full Self Driving video
- lol
March 17, 2021: Tesla on Autopilot slams into stationary Michigan cop car
- Now we're getting into pre-FSD Autopilot. See the above comments about the capabilities of Autopilot. Feel free to compare these to other cars' LKAS systems. You will see that there are still lots of accidents across the board even with LKAS. That is because it is an assist system, and the driver is still fully responsible for and in control of the car.
June 1, 2020: Tesla Model 3 on Autopilot crashes into overturned truck
- Again, pre-FSD. If the driver didn't see the overturned truck and didn't disengage to stop, then I'm not sure how anyone expects a basic LKAS system to do that for them.
March 1, 2019: NHTSA, NTSB investigating trio of fatal Tesla crashes
- This one involves a fatality, unfortunately. However, the car was not self-driving. There is something else very important to point out here:
- The feature that allows Teslas to change lanes automatically on the freeway (Navigate on Autopilot) was not released until a year after this accident happened. That means that if AP was engaged in this accident, the driver deliberately instructed the car, by engaging the turn signal, to merge into that truck.
May 7, 2016: First known fatality involving Tesla’s Autopilot system
- Now we're getting way back into the v1 Autopilot systems, which weren't even made by Tesla. These use a third-party system called Mobileye and are even less capable than v2 Autopilot.
So, there we go. FSD has been out to the public for a few years now across a massive fleet of vehicles, collectively driving millions upon millions of miles, and this is the best we've got in terms of a list showing how “dangerous” it is? That is pretty remarkable.
Excited to see your response.
Interesting. You wrote an entire dissertation on why you think this is all a false flag about Full Self Driving, but it seems to be mostly anecdote or what you think is happening. Being a “software by trade” isn't enough to wave away the fact that something fishy is 100% going on with Tesla's autopilot system.
“The last time NHTSA released information on fatalities connected to Autopilot, in June 2022, it only tied three deaths to the technology. Less than a year later, the most recent numbers suggest 17 fatalities, with 11 of them happening since May 2022. The Post notes that the increase in the number of crashes happened alongside a rapid expansion of Tesla’s “Full Self-Driving” software from around 12,000 vehicles to almost 400,000 in about a year”
https://www.caranddriver.com/news/a44185487/report-tesla-autopilot-crashes-since-2019/#
You claim the timeline is important here and this is all post-2022.
Tbh the other side is also anecdotal. There’s no stats here.
What’s fishy about it? You realize 40,000 people die every year from car accidents, meaning 110 die every single day, and you’re referencing 17 fatalities spread out over a few years as some big crisis. This tech (from any manufacturer) isn’t going to prevent 100% of accidents, and there’s not much you can do when drivers willingly drive their car into the side of a semi just like they did before this technology existed.
I won't argue that AP, FSD, or any other system doesn't have its issues, but most of these responses are overblown sensationalism.
I am not a “Software by trade”; that was a typo. Believe it or not, I wrote that entire thing on mobile.
Correlation does not equal causation. Tesla sold far more vehicles in the past two years than ever before. Also, in 2019, 2020, and part of 2021, not a lot of people were driving due to the pandemic.
And, yes, a lot of the first incident I covered there was anecdotal or what I think is happening. Importantly, it is what I think is happening as someone with years and tens of thousands of miles of experience using the FSD beta. I do not have the facts, and importantly, neither do you. I am interested to see what comes out of that court case, but from where I sit I do not think FSD was involved at all.
Please let me know where I have misrepresented facts, I will either correct them or cite sources.
Again, Teslas come with a factory-installed 360-degree dashcam that records all the time. Where are all the videos of these FSD-related incidents?
deleted by creator
deleted by creator
One of many many examples: https://www.businessinsider.com/tesla-stops-tunnel-pileup-accidents-driver-says-fsd-enabled-video-2023-1?international=true
Tesla has a huge problem with phantom braking.
See my huge post about that very accident right below. Do you have any other “Many many examples”?
Here is more: https://www.motortrend.com/news/tesla-fsd-autopilot-crashes-investigations/
How many do you want?
FFS, that is the exact same article again. Please read my other comment (the huge one) and let me know if anything doesn't make sense or you find anything factually inaccurate.
deleted by creator
Wishful thinking that Tesla would publicly distribute footage of an accident caused by one of their cars…
It's saved onto a thumb drive that any user can pull off and use, or post the footage anywhere. It never gets uploaded to Tesla; only snapshots and telemetry are.
lol, the anti-Tesla crew will downvote even the most basic facts.
But is it technically the user’s data, or is there some clause in Tesla car ownership that says it is Tesla the company’s data?
Forgive me I’m ignorant of the fine details. I purchased a Chevy Bolt but had been looking into a Tesla as an alternative until Elon tried to be the super-cool Twitter guy.
I saw the videos of them running over infants in strollers. Does that count?
The ones from that guy who runs his own competing autonomous-driving company, who also refused to allow anyone else to perform the test with the car (which was all proven to be bullshit later because he was hitting the accelerator pedal)? There's a lot of misinformation and FUD floating around out there.
Dan O'Dowd of Green Hills Software. Spent millions of dollars on a Super Bowl ad hit piece and it backfired spectacularly. Although clearly there are still a few people who believe it. You should listen to the podcast with Whole Mars Catalog where he tries to explain himself. It's really wild.
Tesla took him to court and won
On FSD? Link, please.
Bud, we’ve seen literally thousands of videos of this happening, even from the Tesla simps. You’re seven years behind on your talking points.
FSD has only been out for less than 3 years.
The first public release was much later than the smaller beta, which I had access to. And my reference to seven years was Josh Brown being killed by autopilot in 2016.
Can you link a few? Something where FSD directly or indirectly causes an accident?
You’re working very hard in this thread to remain in the dark. You could take two seconds to look for yourself, but it seems like you won’t. Hell, they performed a recall because it was driving through stops. Something it’ll still do, of course, but they performed a recall.
Elon literally had to hit the brakes manually in a livestream of the self-driving tech as the car was about to go straight through a red light. Like, less than a week ago… SOOOO safe, all the news stories of it killing people are fake!
Yeah. These people aren’t even good liars, but they try their hardest to defend the complete nonsense and lies.
Can you link to even a single news article about a death involving FSD?
https://www.caranddriver.com/news/a44185487/report-tesla-autopilot-crashes-since-2019/
That was the top Google result, with a ton of other articles. I'm sure that somehow doesn't qualify and you'll move the goalposts in response. Go read some QAnon Elon jizrag news. That's where the real facts are.
And that's an article about Autopilot, which is a completely separate system. For someone with such strong opinions, you sure seem to lack even a basic understanding of the technology that you're discussing here, but I'm sure you'll just pull out more insults and keep making references to your current obsession, Musk, as if that makes your argument any more credible or factual.
The early alpha build not part of public release? That video? The one with the known regression in the model S?
That video was a demo of the new FSD beta 12 software, which is the first time a neural network has been in complete control of the car, resulting in a massive reduction in code and much smoother driving overall. Did I mention the part where it is unreleased to the public? Maybe there's a reason for that?
Other than that, the car performed flawlessly for the entire 40-minute drive.
The recall was most definitely not for “driving through stops”. It was to fix the behavior of doing a “rolling stop”, which is something 99.5% of drivers do, and which is how it learned to do it. Where do you see that it still doesn't come to a complete stop at stop signs?
I'm not trying to remain in the dark here; I'm just presenting facts. I'm very open to changing my mind on this situation entirely, just give me the facts. You said there were thousands of these videos; I'm just asking for evidence. I just get downvoted and nobody posts any of it.
I'm an AI professional and have been an FSD beta tester for almost 3 years with tens of thousands of miles logged. How can I possibly be the one “in the dark” here?
Given your posts and rampant Tesla fanboyism, I honestly wouldn’t be surprised if you’re Elon himself just anxiously trying to save face.
Then again, Elon would just publicly spout misinformation about it all, so it probably isn't. Still, it's surprising that people are just so obsessed with Tesla they can't take the bad with the good.
All I'm asking for is some evidence of the bad. Nobody can provide it. It really shouldn't be that hard.
You were provided evidence, and you disregard it and make excuses for it. It's hard to have a discussion if you just exclude all the evidence against your position.
Think of it another way: you're saying there's absolutely no way that FSD has ever failed in its publicly available software, even with hundreds of thousands of cars on the road? Use a logic test on yourself and ask if that's realistic.
FSD makes a TON of mistakes. I've had the beta since the first public release. I don't trust it to do anything more than lane holding and cruise control, with maybe some supervised lane changes. But it's a beta. I understand that I am helping to test beta software.
FSD in its current form should not be given to everyone. Tesla had it right when they gave it only to proven drivers (okay, it would have been better to test with paid employees, but I digress).
FSD right now is like handing the keys to your 15-year-old child and going to sleep in the back while they drive you home.
Can you point to this evidence? I don't see it anywhere.
Also, busting out a strawman argument one reply into the discussion isn't a good sign for the strength of your argument.
Look at the replies. I'm not going to sit and hand-pick them out for you; there are plenty in there. Plenty of people either posting videos or stating first-hand accounts of issues with FSD.
Not sure where you see a strawman either. But whatever. If you aren't seeing any evidence despite the many posts, and don't see how impossible a perfect record is, then you won't be convinced by any evidence, or you're just a troll.
I have looked at the replies and there are only a couple links about Autopilot crashes from users who think this is the same as FSD when it isn’t.
Your strawman is claiming that this user is saying FSD is perfect and has never had a failure. Nobody is arguing that. You guys keep mentioning all the deaths related to FSD, yet nobody has been able to provide a single one as evidence.
Removed by mod
Please let me know where I stated anything inaccurate in the comment about the single incident that has been dug up across 500k FSD cars and millions of miles traveled self-driving.
Also, let's please keep this civil and avoid name-calling. I hate Elon as much as anyone else, and he deserves pretty much all the hate he gets. But that doesn't change the facts. It's not like he was responsible for writing even a single line of code in FSD, or designed or built any of the cars himself.
deleted by creator
Ah yes, there’s no readily available footage of the dead bodies flying into the street or being crushed under the wheels so it’s made up. Of course.
Not all accidents are that violent. I would even accept a video of a simple fender bender to prove that the FSD beta causes accidents with any sort of frequency. Those should be pretty common if FSD is as dangerous as a lot of people are implying, right?
Wait, are you now suggesting you won’t accept that Teslas with FSD ever get into accidents without video evidence? FSD is perfect?
No, I would never suggest that. The overwhelming consensus here is “FSD is dangerous, more dangerous than humans.” I'm asking for any proof of that. So far, nothing. If they were getting into accidents all the time, there would be all kinds of footage, no? The fact is that even at this beta stage it's already safer than human drivers. That apparently rubs people the wrong way for some reason. Don't we all want safer roads?
You sure did suggest that when you said you would even accept a fender bender.
Do you have any footage to share of FSD fender benders? If not, how can you even claim it's dangerous? Every car equipped with FSD hardware has a 360-degree dashcam. It should be really easy to find footage where FSD is at fault in an accident.
Again, you're suggesting they're perfect by implying they don't even get into fender benders.
Look, I don’t like children either but wanting more child mowing cars out on the road is pretty twisted.
Hilariously I’ve also seen them accused of a pro-Tesla bias. Personally I think they are pretty balanced.
They are for sure not balanced. Fred might have become more realistic about Elon and his bullshit once it was guaranteed he would never get his Roadster. That doesn't mean he's balanced.
They do whatever gets them clicks. Facts do not matter.
And this opinion is based on what? Obviously every online news source is concerned with increasing readership. But I’m not aware of any consistent factual issues in their reporting.
Honestly, the quality of journalism in that article is pretty low. Some of the points are valid, but most are just nitpicks about the little opinion pieces at the ends of articles. I don't find those particularly valuable, and they sometimes contain bad takes as pointed out here, but that's not an issue of factual reporting. So the worst they've identified is a few minor omissions, which, sure, but if you write thousands of articles that's going to happen.
And by the way, this article is making the case that Electrek is deliberately biased towards Tesla, not away from them. So if anything it undermines your point.
I think the scandal about car referrals was pretty suspicious, but again, when you look at their reporting it comes across as pretty balanced. Perhaps you could argue they talk too much about Tesla, but they cover the good and the bad. And I would say almost everyone in America has been talking about Tesla too much for quite some time.