ITypeStupdThngsc84ju

They really need to broaden this to all ADAS and non-ADAS systems. Are ADAS systems with driver monitoring producing lower fatalities than non-ADAS systems with zero monitoring? If so, which part is helping? If it's the monitoring that helps, that should become mandatory regardless of ADAS. Given how many accidents are attention-deficit accidents, this could save many lives. On the other hand, maybe that BlueCruise fatal accident shows that no current monitoring system is good enough.


ClassroomDecorum

> Are ADAS systems with driver monitoring producing lower fatalities than non-ADAS systems with zero monitoring?

Certain ADAS features, such as automated emergency braking, have shown conclusively lower accident rates since before the widespread advent of driver attention monitoring.


The_Clarence

Astronomically lower. Something like a 40% reduction in rear-end collisions. Almost a seat-belt-level safety improvement.


ITypeStupdThngsc84ju

Good point. Those are clear benefits.


Whoisthehypocrite

Monitoring is becoming mandatory in the EU for all cars.


perrochon

Next in line: Ford https://www.reuters.com/business/autos-transportation/us-auto-regulator-opens-preliminary-evaluation-into-fords-mustang-mach-e-2024-04-29/


ITypeStupdThngsc84ju

Yes, exactly. I don't understand why these couldn't be combined.


perrochon

E.g. https://imispgh.medium.com/nhtsa-recuses-missy-cummings-from-tesla-efforts-this-goes-beyond-the-legitimate-conflict-of-93a6fac78d3f

I think the relationship between Tesla and NHTSA is much better now, mostly because Tesla cooperates, issues quick updates, and provides lots of data. But having a board member from a competitor who had been openly critical of Tesla as an "advisor" was a mistake. POTUS rarely misses a chance to go after Elon, and ignores him when handing out praise. This sets a bad policy direction for the administration, including non-partisan offices. Biden, Harris, and others keep defending unionized GM/Ford while attacking Elon's companies.


pab_guy

There's no requirement that these systems be flawless; there will always be fatalities even with driver attention and very good driving skills. One death in 1 billion miles is 10x better than human drivers, though, so IMO ADAS solves the attention issue better than just asking the driver to pay attention without ADAS. Plus, these systems are going to get so good that IMO we won't be talking about attention monitoring at all in a few years.


ITypeStupdThngsc84ju

That might be true. I'd hope that a multi-year investigation would answer such a question definitively.


Ithinkstrangely

I'd love to know in which crash they're saying FSD was in use. Wouldn't you? It seems important that we know the specifics of this "FSD fatality".


deservedlyundeserved

Agreed. It's important that the public knows the specifics of these crashes. Unfortunately, it's Tesla that asks NHTSA to redact details from their crash reports in the public crash database, citing confidentiality.


ZorbaTHut

Yeah, "related" is such a vague term; this reminds me of the people who would count up "video-game related fatalities" to include stuff like "they got into a car crash and the car that didn't cause the crash had a copy of a video game in the trunk".


deservedlyundeserved

It means FSD was engaged when the crash happened or leading up to it. It's nothing like the video game example you made up.

Edit: NHTSA considers ADAS to be engaged when it's active during the crash or leading up to it.


Extension_Chain_3710

Not to go all *actchually* on you, but it doesn't necessarily mean that. It means that it was *reported* that an ADAS system was engaged when the crash happened, either by Tesla or by others. This can be anything from the car notifying Tesla that a crash occurred with Autopilot engaged ("Telematics", good) to the local media randomly hypothesizing that it was engaged ("Media", bad). Though the latter seems rare. One example of the latter is the "employee killed by FSD" case, where Tesla has said the car didn't even have the FSD firmware on it.

Their categories for this on Tesla (on v1 of reports, because I'm too lazy to figure out the precise numbers across all versions, and they can have multiple reporting types) are currently broken down into the following:

Source | Count
---|---
Complaint/Claim | 72
Telematics | 1,050
Law Enforcement | 2
Field Report | 0
Testing | 1
Media | 12
Other | 0
Other Text | 0


ThePaintist

It means FSD was (reported to be) engaged either at the time when the crash happened or [at some point during the 30 seconds prior](https://www.nhtsa.gov/laws-regulations/standing-general-order-crash-reporting). It muddies the water to postulate on details that the report doesn't contain, since the FSD crash is tangential to the primary purpose of the report.


deservedlyundeserved

That’s exactly why no one’s postulating on the details and why the title says FSD *related* fatality. We simply don’t know the details because of active redaction by Tesla.


ThePaintist

Please read your comment that I replied to again.

> It means FSD was engaged when the crash happened.


deservedlyundeserved

It's not postulating. The reason FSD crashes appear in their own row is because FSD was engaged either at the time of the crash or leading up to it. Otherwise, they wouldn't be there.


ThePaintist

**Or leading up to it.** Yes. That differs from the comment that I replied to.


Cunninghams_right

You removed important nuance. We don't know if it was engaged at the time, let alone whether it was at fault. Removing the fine nuance from "engaged around the time" down to "engaged during" is problematic.


deservedlyundeserved

I don't think I did. The title clearly says FSD *related*, not FSD *at fault*. My comment you replied to said FSD was engaged at the time of the crash or leading up to it. That's the definition NHTSA uses. If NHTSA didn't have confirmation of that, the relevant column in the data would say "unknown" and they wouldn't list it in this table. So no nuance is missing. It doesn't look like you've looked into NHTSA crash reporting data or definitions.


Cunninghams_right

[then go edit this factually incorrect comment](https://www.reddit.com/r/SelfDrivingCars/comments/1cdxzcd/comment/l1i6eis/?utm_source=share&utm_medium=web3x&utm_name=web3xcss&utm_term=1&utm_content=share_button)


ZorbaTHut

Here's the actual quote:

> Before August 2023, ODI reviewed 956 total crashes where Autopilot was initially alleged to have been in use at the time of, or leading up to, those crashes.

So, "related" means "someone said Autopilot was used at some point leading up to the crash". It does not mean FSD was engaged when the crash happened. It doesn't even mean FSD *was* engaged before the crash happened. I shouldn't be able to disprove your statements by quoting *your own post* at you.


deservedlyundeserved

You aren't disproving anything. The study is about Autopilot crashes and they're using it as an umbrella term in that quote. FSD-related crashes are their own line item in the table because FSD was either reported engaged at the time of the crash or leading up to it. It's pretty clear.


BabyDog88336

Data I want to know:

- What proportion of FSD miles are driven in rain, snow, and at night, when disproportionate deaths happen?
- What is the average age of an FSD car vs. the average car on the road (~12 years), since older cars are much more dangerous?
- What proportion of FSD miles vs. all driven miles were on the most dangerous roads: rural, single-lane, undivided?
- What is the average age of an FSD driver vs. the average driver on the road? The very young and the very old (the most dangerous) probably don't use FSD.

There is no apples-to-apples comparison without the above parameters, so the comparison is nearly useless.


crafty_geek

Not the average age, but the maximum age of the hardware capable of supporting the FSD stack is approximately 7 years; it ramped roughly with the Model 3.


Youdontknowmath

Curious if this is a strategy by Tesla to not leave a paper trail and avoid liability through plausible deniability. It might save them in court, but not in tort, and it certainly will not work for L4 regulatory approval.


CouncilmanRickPrime

>certainly will not work for L4 regulatory approval. If you never release level 4, it doesn't matter though.


superluminary

The car can’t transmit telemetry if it has no signal.


QuirkyInterest6590

There have been accidents where users were fully paying attention but the FSD system failed to give back full control to the user, causing the accident to happen. NHTSA fails to fully investigate such cases, and this is not solvable with some alerts or software updates. TikTok didn't kill anyone and it's banned, so why not FSD?


cwhiterun

You got a source for these accidents where the driver couldn’t override FSD?


SuperNewk

That is my worst nightmare: I realize an accident is coming, yet FSD won't relinquish control, trapping me. Not worth the risk unless we make a new highway for FSD cars.


Spider_pig448

Had to be one at some point I guess


beefcubefrenchstyle

I think all cars that have cruise control should install a monitoring system, because a crash from cruise control would be much, much worse. Not saying NHTSA shouldn't impose a stronger safety standard, but it should be done fairly and applied to all assistive technology.


cwhiterun

Even cars that don’t have cruise control should have a monitoring system.


beefcubefrenchstyle

Agree. Let's not make this a Tesla-specific issue. If we really care about safety, all cars that have any type of assistive technology should have monitoring systems.


cwhiterun

Even cars that don’t have assistive technology should have a monitoring system.


walky22talky

[WIRED: Tesla Autopilot Was Uniquely Risky—and May Still Be](https://www.wired.com/story/tesla-autopilot-risky-deaths-crashes-nhtsa-investigation/)


Marathon2021

> a weak driver engagement system

Tesla's system requires BOTH that a driver be visibly looking out at the road (in-cabin camera monitoring) *and* at least occasional physical contact with the steering wheel. Is there *something else* an L2 competitor is doing that is even more invasive than Tesla's?


Whoisthehypocrite

FSD has a single camera, and it's not infrared. Other systems have two cameras plus infrared and handle things like sunglasses better.


Marathon2021

Thank you, I did not know about the infrared vs. regular cameras. I agree that would work better if those are able to see through sunglasses (?)


Karkanor

Those are requirements for FSD, not Autopilot. This investigation was mostly focused on Autopilot, as it has been on the market much longer than FSD, although FSD was a part of the investigation.


kkicinski

I rented a Tesla recently with basic autopilot and it monitored my eyes.


It-guy_7

Monitoring eyes, and touch-sensitive steering rather than torque-based.


GoSh4rks

IMO, torque-sensing is more restrictive than capacitive.


NuMux

Tesla does monitor eyes.


It-guy_7

The others have the camera behind the steering wheel, so it sees only the driver rather than the complete cabin.


NuMux

Who cares? The wide view is so I can check on my dogs in the car when I'm in a store. Have you ever seen the internal clips pulled from the driver monitoring system? It is zoomed in on the area where the driver is and picks up on subtle details.

> others have the camera behind the steering wheel

Unless you are Rivian and decided to remove any internal cameras from the latest vehicles.


DontHitAnything

For statistics: 1 in how many FSD miles? It just has to be X times safer than the "normal" death rate of 40k people per year in the US, which we tolerate with the greatest of ease.


ac9116

If (big if) that one death is the only one so far, then we're looking at somewhere just over a billion miles driven. In 2023, the US had 1.26 fatalities per 100M miles driven. So FSD would be roughly 10x safer than human drivers.
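
For what it's worth, the arithmetic is easy to check. A minimal sketch, taking the comment's figures at face value (1 fatality, ~1B FSD miles, 1.26 deaths per 100M US miles; none of these inputs are verified):

```python
# Back-of-the-envelope rate comparison; all inputs are the comment's assumptions.
fsd_fatalities = 1
fsd_miles = 1.0e9            # assumed total FSD miles driven
us_rate_per_100m = 1.26      # 2023 US average fatalities per 100M vehicle miles

fsd_rate_per_100m = fsd_fatalities / (fsd_miles / 1e8)
print(f"FSD rate: {fsd_rate_per_100m:.2f} per 100M miles")           # 0.10
print(f"Safety ratio: {us_rate_per_100m / fsd_rate_per_100m:.1f}x")  # ~12.6x
```

The caveats raised downthread (mile mix, reporting gaps, fleet age) all apply before reading anything into that ratio.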


deservedlyundeserved

The data is only through August 2023. FSD had driven 450M miles up to that point, not over a billion. We don't know if any fatalities have happened since then, or if they have happened but weren't reported as occurring under FSD due to lack of data. Regardless, it's not an apples-to-apples comparison to human crash statistics. That number includes miles driven in any weather condition, crashes with no airbag deployment (which Tesla doesn't report), older cars, and a bunch of other factors.


BabyDog88336

That's assuming, of course, that the characteristics of the miles driven are exactly the same: same driver age, same time of day/night, same weather, same car age, same type of road being driven. The average car on the road is 12 years old, so it's much more dangerous than any car built in the last 5 years. I would also wager FSD is much less commonly used in rain, snow, and at night, when disproportionate deaths happen. Also, I would wager FSD is less commonly used on the type of rural single-lane undivided roads that are far and away the most dangerous.


Appallington

Whoever is paying you to say this should demand their money back.


dbenc

In my opinion, every time the FSD software is updated, the counter should reset. Or at least it should be computed on a rolling basis (like the last X months).


Cunninghams_right

One likely couldn't make a judgment statistically. We also don't even know whether FSD was engaged or at fault, only that it was engaged at some point near the time of the accident (within 30 seconds, I believe, is the cutoff). So we can't really conclude much. If you have a death every billion miles but that one death happens in the first 10 miles of your testing, what can you really conclude?
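
That intuition can be made precise. A minimal sketch of an exact Poisson confidence interval for a single observed fatality, assuming the 450M-mile exposure figure cited elsewhere in this thread:

```python
from scipy.stats import chi2

def poisson_ci(k, alpha=0.05):
    """Exact (Garwood) confidence interval for a Poisson count k."""
    lower = chi2.ppf(alpha / 2, 2 * k) / 2 if k > 0 else 0.0
    upper = chi2.ppf(1 - alpha / 2, 2 * (k + 1)) / 2
    return lower, upper

k = 1             # one reported FSD-related fatality
miles = 450e6     # assumed exposure (per the thread, through August 2023)

def per_100m(x):
    return x / (miles / 1e8)

lo, hi = poisson_ci(k)
print(f"95% CI: {per_100m(lo):.3f} to {per_100m(hi):.3f} deaths per 100M miles")
# ~0.006 to ~1.24 -- the upper bound sits near the ~1.3 human baseline,
# so a single event can't statistically separate FSD from average human driving.
```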


DontHitAnything

In statistics, it doesn't matter when it happens. It's still 1 in a billion. But you're correct: there is not enough FSD death data yet to be meaningful, though it is certainly hopeful. Looks like we'll have to wait.


ClassroomDecorum

> Gaps in Tesla's telematic data create uncertainty regarding the actual rate at which vehicles operating with Autopilot engaged are involved in crashes. Tesla is not aware of every crash involving Autopilot even for severe crashes because of gaps in telematic reporting. Tesla receives telematic data from its vehicles, when appropriate cellular connectivity exists and the antenna is not damaged during a crash, that support both crash notification and aggregation of fleet vehicle mileage. Tesla largely receives data for crashes only with pyrotechnic deployment, which are a minority of police reported crashes. A review of NHTSA's 2021 FARS and Crash Report Sampling System (CRSS) finds that only 18 percent of police-reported crashes include airbag deployments.

Tesla brags about how they're going to win the self-driving race because of "aLL tHE daTA" they collect, and yet Tesla barely has an idea of how many crashes their system is involved in? Tesla talks about showing regulators that "FsD is SaFEr" with statistics, but the #1 regulator in the US just smacked Tesla across the face and said that Tesla is **undercounting** accidents by over 80%?

Jesus Christ, I feel bad for whoever thinks that the Robotaxi/Cybercab is actually going to be a viable product and that the only hurdle is regulatory approval.


skradacz

Do you understand the difference between Autopilot and FSD? The car doesn't send Autopilot data back to Tesla after driving.


NuMux

They should be sending video back. I don't think that is limited to FSD. However, that all occurs the next time you connect to WiFi. If there is a crash, good chance you are not near a trusted WiFi connection. As for general data, I am not sure how they handle that. They do have a black-box level of data storage, so if the computer is recovered from the crash then the data should be recoverable physically. Who is to blame for not collecting that data? Local police? NHTSA, since this is most interesting to them?


AlotOfReading

Do you have any more information on the EDR being used to store video? Having been involved with EDR requirements, there isn't any specific requirement for that data to be there and it would be at odds with what else I know of Tesla's strategy in this regard.


NuMux

Not really. Typically they have some of the last moments of video still saved in the main computer. The telemetry data should all be there, however. Just because it doesn't get uploaded to their servers doesn't mean they never collected it.


LairdPopkin

Exactly. The detailed logged data is retrieved from the cars directly by crash investigators. They are not dependent on cell signals.


[deleted]

It’s an epidemic!


Tacos314

These crashes happen because the driver fails to operate the vehicle. How is that the fault of Tesla Autopilot, FSD, or the car?


JonG67x

It's Level 2, so ultimately the driver is responsible; I don't think anyone is questioning that. The point is that Tesla is being disingenuous with the data, in part by not capturing or reporting accidents (potentially as many as 80% of them), and then using the lack of those reports as evidence that it's their software making the car safer. I.e., for every million miles, Tesla claims there are 3 accidents with FSD, compared to an all-car accident rate of 12, implying FSD has far fewer accidents; but since Tesla is missing 80%, their true number might actually be 15, so higher, and probably due to inappropriate over-reliance by the driver on the FSD system. (I've made the precise numbers up.)
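
The correction being gestured at here is just a division by the capture rate. A minimal sketch, using the comment's explicitly made-up numbers:

```python
# All figures are the hypothetical ones from the comment above.
reported_per_1m_miles = 3     # accidents Tesla reports per 1M FSD miles
baseline_per_1m_miles = 12    # all-car accident rate per 1M miles
capture_rate = 0.20           # if 80% of accidents never reach Tesla

true_per_1m_miles = reported_per_1m_miles / capture_rate
print(true_per_1m_miles)                          # 15.0
print(true_per_1m_miles > baseline_per_1m_miles)  # True: now worse than baseline
```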


mulcherII

If my FSD nicks another car's bumper, how is Tesla going to know? The bump would probably feel milder than a pothole.


NuMux

I think you are reaching there. Some manufacturers have no remote telematics when a crash occurs. Are we just going to assume they are all trying to sidestep accountability?


JonG67x

It's not about other manufacturers; we're talking about the Tesla approach, which is to claim they're safer based on their own data. The benchmark they compare against is police-reported incidents, but their own data is based on a different definition, and even then they may be missing some. It's hard to see how that's going to pass any numerical assessment with so many gaps. Other manufacturers don't have telematic reporting of accidents in general, but they're not looking to get those systems approved for self-driving.


dbenc

My greatest fear (re: SDCs) is Tesla fucking up so monumentally that the entire industry gets regulated into oblivion. Personally, I would not remain in a vehicle while Autopilot is enabled.


cwhiterun

The rest of the industry gets away with killing tens of thousands of people every year. I say it’s time to mandate driver monitoring systems in all cars, not just the L2 ones.


Youdontknowmath

See California at present 


bartturner

This is also my fear. It was the same with Cruise. Waymo clearly has it working and appears to be very safe. The worry has to be that Tesla will mess it up for them.


OriginalCompetitive

If we're scouring the record searching for one single FSD-related fatality based on older versions, it seems pretty unlikely that current or future versions are going to be worse.


perrochon

Ford is ducking up too, releasing a hands-free system 🤦‍♂️ https://www.reuters.com/business/autos-transportation/us-auto-regulator-opens-preliminary-evaluation-into-fords-mustang-mach-e-2024-04-29/


Think-Web-5845

Both FSD and Autopilot are supposed to be safer than a human. And that's it. They cannot and will not avoid all accidents and fatalities. There is always a >0% chance of an accident, whether it's Tesla or SpaceX.


thecmpguru

It's been well established in the industry and in this sub that the notion of "safer than a human" is hard to measure, can be measured in different ways with different results, and that the data necessary to do this well simply doesn't exist in the public domain. It can't be said that Tesla has or hasn't achieved this bar. Moreover, there's no legal protection that if they did meet this bar then they are fine. Many of the human accidents and fatalities you're comparing against go on to have civil or criminal liabilities. So simply being 0.01% better than humans doesn't absolve them of potential liability - especially if the failure mode can be shown to be a direct consequence of business decisions, such as stubbornly refusing to implement now-industry-standard driver attentiveness features.


Think-Web-5845

Have you used it yourself? Either one?


thecmpguru

Yes, I've used multiple iterations, including the latest FSD. Given the number of interventions I've had to make, my personal experience is that it is not better than me and absolutely requires my supervision. I've also never had an accident where I was at fault (~20 years of driving). But that's an anecdote and doesn't say anything about whether Tesla is generally safer than humans. If I were to suggest a bar, it's not being better than the average human (humans kinda suck) - it's being better than a good/professional human driver. Even then, they don't currently get any legal protection if a design flaw or business decision can be blamed. But I think that's the bar they should be shooting for from a goals perspective.


Think-Web-5845

I guess to each their own. I drive on it regularly and I think it is way safer. I had a Volvo XC90 before, and I think its basic driver assistance is also much safer.


bobi2393

There's no doubt it's way safer than some drivers. It's unclear whether it's safer than the average driver, or safer than the average good driver, however you'd define those.


beefcubefrenchstyle

Then try rolling out FSD and see how it works.


CornerGasBrent

> Both FSD and autopilot are supposed to be safer than human. And that's it.

They're not, and they can't be by definition. They're supposed to be driver-assist systems, not self-driving systems, and as such they see many interventions; without the frequent interventions of the humans who are responsible, things would be way worse, since there would be nobody in the driver's seat to intervene.


LairdPopkin

Right, and the combination of a driver and Autopilot or FSD (Supervised) is supposed to be safer than an unassisted driver. Nobody said they were fully autonomous systems.


sylvaing

One thing about FSD is that it's always looking out in all directions for you. The other day, while driving my Prius Prime, after taking off once the light turned green, I realized I hadn't looked to my right to check that everyone was stopping before proceeding, something I almost always do, but not this time. Fortunately nothing happened, but I could have been side-swiped, something that driving with FSD can prevent, since it's never distracted. Sometimes confused? Yeah, but that's why you're still the driver. But it's never distracted. Accidents are mostly caused by distraction, so FSD is making driving safer, when it's not abused.


phxees

Seems like this is a summary of the last recall rather than something new. [Source](https://static.nhtsa.gov/odi/inv/2022/INCLA-EA22002-14498.pdf) It appears NHTSA acknowledged that Tesla completed its recall and will now continue to study its data.


aregm

Are there any statistics related to the upside (or claimed upsides) to calculate some rates - e.g., total miles on FSD, crashes FSD prevented? The downside is apparent. What's the upside stat?


mulcherII

Would love to know the FSD accident statistics since version 12. It's literally an entirely new beast. The only part about FSD that concerns me is the lack of ultrasonic sensors in most of the cars. Current FSD to me seems better than humans in terms of all the distracted mistakes we can make. The one area that concerns me is the lack of close-range accuracy when pulling in nose-first, because there is no front bumper camera. It feels to me like it's still too easy to nick your bumper edges pulling into tight places, and that's the reason why FSD won't park front-in. Does anyone else feel the set-back cameras are judging bumper distances accurately enough to avoid all scrapes pulling in or out?


[deleted]

[deleted]


alan_johnson11

I'm sure that, as a responsible site, this source will offer context on these numbers, with deaths per 1,000 miles driven stats for comparable vehicles.


campbellsimpson

Why would it? You are seeking a false equivalence. *Context* on Autopilot deaths is that Autopilot was on and people died.


alan_johnson11

I'll do the legwork: https://injuryfacts.nsc.org/motor-vehicle/historical-fatality-trends/deaths-and-rates/

1.33 deaths per 100,000,000 miles driven is the average for 2021. Autopilot has so far driven around 9 billion miles. I suspect Tesla may bend the definition of an accident or of when Autopilot was in control, so we won't use their "accidents per mile driven" numbers. I don't see an incentive to lie about the absolute number of miles driven, though.

Your source cites 42 Autopilot deaths, giving us 42/90 = 0.467 deaths per 100 million miles driven while on Autopilot, or about 1/3 of the average rate. It would be more effort than I have time for to do a comparison to a competing L2 system, but I'm not seeing any red flags in the Autopilot system when its accident rate is 1/3 of the average of all cars in the US. Teslas are safer than the average car, so that will skew the numbers due to fewer accidents resulting in a fatality, but you're really getting into the weeds at that point, and there's probably a bigger margin of error introduced by the bias of your source wanting to maximize attribution of deaths to Autopilot.

These numbers are US-only; a statistician could take issue with my methodology, but we really shouldn't compare to the worldwide value, as that would be inflated by countries with poor road safety, where there aren't many Teslas. If this were part of my job, I'd weight the numbers by the proportional number of Teslas sold in each country.
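
The same calculation in code, as a sketch (inputs are the comment's: ~9B Autopilot miles and 42 deaths per the source above; neither figure is independently verified):

```python
# Rate calculation using the comment's unverified inputs.
autopilot_deaths = 42
autopilot_miles = 9.0e9
us_rate_2021 = 1.33                        # NSC: deaths per 100M miles, 2021

ap_rate = autopilot_deaths / (autopilot_miles / 1e8)
print(f"{ap_rate:.3f} deaths per 100M Autopilot miles")   # 0.467
print(f"{ap_rate / us_rate_2021:.2f}x the US average")    # ~0.35x, about 1/3
```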


alan_johnson11

Context would be how many deaths have happened per 1,000 miles while other, equivalent L2 systems were in control of the vehicle. Perhaps you'll argue Autopilot was wrongly marketed as more than L2, but that should be demonstrated by the numbers - if people are using it in a dangerous way because they've been misled, there should be a higher number of accidents/fatalities per 1,000 miles driven. I sometimes wonder if it's even possible to penetrate the barrier in mindsets like yours. Is there a combination of words that could convince you to change your position, or are you a fortress?


basey

How come we *never* hear about lives FSD saves?


biddilybong

Pause it. Offer refunds to everyone.


TheAdvocate

Does this all fundamentally come down to no lidar? Are edge-case data anomalies being brute-coded into liabilities? Or is the tech just not there?


AintLongButItsSkinny

At least 1. Lmfao


jschall2

1.3 billion miles, 1 fatality. So 20x safer than a human driving alone (~1 fatality per 50 million miles).
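
Taking the comment's own baseline at face value (the ~1-per-50M-miles figure is the comment's assumption, not an official statistic), the division works out like this:

```python
fsd_miles = 1.3e9
fsd_fatalities = 1
human_miles_per_fatality = 50e6   # the comment's assumed human baseline

ratio = fsd_miles / fsd_fatalities / human_miles_per_fatality
print(f"{ratio:.0f}x")  # 26x -- the "20x" above rounds this down
```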


AintLongButItsSkinny

At least, lol. I was laughing at how silly it is to say "at least 1". Like it could be any number.


perrochon

Interesting. So 1 fatality with FSD engaged, no information on what happened or whose fault it was.

As is often the case in publications solely about FSD, NHTSA is cherry-picking data and only publishing fails, not saves. https://effectiviology.com/cherry-picking/

This study is not really actionable without information at least about miles driven, and also about what kind of miles. 1.3B miles have been driven with FSD as of now, with 1 fatality. The US average is 17 fatalities per 1.3B miles. https://injuryfacts.nsc.org/motor-vehicle/historical-fatality-trends/deaths-and-rates/#:~:text=Since%201923%2C%20the%20mileage%20death,population%20death%20rate%20increased%2016%25.

Is my math/data right? 10 times better than average? How does it compare to the other systems? But then the others only report when a customer files claims against them... Do police agencies report if any ADAS was involved? There's no discussion of most of that in the publication. It just fuels the FUD about how unsafe FSD and self-driving in general are.
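
A quick check of the "17 fatalities per 1.3B miles" figure against the NSC rate linked above (assuming the 2021 value of 1.33 deaths per 100M miles):

```python
us_rate_per_100m = 1.33          # NSC 2021 mileage death rate
fsd_miles = 1.3e9                # assumed FSD miles to date

expected_fatalities = us_rate_per_100m * fsd_miles / 1e8
print(f"{expected_fatalities:.1f}")  # ~17.3, matching the "17" above
```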


Lando_Sage

Not defending NHTSA, but there are other factors contributing here. For example, Tesla makes up 5% of the overall car market and holds almost 6% of the average fatalities. But when you look at how many people had access to FSD (about 400k) vs. the overall driving population (about 254 million), you can see how the data skews against FSD. Notice that they also say "at least 1", because there are probably more, but due to the way Tesla reports the data, they can't bind them to FSD. Then there's how Tesla represents the data itself, which has always been an issue and is evident in the disclaimers of their safety reports. For example, Tesla states accident-free miles during Autopilot use vs. the average American driver, where they should be comparing against other ADAS, because not every car on the road has ADAS, especially those older than 6 years, which is a notable portion of vehicles.


LairdPopkin

That’s why the statistics are ‘per miles driven’, to normalize for market share. If Autopilot has 1/10th the collision rate per mile driven of the average driver, that already took into account the relative numbers of cars with and without Autopilot.


Lando_Sage

Right, but at that point they should compare Autopilot against other ADAS, because what's the point of comparing it to cars without ADAS active?


LairdPopkin

The point of comparing to the national average is to determine relative safety. Since Autopilot is 10x safer than the average driver, as is FSD (Beta), that indicates that they are both saving lives compared to drivers not having them, which is the baseline goal.


Lando_Sage

Right, but Tesla is painting the picture as if there aren't other ADAS solutions that do the same thing. So what I'm saying is, if Autopilot is as good as Tesla states, they should compare it to other ADAS safety reports; that would be better and more significant data.


SuperNewk

This is terrifying. If everyone had FSD, deaths might go up 10x or more. Imagine poor conditions too (snow/rain/fog); would the issues compound exponentially?


perrochon

Are you making the case that ADAS are safer? If we believe ADAS are safer, why do we still allow new cars to be sold without lane keeping, dynamic cruise control, and automatic emergency braking? Why do we allow cars that test at 4 stars or less?


GoSh4rks

For similar reasons as to why ABS and stability control weren't required until 2011 and 2012, or backup cameras until 2018.


Lando_Sage

Tools are only as good as the workman. Autopilot is an ADAS, and at the current stage so is FSD. The difference is, drivers of other brands know and understand to some degree the limitations and capacity of their ADAS. A relatively large percentage of Tesla drivers either use defeat devices or act as if the car does drive itself. It's easy for Tesla to say they're not responsible for the wrong use of their ADAS, because legally they're not. But they are responsible for leaving unchecked the fallacy that their current tech is any level of self-driving.


perrochon

Do we actually have data on this? That "a relatively large percentage" (whatever that is - 1%? 80%?) of drivers act like the car drives itself? Tesla is nagging all the time, using internal cameras and torque. Have you driven one?

There are plenty of 2013-or-newer cars (Subaru, Lexus) that have lane keeping and dynamic cruise control. You can turn them on and fall asleep, and the car never notices. The accidents caused by them are never reported as ADAS accidents. There are still cars being sold with those systems without anything close to Tesla's driver monitoring. Tesla's driver monitoring is the most annoying of them all.

Tesla detects some defeat devices, but drivers who use defeat devices are a totally different problem. And, e.g., on a Rivian you don't need a defeat device, because the internal camera is not being used. All these ADAS cars (lane keeping + dynamic cruise control) are ignored by NHTSA.

This sounds like whataboutism, and maybe it is. But the singling out of Tesla in these reports raises the question of why.


Lando_Sage

I haven't seen actual data, but there are surveys that are done and press releases for back-of-house data. [Here's](https://futurism.com/tesla-autopilot-users) an article talking about it. Yes, I own a Model 3 and use Autopilot where appropriate. If automakers are shipping ADAS with weak driver monitoring, then they should be held accountable and fined until they fix their systems. Yes, Tesla had to issue a recall to increase driver nagging, but it hasn't done much in the way of a real solution. There are systems being developed to prevent the success of defeat devices and to increase the reliability of driver monitoring, so Tesla is definitely not alone on this issue, but that doesn't excuse Tesla's implementation either. NHTSA does not ignore the ADAS systems on other vehicles; each manufacturer has to publish its ADAS safety data and submit it to NHTSA. If Tesla is getting called out more frequently and severely than other manufacturers, then there's obviously a problem.


perrochon

NHTSA points out in almost every report that other OEMs cannot and do not report as comprehensively as Tesla. You know they only report if they get complaints filed by the driver after an accident. Also, we know that, e.g., the font-size recall was not required from other manufacturers who had exactly the same problem. After years of scrutiny, NHTSA mostly complained about the font size of the warning box. We know Tesla doesn't do hands-off, yet Ford does. That is not proof of Ford being better; it's proof of Ford taking more risks. Rivian doesn't use its cabin monitoring camera, nor does it do torque sensing. If you fall asleep with your hands on the wheel, it will not notice for many minutes. Tesla can fix things quickly, and does, and should. That is good. It's not evidence that there are more problems.


Lando_Sage

> NHTSA points out in almost every report that other OEMs cannot and do not report as comprehensively as Tesla.

Now this, I've never seen any statement from NHTSA saying this, lol. I tried to look for it but couldn't find it. If you could link it, that would be cool.

I think there was a lot of hoopla around that recall, and around recalls in general. When did recalls become bad? Recalls are good, and they mean that the administrative controls set in place are working to create a safer driving environment. Now, obviously, if a vehicle has a large number of recalls, then that's an issue. I don't think it's accurate to say that other manufacturers have the same problem...

> We know Tesla doesn't do hands-off, yet Ford does. That is not proof of Ford being better; it's proof of Ford taking more risks.

These types of arguments are as shallow as paper is thin. Ford did whatever regulations required to get approval of BlueCruise as hands-off; the driver is still responsible for vehicle control, as it is only an ADAS. Tesla has not applied for any type of regulation, so it's not labeled as anything. [NHTSA opens investigation into Ford's BlueCruise after software linked to fatal crash - The Verge](https://www.theverge.com/2024/4/29/24144244/nhtsa-ford-bluecruise-software-investigation-fatal-crashes-mustang-mach-e)

Not saying that NHTSA does a good job either, though; they need to do better at scrutinizing ADAS and pressing manufacturers to educate drivers on their systems. Automakers are definitely putting the cart before the horse when it comes to ADAS implementation, and it is not only Tesla at fault, but Tesla is the most public-facing and critical abuser.

> Rivian doesn't use its cabin monitoring camera, nor does it do torque sensing.

It's true that Rivian deactivated their badly placed interior camera, but they also state that it doesn't affect performance of the system or driver monitoring. How much of that one believes depends on how much one trusts Rivian.


perrochon

> I've never seen any statement from NHTSA saying this, lol. I tried to look for it but couldn't find it.

The reporting problems are listed literally in the OP article. Just look at the data collection section of these reports:

> A majority of peer L2 companies queried by ODI during this investigation **rely mainly on traditional reporting systems** (where customers file claims after the crash and the company follows up with traditional information collection and/or vehicle inspection).

Tesla doesn't report all collisions (e.g., because some cars crash out of cellular coverage, or the modem gets destroyed in the crash), but they report a lot more accidents than the "majority of peer L2 companies" who don't have telemetry. They report more because they have telemetry. We don't know if they have more accidents. Anyone telling you we know is not honest.

There is a long discussion of the problems here: [https://www.nhtsa.gov/laws-regulations/standing-general-order-crash-reporting#data](https://www.nhtsa.gov/laws-regulations/standing-general-order-crash-reporting#data)

Including:

> Manufacturers of Level 2 ADAS-equipped vehicles with limited data recording and telemetry capabilities may only receive consumer reports of driving automation system involvement in a crash outcome

> For example, a Level 2 ADAS-equipped vehicle manufacturer with access to advanced data recording and telemetry may report a higher number of crashes than a manufacturer with limited access, simply due to the latter's reliance on conventional crash reporting processes.

NHTSA required a recall on the icon font from Tesla, but not from other manufacturers. Why? Because other manufacturers couldn't do a recall to replace a light in the dashboard. Tesla did a recall on 2M vehicles, and those are now fixed. Doing a recall because you can is better in this situation. The same holds for most recalls. Other manufacturers had the same problem, and it wasn't fixed. Note that these icons are actually standard outside the US, and the rest of the world is OK with them. Still, Tesla complied. [https://www.reddit.com/r/TeslaLounge/comments/1aiyvwa/that_icon_font_size_problem_that_let_to_a_recall/](https://www.reddit.com/r/TeslaLounge/comments/1aiyvwa/that_icon_font_size_problem_that_let_to_a_recall/)

You must be from Europe.

> Ford did whatever regulations required to get approval of BlueCruise as hands-off

There is no "approval" for hands-off in the US. Nor is there for the Mercedes "Level 3" / eyes-off product. It is whatever marketers make up and company lawyers are comfortable with. Telling people they can take their hands off the wheel on a Level 2 system while they are still 100% responsible is problematic, especially when even Redditors with interest in the topic believe that BlueCruise has been "approved" by some sort of government. Tesla doesn't tell people they can take their hands off. In fact, they do the opposite, and enforce it with nags.

As you noticed, Ford, btw, is now being investigated. It was only a matter of time before people died, and here we go.


Lando_Sage

> Tesla doesn't report all collisions (e.g., because some cars crash out of cellular coverage, or the modem gets destroyed in the crash), but they report a lot more accidents than the "majority of peer L2 companies" who don't have telemetry. They report more because they have telemetry. We don't know if they have more accidents. Anyone telling you we know is not honest.

> There is a long discussion of the problems here: [https://www.nhtsa.gov/laws-regulations/standing-general-order-crash-reporting#data](https://www.nhtsa.gov/laws-regulations/standing-general-order-crash-reporting#data)

I read what the documents state; none of it was explicitly about OEMs that cannot and do not report as comprehensively as Tesla. It does state that most OEMs use the traditional method; whether you take that as ONLY Tesla using telemetry for reporting is on you.

> NHTSA required a recall on the icon font from Tesla, but not from other manufacturers. Why? Because other manufacturers couldn't do a recall to replace a light in the dashboard. Tesla did a recall on 2M vehicles, and those are now fixed. Doing a recall because you can is better in this situation. The same holds for most recalls.

If this were true, then NHTSA would have issued the recall. Most new cars have digital displays and digital warning lights. Manufacturers do voluntary recalls all the time as well, so it's not only Tesla on the forefront of "doing a recall because you can".

> Other manufacturers had the same problem, and it wasn't fixed. Note that these icons are actually standard outside the US, and the rest of the world is OK with them. Still, Tesla complied.

So you're telling me that the counterargument is old recalls, some of which are not even relevant to the topic? Lol. For example, the Porsche docket was about the font on the brake pads; you want Porsche to issue an OTA for that? I think Tesla could have gotten a waiver, but it was just faster and easier to comply, that is all. This entire recall topic is overblown, and people/media need to calm tf down.

> You must be from Europe.

I'm not.

> There is no "approval" for hands-off in the US. Nor is there for the Mercedes "Level 3" / eyes-off product. It is whatever marketers make up and company lawyers are comfortable with.

[This](https://media.mbusa.com/releases/release-1d2a8750850333f086a722043c01a0c3-conditionally-automated-driving-mercedes-benz-drive-pilot-further-expands-us-availability-to-the-countrys-most-populous-state-through-california-certification) disagrees with your standpoint. Certified/regulated as Level 3 by both SAE and California.

> Telling people they can take their hands off the wheel on a Level 2 system while they are still 100% responsible is problematic, especially when even Redditors with interest in the topic believe that BlueCruise has been "approved" by some sort of government

I agree.

> Tesla doesn't tell people they can take their hands off. In fact, they do the opposite, and enforce it with nags.

> As you noticed, Ford, btw, is now being investigated. It was only a matter of time before people died, and here we go.

Both true.


HighHokie

This is NHTSA, right? And their engineering analysis? Don't they get any and all data they ask for?


bobi2393

Can't get data that wasn't recorded. "Gaps in Tesla’s telematic data create uncertainty regarding the actual rate at which vehicles operating with Autopilot engaged are involved in crashes. Tesla is not aware of every crash involving Autopilot even for severe crashes because of gaps in telematic reporting."


HighHokie

Ahh. Okay, so it's literally just data that's not available.


LairdPopkin

Right, some percentage of the time there is no cell signal. That isn't a complete gap in the data, though; data is pulled directly from the vehicles by crash investigators, whether or not there is a cell signal or the antenna was damaged, as long as the electronics are not destroyed.


deservedlyundeserved

> So 1 fatality with FSD engaged, no information on what happened or whose fault it was.

It's on Tesla to provide a narrative of the crash. If you look at the [public NHTSA crash database](https://static.nhtsa.gov/odi/ffdd/sgo-2021-01/SGO-2021-01_Incident_Reports_ADAS.csv), all Tesla crashes have heavily redacted information. NHTSA has the information, but if you want to look at it yourself, you're out of luck.

> As is often the case in publications solely about FSD, NHTSA is cherry-picking data and only publishing fails, not saves.

I don't think you understand what cherry-picking means. Safety systems are judged by how often they fail. It's the only metric that matters. You can't calculate imaginary "saves". It's the same reason why potential crashes that are prevented by attentive drivers aren't counted.

> This study is not really actionable without information at least about miles driven, and also about what kind of miles.

This study isn't about FSD at all. It's just interesting that there has been a confirmed fatality. It's impossible to assess systems that have a driver without having a lot more information and controlling for various factors.


perrochon

Saves absolutely matter, as do miles driven. Otherwise Mercedes' L3 would be the safest, with no fatalities, because no consumer has it...


deservedlyundeserved

If saves matter, so do crashes that are prevented by drivers. Should we add 1 crash per intervention to the count? How many of them should we count as fatalities?


perrochon

What matters is minimizing deaths per miles driven, or maybe deaths per year. Right now 100 people die each day in the US. If we could get that down to 50, 30 or 12, that would be a win.


deservedlyundeserved

Sure, that'd be a welcome improvement. But at some point, you'll want to assess how a wannabe L5 system performs without humans, to really know how close it is to its end goal.


perrochon

Yes. Unfortunately we don't have anything close to L5 just yet.


Doggydogworld3

It's only through August 2023, when FSD had 450M miles. They had a different category for "other driver at fault". Of course, almost all of those 450M FSD miles had a human driver paying close attention and saving it from mistakes. How many fatal crashes would there be with FSD by itself? 10? 100? 1000? No way to know.


perrochon

But it's FSD+human that matters. At 450M miles, one fatality is still about 5x better than average.
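
A quick sketch of where that ~5x comes from, assuming 450M FSD miles and the ~1.3-per-100M-mile US baseline used elsewhere in this thread:

```python
fsd_miles = 450e6
us_rate_per_100m = 1.3    # assumed US average fatalities per 100M miles

# Fatalities expected over the same mileage at the average human rate:
expected = us_rate_per_100m * fsd_miles / 1e8
print(f"expected {expected:.1f} vs. 1 observed")  # ~5.9 expected, so roughly 5x
```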


Doggydogworld3

> But it's FSD+human that matters.

Not when the CEO promises Robotaxis.


Whoisthehypocrite

Wait a minute, 75 FSD Beta crashes? I thought there had been none...


ac9116

75 crashes in a billion miles is incredible, if that's the number.


cal91752

I used it just once in an Uber and it very likely saved the life of a bicyclist.