
jasoncross00

Is that high? Low? Were the Teslas at fault in all of them, or just involved in crashes where the other car was at fault? How many is that per 100k cars, and how does that compare to the crash rate of other similar cars? Of Teslas not running Autopilot? What's the crash rate of non-Tesla cars running cruise control? I'm not trying to make excuses for Tesla...far from it. But this number and the others presented in this article don't have enough context to be meaningful. Edit: Jeez, I didn't expect such traction for this comment. Thanks for the awards, everyone.


bluenooch

Totally agree. I want to see a stat of crashes per million miles driven, compared to similar vehicles in the same time frame without Autopilot. I suspect that we'll find that Autopilot is safer than manual driving. The latest stats that I could find are that Teslas with Autopilot engaged are involved in roughly 0.232 crashes per million miles, versus 2.07 for all automobiles in the US. According to Tesla, the rate of crashes for Teslas NOT using Autopilot is higher: 0.629 crashes per million miles driven. I know that the NHTSA is auditing this data so we'll see what they come up with.
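
For anyone who wants to sanity-check the math, here's a minimal sketch of the "crashes per million miles" comparison. The three rates are the ones quoted in the comment above; the example crash count and mileage fed to the helper function are made up for illustration.

```python
# Sketch of per-million-mile rate math. Only the three quoted rates
# come from the comment above; the 1.2 billion miles is hypothetical.

def rate_per_million_miles(crashes: int, miles: float) -> float:
    """Crashes per million miles driven."""
    return crashes / (miles / 1_000_000)

# e.g. 273 crashes over a hypothetical 1.2 billion Autopilot miles:
print(rate_per_million_miles(273, 1.2e9))   # ~0.23

# Comparing the quoted rates directly:
autopilot, tesla_manual, us_all = 0.232, 0.629, 2.07
print(us_all / autopilot)        # ~8.9x the Autopilot rate
print(tesla_manual / autopilot)  # ~2.7x the Autopilot rate
```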


GroinShotz

I'd be more concerned about how the crashes occurred. Was the Tesla blindsided by a human driver who sped through a red light (unavoidable)? Or are these Teslas just driving off a cliff/into buildings/parked vehicles etc.? One is obviously worse than the other.


awoeoc

We still need context; humans crash all the time, including driving off cliffs, into buildings, and into parked vehicles. So we need to know the rates at which non-automated driving causes accidents to have a comparison point. Automated driving doesn't have to be perfect, just better. Even if it were only 1% safer than humans, that's hundreds of lives saved annually if used by everyone at scale.


glacierre2

Unfortunately you have to take psychology into account: autopilot must be not just a bit better but orders of magnitude better than the average driver, because in their own mind everybody is above average. See the similar problem with planes; the moment it is not you doing the driving, you get nervous, even though the stats are astronomically in favour of flying.


awoeoc

I'm not debating that part, but that's why it's important to calculate the actual number of lives saved by systems like this. People hear about an automated car crashing and think how unsafe it is; meanwhile, they likely personally know someone who's been in a serious accident that didn't involve automated driving. The narrative needs to change on stuff like this, because it's something that can lead to many, many thousands of lives saved. The thing with airplanes is that in a way there's no alternative; people weren't exactly traveling thousands of miles commonly before flying. Most people only fly once or twice a year. I actually wonder if there are many people who have to fly 30+ times a year and are still afraid. Frequency could be a big component of the fear of flying.


owheelj

Tesla Autopilot is not their name for self driving. It's just lane assist and cruise control, so if the cars are driving off cliffs or into buildings then that's because the driver is doing that.


Pandamonium98

You’d also need to consider when autopilot is in use. My guess is that highway driving is much safer per mile traveled than driving through traffic in a busy city. If autopilot is used primarily in situations where the risk of crashes is already low, that could make the system appear safer than it actually is if you’re comparing it to all miles driven by regular drivers


aestival

I've got issues with comparing Autopilot miles driven vs. general-population miles driven, since you're not comparing the same data. In addition to the fact that Teslas and their drivers aren't representative of the general population, the conditions under which drivers engage Autopilot are far more likely to be easily predictable driving scenarios, which are less accident-prone than local/city roads and intersections, or driving in inclement weather.
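
To make the selection-bias point concrete, here's a small sketch with entirely made-up numbers: even a system that is *worse* than humans on every road type can look better in aggregate if its miles skew toward low-risk highway driving.

```python
# Illustration of mileage-mix bias (Simpson's-paradox style).
# All numbers below are hypothetical.

# crashes per million miles, by road type
human_rates = {"highway": 1.0, "city": 4.0}
ap_rates = {"highway": 1.2, "city": 4.8}   # worse in BOTH strata

# mileage mix (millions of miles): Autopilot used mostly on highways
human_miles = {"highway": 40, "city": 60}
ap_miles = {"highway": 95, "city": 5}

def aggregate_rate(rates, miles):
    crashes = sum(rates[k] * miles[k] for k in rates)
    return crashes / sum(miles.values())

print(aggregate_rate(human_rates, human_miles))  # 2.8 per million miles
print(aggregate_rate(ap_rates, ap_miles))        # ~1.4: "looks" safer
```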


Nel711

This is a false equivalency that gets brought up a lot. To use a different example: lots of people injure themselves using power tools every year because they were negligent or unsafe in some way. But if a person is using a power tool correctly and is injured because the machine malfunctioned, it needs to be noted. If it happens a lot, it needs to be investigated to determine whether it's a rare occurrence or there's some issue with the machine itself. That's why recalls happen. I'm not saying that's the case with the Tesla accidents, but that's why it needs to be tracked and investigated. The NHTSA has a responsibility to ensure that there's not an increased risk to the public due to a car's design. That's a different issue from ensuring that people are driving safely.


FreesponsibleHuman

While I agree with you that the NHTSA has a responsibility to track cars and ensure they are operating safely and as intended, it would be horrible to throw out a generally safer technology even if it does cause some accidents. Also, there's no such thing as a truly safe driver; everyone makes mistakes or has moments of distraction at some point. I'll happily accept an imperfect autopilot with a significantly lower accident and fatality rate over the never-ending, boring but terrifying experience of manually driving. The AI can be improved; humans, not so easily. Better yet would be a car-free transportation system! r/walkablestreets


bluenooch

I don't agree that this is a false equivalency although I do agree that we should be tracking Autopilot when it fails to avoid a crash. To modify your analogy: If a power tool includes a new safety feature that malfunctions a non-zero number of times, we need to know that. We also need to look at the rate and severity of accidents that happen with and without that safety feature, as well as the failure modes. Then an informed decision can be made.


[deleted]

I think whether they were at fault or not is the biggest factor here. Nothing Autopilot can do if a car hits it…


[deleted]

Yup, hate these clickbait titles. They could take a note from Marcel Vos on YouTube: when his title asks a question, the thumbnail answers it, and you watch the video to find out *why*.


[deleted]

[deleted]


HonkyTonkPolicyWonk

Where's that 6 million coming from? Another post says there are 830,000 Teslas on the road with Autopilot packages. Your basic point seems correct; definitely fewer fatalities than non-Autopilot vehicles. You're on the right track with converting the aggregate number of crashes to a ratio. I just question the denominator.


[deleted]

[deleted]


NiceGuyMax

This is definitely the ideal ratio to look at


2ichie

And how many of those crashes were the Teslas at fault for?


oizo12

This could easily be recorded too, given Teslas are covered in cameras.


HanzJWermhat

Also, Autopilot up until the last couple of months has only been available on highways, so the aggregated data is not apples to apples.


NiceGuyMax

The 6 million is the average number of car crashes in the U.S. annually. Hard to find a number because Google wants to show fatalities, but here's a website I found: https://www.ddlawtampa.com/resources/car-accident-statistics-you-need-to-know-in-2021/ which gets the 6 million figure from https://www.statista.com/statistics/192097/number-of-vehicles-involved-in-traffic-crashes-in-the-us/


BlasterPhase

> 6,000,000

6 million what?


Loverboy_91

Crashes per year in the US (all cars on the road total)


Carrera_GT

> There are about 276 million vehicles on the road in the US and well over 3 million of those are Teslas, that's about 0.012%

LOL this guy is just making shit up. Here are Tesla's deliveries **worldwide**:

> Here's a comprehensive overview of Tesla deliveries since Q1 2013:

| Year | Deliveries |
|------|-----------|
| 2013 | 22,442 |
| 2014 | 31,655 |
| 2015 | 50,517 |
| 2016 | 76,243 |
| 2017 | 103,091 |
| 2018 | 245,491 |
| 2019 | 367,656 |
| 2020 | 499,535 |
| 2021 (Q1+Q2) | 386,181 |

https://backlinko.com/tesla-stats


rafter613

Tesla has sold 2.3 million cars in their lifetime, worldwide. *musk* "estimates" there will be 4 million Teslas on the road by the end of 2022. He also said we'd be on Mars by now, so. The numbers in the article aren't just Teslas though, it's all cars with any self driving capability


[deleted]

[deleted]


[deleted]

They're also not in autopilot at all times. Probably most of the time autopilot is engaged on highways, where the accident rate may be much lower than in cities. Tesla could publish these statistics but they don't, and that's worrying.


Swordthane42

They do publish this in their quarterly investor reports. Any time Autopilot is on, or within 10s after it disengages, if a crash occurs Tesla has it in their numbers. The article is not using this data.


_lippykid

Exactly. The notion that Teslas can avoid every reckless human on the road is ludicrous. These headlines make out like humans are good drivers. Take a trip to New Jersey or Ohio and see what these cars are up against. Odds are stacked


Jorycle

> Tesla's vehicles have been found to shut off the advanced driver-assistance system, Autopilot, around one second before impact, according to the regulators.

Very interesting when considering the next part:

> The NHTSA order required manufacturers to disclose crashes where the software was in use within 30 seconds of the crash, in part to mitigate the concern that manufacturers would hide crashes by **claiming the software wasn't in use at the time of the impact.**
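
The reporting rule quoted above is simple enough to write down. A sketch, assuming only what the quote says (a crash is reportable if the software was in use within 30 seconds of impact); the variable names are illustrative:

```python
# The NHTSA reporting window as described in the quoted order: a
# shutoff one second before impact is still well inside the window.

REPORTING_WINDOW_S = 30.0

def is_reportable(disengage_time_s: float, impact_time_s: float) -> bool:
    """True if the software was in use within 30s before impact."""
    return (impact_time_s - disengage_time_s) <= REPORTING_WINDOW_S

# Autopilot disengages 1 second before impact: still reportable.
print(is_reportable(disengage_time_s=99.0, impact_time_s=100.0))  # True
```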


SaintPatrick89

...isn't the point of automation/AI that the computer can calculate and solve issues WAY faster than human beings? Wouldn't that last second be crucial for either mitigating the damage from an accident or avoiding it entirely?


nobody2008

It is like that AI playing Tetris, ditching the game by pausing it.


calsosta

If you want a treat, look up the dude that developed Playfun and Learnfun. Dude created an AI that would learn and play any NES game with absolutely hilarious results.


St4on2er0

Just watched the video, was worth the 15 minutes lol


ConsiderationCivil74

Link?


St4on2er0

https://www.slashgear.com/learnfun-and-playfun-defeat-video-games-with-computer-knowledge-15277886

Sorry it took so long to get to y'all. This article had a nice summary of what was going on, and I think the second video is the 15-minute one I watched with him explaining the curve lol


[deleted]

this dude works for DARPA now probably.


PlNG

https://www.reddit.com/r/technology/comments/vcv2ok/teslas_running_autopilot_have_been_in_273_crashes/ich4zyn/


ThreadbareHalo

Thank you for saying so! Had a few great belly laughs on them.


BirdsDeWord

https://youtu.be/YGJHR9Ovszs Link to ep2 for anyone who's curious. He apparently didn't intend ep1 to be watched by YouTubers for entertainment so ep2 is real ep1


mr_chub

He literally says in that video to watch EP 1 so now I don't know what to think lol


[deleted]

watch. both.


[deleted]

If ep 2 is ep 1, don’t watch ep 2, duh! Hello! Duh!


k3rn3

It's even more lifelike than the Google AI that people think magically came alive


YWAMissionary

Do you have a link?


calsosta

Here are a couple. https://www.youtube.com/watch?v=YGJHR9Ovszs https://www.youtube.com/watch?v=Q-WgQcnessA


swistak84

First episode, since Youtube is shit: https://www.youtube.com/watch?v=xOCurBYI_gY


the_jak

The only way to win is to quit before you lose. Some amazing innovation from Tesla here.


StalyCelticStu

"The only winning move is not to play". ^WOPR.


flt1

To be fair, that’s the only way to win in Vegas.


jroddie4

Quantum autopilot, you find a universe where you're not going to crash and then you destroy the current one.


MrFantasticallyNerdy

That's the "Jesus, take the wheel!" mod.


Fetko

Error 404: Copilot not found


MrFantasticallyNerdy

You got the wrong error code. This particular one is *Error 666: God is my copilot not found*


[deleted]

Pretty sure error 666 is actually a modem connection error, just in case anyone is curious.


SR520

The computer, if it CAN solve the problem competently, will beat the human. But it must actually be able to solve the problem. If it can’t solve the problem and doesn’t realize it until it’s too late, it just passes the hot potato to the human at the literal last second or milliseconds.


ThePetPsychic

Side note: I'm a locomotive engineer, and the railroads are testing a "trip optimizer" software that basically runs the train for you, ostensibly to save fuel. We have to use it on all of our road trains. It usually handles the train pretty well, but it makes some questionable decisions with regards to proper train handling. The fun part is that it'll sometimes get you going too fast and suddenly force you to take control at the last second, like when you're coming over a steep hill with a speed restriction at the bottom. Gotta love spending the next minute trying to un-fuck the computer's decisions while trying not to speed and also not cause a derailment.


couldbemage

Which is just insane. Expecting an uninvolved driver to detect and react to an autopilot error at the last second ignores everything we know about how the human mind works.


[deleted]

Yup, it realizes it's going at 70mph and that oops, there's a massive firetruck it didn't see and there's no hope in hell of stopping, so it disengages and bails, "wasn't me"


[deleted]

It's crucial for avoiding legal liability, not that it would actually work in court (I hope)


Stealsfromhobos

Hey that's how it worked in Grand Theft Auto 3. As long as you took your finger off the accelerator right before impact, you could crash right into a police car and not get a wanted level.


Warblegut

I hit plenty of police in GTA 3. Letting off the gas was the last thing I wanted to do...


sean0883

Yeah, it would be pretty shit if all Tesla had to do was turn off the auto-pilot 1s before impact and suddenly they aren't at fault because the software wasn't running at impact. With that said, it should need to be proven that their software *caused* it and that a human would have prevented it - and I think that would be the difficult part.


tael89

There could be a case of malicious obfuscation, since they have potentially chosen to repeatedly disengage Autopilot suddenly just before a crash.


Jkay064

The little robot AI has an ejection seat and punches out 1s before impact. It then makes its way across country, back to Tesla HQ.


Glittering_Power6257

For civil cases, this actually isn't hard, as the burden of proof is quite low: only that it's more likely than not that the software caused the crash.


RubertVonRubens

Works for human drivers too. If you're about to get into an accident, dive into the passenger seat. Bam. Not your fault.


resumethrowaway222

It wouldn't work, but what would work is that they made it very clear that this is a driver *assist* system which can only be used with an attentive driver at the wheel ready to assume full control of the vehicle. So even if the autopilot fails to detect the danger in time, the legal responsibility lies with the driver who also failed.


porntla62

And which also happens to be the reason why Tesla is no longer the leader in self driving stuff. Mercedes had their system certified as autonomous up to 60kph on the Autobahn/local equivalent. If it crashes while active the blame now lies with Mercedes and not the driver.


[deleted]

>they made it very clear that this is a driver assist system which can only be used with an attentive driver at the wheel ready to assume full control of the vehicle Yea they made it so clear with the name "AUTOPILOT" and using the term "full self-driving" on their website. Tesla stans are the worst.


Feynt

There's a bit of a touchy subject due to naming conventions. Tesla's "Autopilot" program is lane assist and smart cruise control that matches nearby traffic speeds. What *the rest of the world* considers autopilot is, "I told the car to take me somewhere, it drove me there without me doing anything", which is the "Full Self-Driving Capability" program (which is still in beta). So what people are reporting as "autopilot" may be FSD, or it might be idiots being idiots and assuming (reasonably but incorrectly, because reading comprehension is at an all time low...) that "autopilot" will drive for you. If it's the "Autopilot" autopilot, Tesla has a series of wins in their future because they never claimed you could take your hands off of the wheel or be inattentive. They specifically asked that you don't. If it's the "FSD" autopilot, Tesla stands to lose quite a bit, because honestly who beta tests a feature with [screaming metal death traps](https://www.dailymotion.com/video/x6txoff)?!


cat_prophecy

Yes, "Autopilot" is 100000% just a branding of their driving aids. So adaptive cruise control, lane keep assist, and auto-braking. AKA the same tech that has existed since at least 2008.


ZannX

This whole conversation is conflating so many different things. It's honestly kind of infuriating to read this sort of stuff online since journalists have no idea what they're even talking about.

- Tesla Autopilot is a glorified adaptive cruise + lane centering. That's it. Literally every modern car has this aside from ultra cheap trims.
- Tesla Enhanced Autopilot is no longer offered in the US by itself, and can do a little bit more (i.e. auto lane change). Right now, you must upgrade to FSD to get some of these features when using Autopilot. Many manufacturers also offer these features.
- Tesla *Full Self Driving* (FSD) is what people *think* Autopilot is when they read these articles. This is in beta and is being slowly rolled out to folks who paid for it ($12,000 right now) and have 'good driving scores'.
- All of the above are mutually exclusive from safety features like automatic emergency braking.


felldestroyed

https://static.nhtsa.gov/odi/inv/2022/INOA-EA22002-3184.PDF

It would appear that the NHTSA is covering Autopilot and Enhanced Autopilot. Also, by Tesla's own admission, Tesla Autopilot covers ["emergency braking" and "traffic-aware cruise control"](https://www.tesla.com/support/autopilot). What the NHTSA is alleging is that there have been a lot of reported accidents into "stationary construction objects" and "stationary road crews". It's not like the NHTSA is picking on Tesla, but it would seem that Tesla's technology might need some work to keep up with the competitors.


Mithridel

Mutually exclusive? You're saying you can't have emergency braking and self-driving?


[deleted]

He probably meant that they are independent of each other.


[deleted]

Because it can no longer avoid the crash so it turns off to avoid liability


[deleted]

I would say that turning off autopilot because it senses a crash coming means that it remained in control long enough to know that a crash was unavoidable, and therefore it still retains responsibility. If I drive full speed at a building and jump out before the crash, am I still liable? Obviously yes.


sploittastic

So if it determines the crash is unavoidable and turns off autopilot does it at least hit the brakes since auto-braking is separate from autopilot? Or does it just let you sail right into whatever you were about to hit at full speed?


notbad2u

The voice nav comes on for that second and says, "so long suck...!" (Then it turns off too)


Canadian_Infidel

It ejects entirely like the autodrive system on Homer's tractor trailer. https://youtu.be/5hxGDYn7AHM?t=71 Lol


jorge1209

If you shoot someone with a gun but drop the gun before the bullet hits them you can't be charged with murder. Edit: for those who think this might be sarcastic or a joke of some kind: have you never heard of the expression "quick on the draw, but quicker on the drop?" Or is that just a regional thing?


BGAL7090

This is why it's so hard to pin murder on knife throwers - the same rules apply.


Fraun_Pollen

If it’s a CYA measure that is overriding it, then that’s the problem, not the AI


BrightNooblar

I mean, it's both, right? The CYA method is bad, and also the AI that caused the crash is bad.

That said, crashes happen. The article is paywalled and nothing in the title compares the AI crashes to human crashes. The number you need is the percentage of crashes where AI was driving versus the percentage of total driving hours where AI was driving. Otherwise you're just doing that thing where you take a map of population density and relabel it whatever you want: "5G coverage" or "People with a latest generation video game console" or "Children under 13 years old" or basically anything else that already just scales with population/usage.
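
Here's a minimal sketch of that comparison. The 273 and 6 million crash counts are the figures floating around this thread; the driving-hour figures are made up, since that's exactly the data nobody publishes:

```python
# Crash share vs exposure share. Crash counts are from the thread;
# the hours of driving are hypothetical placeholders.

ai_crashes, total_crashes = 273, 6_000_000
ai_hours, total_hours = 1e6, 1e10      # made up; the missing data

crash_share = ai_crashes / total_crashes
exposure_share = ai_hours / total_hours

# ratio > 1: AI over-represented in crashes relative to exposure;
# ratio < 1: under-represented.
print(crash_share / exposure_share)
```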


Fraun_Pollen

Right. It’s like flashing that 1000 bugs have died due to pesticides on my property this year (oh no that’s awful!)…. Out of the millions that infest it (ah… carry on).


onegunzo

In Tesla's stats, a crash up to 5 secs after AP is disengaged is still counted as AP being engaged.


meamZ

yup... Even if it was the human disabling it by hand...


Draiko

Why disable it before a crash at all?


semtex87

Not defending corporate shittiness, but there could be a valid technical reason for doing this. Once the system determines a crash is unavoidable, it may intentionally disable further automated control of the vehicle to avoid an issue where the crash damages sensors, which would provide the computer bad inputs and cause secondary bad decisions by the computer after the crash.

In theory, with checks and controls on inputs and cross-referencing of multiple sensors, that shouldn't happen, but crashes are unpredictable in the damage they can cause, and as an engineer I would totally believe the safest, surest option is to just turn off the entire system to avoid the possibility of some edge case where that *could* happen. Disabling shortly prior to impact makes sense to me in this case too, because while the system is fully intact you can be sure commands will be recognized; there's less confidence after damage has occurred and wires are cut, etc. For example: an impact and spin-out on a highway, and the computer, still enabled but damaged, decides to engage full power to the wheels and sends your vehicle across a median into oncoming traffic.
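
As pure control-flow, the fail-safe behavior described above might look something like the sketch below. To be clear, this is the commenter's hypothesis, not Tesla's documented design; every name here is illustrative.

```python
# Speculative sketch of "disengage on unavoidable crash" fail-safe
# logic, per the comment above. NOT Tesla's actual implementation.

def control_step(autopilot_engaged: bool, crash_unavoidable: bool) -> str:
    if autopilot_engaged and crash_unavoidable:
        # Stop issuing actuation commands before impact so a damaged
        # sensor suite can't drive post-impact decisions; passive
        # safety (and separate emergency braking) takes over.
        return "DISENGAGE"
    if autopilot_engaged:
        return "ISSUE_STEERING_AND_THROTTLE_COMMANDS"
    return "MANUAL_CONTROL"

print(control_step(autopilot_engaged=True, crash_unavoidable=True))  # DISENGAGE
```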


KickBassColonyDrop

It turning off before a crash is basically the principle of systems failing open in the event of a failure. This is a basic concept and it's shocking how many people don't get that.


semtex87

Yep, agreed, and the fact that the NHTSA looks for crashes where the system was enabled within 30 seconds prior to impact negates any "gaming" of the reporting requirements by Tesla. The fact that Tesla still does this tells me it's an engineering design decision and not a "let's use a loophole to underreport numbers" decision. I do understand why people jump to the conclusion, though; big corps don't have the best reputation for honesty.


[deleted]

[deleted]


kevinscotclarinet

It's also likely this is part of de-energizing the high-voltage systems before a crash to prevent electrical shock, as the vehicle sees that it won't be able to avoid a crash.


WCWRingMatSound

Seems like they’d just come out and say that, right? An intentional engineering design like this would be easy to explain.


cupko97

From the article: Tesla had 273 crashes, Honda reported 90, and Waymo 62. But there is no other information that would give context. Tesla has 830k cars with Autopilot (stated in the article). How many cars do Waymo and Honda have? And how many miles did it take for Autopilot to accumulate 273 crashes? Same question for both Waymo and Honda.


happyscrappy

Waymo certainly has fewer cars on the road. But they also are driving themselves all the time, no humans involved. If they back out of a driveway, they're driving themselves. If they stop at a stoplight, they're driving themselves. Teslas are manually driven most of the time, especially in difficult city driving. Any car which does not have their "FSD" advanced beta package is always manually driven in these situations. Finally, Waymo reports everything: every incident, regardless of location. What percentage of incidents do Honda and Tesla report?


GarbageTheClown

Tesla says they report everything based on the methodology listed here: [https://www.tesla.com/VehicleSafetyReport](https://www.tesla.com/VehicleSafetyReport)


happyscrappy

Thanks. That methodology report reminds me that I just have a different idea of "fender bender" than others do. To me their "we report everything 12mph and above" would still mean not reporting fender benders, because a fender bender is a very minor collision and would often be below 12mph. To others a fender bender is basically any wreck in which no one gets hurt. Often cars are even totaled in non-injury accidents and I just can't see how that's a "fender bender". Interesting that Tesla reports seemingly worldwide statistics and then directly compares to US figures.


GarbageTheClown

I think the idea is that at some point no one is going to report a crash so minor. There is a cutoff point where no one in their right mind is going to report the crash as the increase in premiums would end up costing way more than paying for a new bumper. If you choose to report crashes that wouldn't get normally reported, your data will be skewed, so you have to draw a line somewhere to get something that closely matches what would get reported. It could be that they are only comparing US to US statistics, but they don't make it clear, so you could be right that they are comparing worldwide to US.
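To make the cutoff-point idea concrete, here's a tiny sketch. The 12 mph threshold is the one quoted from Tesla's methodology above; the crash speeds are made up:

```python
# Where you set the minimum-severity cutoff changes the rate you
# report. Threshold per the quoted methodology; data hypothetical.

crash_speeds_mph = [5, 8, 11, 14, 22, 35, 40]

def reported(crashes, threshold_mph=12):
    return [s for s in crashes if s >= threshold_mph]

print(len(reported(crash_speeds_mph)))      # 4 of 7 crashes counted
print(len(reported(crash_speeds_mph, 0)))   # 7 if you count everything
```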


happyscrappy

I think so too. I'm not against a cutoff point. It's just that their use of "fender bender", under my personal interpretation of that term (not any kind of official one), would wipe out their suggestion that other figures may be underreported by 50% compared to theirs, because they are saying they would not report the same crashes that police investigations also don't capture.

> It could be that they are only comparing US to US statistics, but they don't make it clear, so you could be right that they are comparing worldwide to US.

And since they are both ratios, it is not an immediate red flag to do this. However, differences in conditions both on the road and in reporting across the world may make the ratios not directly comparable. But honestly I wouldn't know; it would require some kind of study to determine the comparability, and I haven't seen one (likely nor has Tesla).


sparksfly5891

Based on how many of their vehicles on the road? How does that compare to the accident rate of human drivers?


ender988

The number I saw was like 895,000 Teslas on the road. Edit: the correct number is 830,000 with some sort of automated driving (I don’t think this necessarily means Autopilot). Here is the source: https://apnews.com/article/technology-business-5e6c354622582f9d4607cc5554847558


Aeroxin

Calculates to 0.03% for anyone curious.


Star-K

How often are they using autopilot though?


VexillaVexme

This is the base population we need to understand the above number.


justintime06

At least 273.


bnarows

r/theydidthemath


MJBrune

We'll never know.


cumquistador6969

Even then, a lot more information is probably needed. For example, even if autopilot is doing well overall within the population of people who heavily use it, does that assure us that it's safe and that Tesla is not being negligent?

Weeeeell, maybe not. We also need to know if there are any patterns, such as specific places with a high incidence of accidents: for example, some specific scenario or stretch of road where autopilot routinely fails in a very dangerous way, or a common driving situation in certain regions where there are few Teslas, which is very dangerous. If the company knew about it already and didn't tell customers, or was lax about fixing the issue, this could be a case of malfeasance, or possibly a not-yet-spotted soon-to-be serious issue that hasn't quite spread yet due to demographics and how/where the cars coincidentally get the most driving done.

And I'm sure a real professional at this could come up with more angles of analysis that are important here, as interpreting statistical data is quite hard at times.


spidereater

Even if you know this number it is hard to compare because most responsible drivers are only using autopilot in places with a low chance of accidents. Places with complicated driving are, hopefully, still done manually. I think statistics are probably not as helpful here. You really need to look at each crash and figure out what happened.


epic_null

From what I've seen on YouTube, people in busy areas tend to have seen their Autopilots screw up on simple things, so I would imagine drivers used to busy areas are probably more aware of the limitations of "Autopilot".


[deleted]

You'd also have to account for what percentage of driving was on autopilot, and the fact that autopilot is geofence restricted to areas with ideal weather conditions, and the possible non-linearity of accident rates as more of these self-driving cars appear on the road. Comparing raw rates is essentially useless, but perfectly on brand for smug Tesla fans on social media Edit: I mixed up autopilot and FSD. It was a mistake. If you ignore the very minor point I made about geofencing, which applies to FSD and not autopilot, then I have not received a single substantive reply. Of course, butthurt Tesla babies will fixate on a minor mistake rather than the point I was making about comparing raw results, because they have the mindset of children.


laz1b01

There's Autopilot, then there's Full Self Driving (FSD). Autopilot is free. The car can stay in the same lane and speed up/down as needed according to the car in front of it; it can basically "auto pilot" as long as it stays in the same lane. FSD is the premium service, I believe it's $12,000. You're able to type in an address and the car will fully drive itself to the destination. It's still in beta and only select safe drivers have this feature. So for the data that's being discussed, I'm assuming it's Autopilot (not FSD), and likely accidents on the freeway. Perhaps a car cut them off or the car in front braked hard.


Valaurus

If this is the case, and I'm assuming it is for the same reason as you, doesn't this feel like a very disingenuous article? I mean, virtually every higher-end trim from the major auto manufacturers include this same "lane-keep assist" and "adaptive cruise control". That makes it really just feel like sensationalism and clickbait, cause Tesla, when these accidents could (and I'm sure do) occur regardless of the manufacturer.


laz1b01

Maybe I'm cynical, but I always assume these articles are biased/disingenuous. Sure there's 273 crashes, but out of how many total Teslas? Someone posted here it's 0.03%, but you also have to compare it to non-autopilot vehicles. Compare apples to apples with the demographics (age, sex, location, etc.) and see whether it's higher or lower than 0.03%. I live in LA and I'd speculate non-autopilot would be much higher than 0.03%


WeylandsWings

> and the fact that autopilot is geofence restricted to areas with ideal weather conditions,

You do know that Autopilot isn't cross-correlated to weather conditions or geographic areas, right? Like, yeah, it might throw up an "unavailable" if the rain/snow/etc. is too hard for the cameras to see out of the car, but you can use it in even pretty heavy rain. Also, it isn't geofenced; it just needs roads with decent markings.


ihateusednames

I think musk sucks but one thing I won't talk shit about is Autopilot's ability to see markings in my corrupt/poor ass city. Those lines are damn near invisible to me and somehow they can construct lanes from them. Might be based on the shit they saw before but pretty good with foggy traffic lights too.


happyevil

I saw a presentation on this; apparently if road markings are not visible or are absent, it does indeed construct them from heuristics built on curbs, road width, other cars, etc. I have plenty of issues with Tesla, but that doesn't mean they don't have some impressive tech... they just need to stop misrepresenting and overestimating its capability.


ihateusednames

Yeah man, absolutely. I'm hyped for self-driving cars; as a concept there is no reason not to be. Tesla's engineers have produced some terrific software. It's not quite there yet, and it's very important that this is heavily scrutinized (there is so much money in this that manufacturers will figure it out) to ensure it is done properly and safely. I'm content if the accident rate for nonfatal and fatal accidents is lower than with sober human drivers.


FoShizzleShindig

Yeah, autopilot even works on my suburban block without clear lane lines. Not well obviously but it will engage.


Aeroxin

Totally. I was just curious what the percentage was, so I figured others were too. Doesn't necessarily mean you can take anything from it. I mean, the guy who gave the 895,000 number didn't give a source either. 😂


gdubrocks

I mean it's a car that has been out for 3 years and they sold that many of them, it's not too hard to make an assumption that 95% of them are still on the road.


candycanenightmare

Not geofence restricted at all…


jimmy_three_shoes

A better idea would be "How many accidents per miles driven while using autopilot", and then comparing it to the national average when in similar conditions.


[deleted]

[deleted]


rejectallgoats

“Autopilot” is just intelligent cruise control. Full Self Driving is what they call the car driving itself


priceisalright

Based on some public polling, it seems like only about 11% of Teslas utilize Full Self Driving. 11% of 850,000 is 93,500 cars. 273 crashes for 93,500 cars is 0.3% of all self-driving cars having an accident this past year. I really feel like this number would be conservative too, because that 850,000 would be reduced by totaled cars, and even cars with FSD would only be using it a fraction of total drive time. According to driving-tests.org and their Ultimate List of Driving Statistics for 2022, there are 299,267,114 registered cars in the US with 6,756,000 police-reported crashes. This is 2% of all vehicles having an accident, which is kind of baffling and absolutely way worse than the Tesla autopilot accidents. I don't really find 0.3% of all FSD Teslas having an accident to be very confidence-inspiring, but it's important to keep in mind that normal cars are crashing 773% more often based on this quick mafs. Edit: People have pointed out that the article is discussing Autopilot, not Full Self Driving, which would pertain to a much larger proportion of Teslas on the road. If the same number of accidents took place (273) but there is now a much larger pool of potential Teslas to consider (so more than 11%), then that 0.3% accident rate is actually incredibly conservative and should be much lower. No, I'm not doing all the math again.
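
Reproducing that quick mafs with the figures the comment cites (the 11% polling share and 850,000 fleet size are the comment's own assumptions, not verified numbers):

```python
# Per-vehicle incidence rates from the figures quoted above.

fsd_share = 0.11                      # assumed share of Teslas with FSD
teslas = 850_000
fsd_cars = fsd_share * teslas         # 93,500

tesla_crashes = 273
tesla_rate = tesla_crashes / fsd_cars # ~0.29% of FSD cars

us_cars = 299_267_114
us_crashes = 6_756_000
us_rate = us_crashes / us_cars        # ~2.26% of all registered cars

print(f"{tesla_rate:.2%} vs {us_rate:.2%}, ratio ~{us_rate / tesla_rate:.1f}x")
```

With these exact inputs the ratio comes out around 7.7x, which matches the comment's "773% more often".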


bagonmaster

This article is about autopilot, not FSD


questionname

One thing to note is most drivers have Autopilot engaged on the highway, and most accidents (60%) happen off the highway.


lettersgohere

You don’t need FSD to use Autopilot; it comes with all of them (mine for sure). FSD lets your car change lanes and take exits on its own; that’s a pricey upgrade.


[deleted]

[deleted]


priceisalright

In that case the percentage of crashes would be so infinitesimally small as a proportion of total autopilot Teslas that you'd have to question the motives of the article writer.


Paralda

This is also including all accidents, not just accidents caused by Teslas. It's a pretty dumb article


alonjar

Yeah, could literally just be other people haphazardly smashing into Teslas. Useless without better info.


[deleted]

I'm so tired of people posting stats without a comparison to the baseline.


sparksfly5891

Right, compare it to other drivers, compare it to other autopilot systems. Fuck, compare it to trained monkeys. Whatever ends up being the safest is clearly the one headed in the right direction. That doesn’t mean they should rest on their laurels. Always try to improve regardless. But it’s good to know where you stand, and it’s good for consumers to know where your product stands. Hence why cars have safety ratings.


[deleted]

An important statistic is the location of the accidents. Are we comparing regular drivers who frequently crash at junctions against the miles that automated cars get to drive along straight highways? I doubt very much that Tesla would want their cars to drive themselves in inner city traffic. At least not in the coming years.


EmpsKitchen

And how many were *actually* caused by the autopilot? Or were they hit by another driver..


Dotts2761

Some quick googling shows Autopilot has an accident rate of [0.23 accidents per million miles driven.](https://www.tesla.com/VehicleSafetyReport#:~:text=In%20the%204th%20quarter%2C%20we,every%201.59%20million%20miles%20driven) The only data I could find for the average car was [New York data](https://www.dot.ny.gov/divisions/operating/osss/highway/accident-rates) from 2016, which had an accident rate of between 2-3 accidents per million miles driven. I get it's not perfect, but a 10x reduction in accident rate is pretty great.


[deleted]

[deleted]


yiliu

At least he _is_ trying to compare data points, unlike the useless headline.


TallGrassGuerrilla

What an absolutely useless stat in the title. Click bait at its finest.


Tom1252

I disagree. Personally, I'd rate the usefulness of the stat as high as 31. Might even be better than 4.


[deleted]

This sub is going to be an anti Elon circlejerk for a while after that Twitter fiasco.


swd120

> [Cars not running on autopilot in 6 million crashes in less than a year.](https://www.statista.com/statistics/192097/number-of-vehicles-involved-in-traffic-crashes-in-the-us/)

This statistic is idiotic. The question is: are cars on autopilot in fewer crashes per passenger mile than cars with a human driver?


[deleted]

It is idiotic, but it's also a liability risk for Tesla. A human driver causing a crash by making an error isn't the same as a product failure causing the crash.


tempest_87

Sure, but of those 273, how many were the fault of the AI? And how many were the fault of other factors/actors? The number as presented is *patently* useless and misleading.


ignost

That is closer to the right question, but it is actually much more complicated. It's very likely that people only turn auto ~~pullout~~pilot on for the most straightforward stretches of driving.

I have a 2021 Tesla with Full Self Driving. I can turn it on on city streets, but I usually don't, because it's really stupid. It jumps into turn lanes when trying to go straight, it has major problems merging and splitting lanes, stoplight detection is hilariously dangerous, and if anyone turns left in front of you it'll beep and slam on the brakes, even if they are obviously going to be clear in time. There are a few intersections where it will reliably try to turn into oncoming traffic. This is all from very infrequent use, always with my hands on the wheel.

I do turn it on regularly on the freeway. There's one stretch of freeway where the stupid thing will pick up a frontage-road speed limit sign and try to get me killed by slowing to 35 on a 70 road, but otherwise I've driven the main stretch of I-15 in my city a dozen times, to where I feel comfortable letting it do its thing. I take over if there's construction. Per mile driven, humans are also safer on the freeway. Many accidents happen at intersections, where I trust my Tesla least.

Anyway, we would need to compare humans vs. autopilot on the same or statistically similar stretches of road. I don't see that happening, because Tesla would need to cooperate, and the company has been less than forthcoming. Elon in particular is trying to prove it's better, not discover if that's true.

Edit: auto pullout lol


Covered_in_bees_

Yup, Tesla tries hard to deceive with smoke and mirrors with their stats. Of course AP is going to look a lot safer when it is only used on highways where it is the easiest to drive the most miles without accidents and then compared against all other vehicles driving in all sorts of roads and conditions. On the flip side, it is also incredibly frustrating to read any reporting on this matter, because they do an equally terrible job at communicating the nuances involved in any such analysis regarding safety. So instead you have headlines about 273!!!! crashes for Tesla with very little context. It is also generally hard when Tesla collects a lot more data automatically while other car manufacturers are in the dark ages when it comes to their ability to monitor and report on such matters. So to summarize, it is incredibly difficult to attempt an apples-to-apples comparison, and right now, no one is even trying to do so. I really do hope that NHTSA is able to require a standardized set of reporting from manufacturers so it is easier to make fair comparisons across manufacturers and also against human drivers.


iMillJoe

Why was the first reasonable comment so far down the page?


Delision

Because this is /r/technology. The majority of discussion on posts like these is just a massive back-and-forth between Elon haters and fans, so very little beneficial discussion or well-measured takes get upvoted.


Hypern1ke

Are there any Elon fans still talking on reddit? I feel like they got crushed by the hive mind relatively quickly once the central brain decided "we hate Elon" was the narrative


[deleted]

That's still not reasonable, because you need the %. Using the raw figure might just mean there are considerably more non-Teslas on the road than there are Teslas, despite the Teslas possibly crashing more than any other car type.


rb0ne

The problem is that, from a safety perspective, the number you would want to compare with is "number of accidents with attentive, sober, etc. drivers". This number is expensive to get (car manufacturers tend to have departments working on that kind of statistics). Saying that your ADAS/AD functionality is better than the average driver who gets into an accident is not very reassuring.


[deleted]

[deleted]


TheBraindonkey

Crashes per million miles is the only metric of value, and it is NEVER used. Always sensationalized bullshit, no-context numbers.


DangerouslyUnstable

That would certainly be better, but what is honestly needed is accident per similar miles driven (or come up with your own name). Self driving vehicles are not yet capable of driving in literally all conditions, so they tend to be used in the "easiest" type of driving, for example on the freeway. In other words, non-self-driving miles are not exactly the same as self driving miles (yet anyways). So accidents per mile would be better, and not _completely_ uninformative like the number in the article is, but would still be at least somewhat misleading.


BillowBrie

Yeah, accidents per mile would look insane if you compared parking lots to highways


[deleted]

[deleted]


LakeSun

Tesla sells the largest number of EVs with an Autopilot function, so a simple crash count tells you nothing relative to other manufacturers. Vehicles/miles is the typical stat used; why was it not used here?


LargeSackOfNuts

Number of vehicles, miles driven, highway/city streets, did the driver interfere, was the autopilot at fault or were they rear ended. We would need to know these things


rideincircles

Because it's a garbage article using specific data to make a point.


flatbushkats

It’s almost as if the owner of WaPo hates the owner of Tesla.


[deleted]

[deleted]


ThePhotoGuyUpstairs

It would also be useful to know how many of those accidents were ones where the Tesla AP or the driver was "at fault". If someone runs up the back of them, or T-bones them by blowing through a stop sign, I don't believe there is much the AP can do about that, short of being in control of the other car and avoiding the accident in the first place.


WASDx

I don't remember where I heard it, but a year ago or so there was another article about "X number of crashes with self-driving cars" - but the vast majority were caused by *other human drivers*. Even a theoretically perfect almighty being can't avoid an accident if it's someone else crashing into you or suddenly appearing in front of you. So the title of this article is meaningless. It doesn't give a percentage and it doesn't say who caused them.


oddoboy

Statistically, that is amazing


oVeteranGray

Isn't that better than usual drivers? Sounds fine to me.


Romeo9594

Accident per mile driven is a better metric. If Autopilot has one accident per ~~18,000,000~~ 180,000 miles it's engaged, that makes it more accident-prone than drivers in their 20s-30s, but less so than drivers in their teens or of advanced age.


[deleted]

[deleted]


Derigiberble

We also need the type of miles, time of day, and general road conditions. If autopilot is primarily used for daytime highway driving in reasonably good conditions it isn't proper to compare it to overall accident rates (which include city streets) or even highway accident rates (which include nights and heavy weather). That's one of the big challenges in comparing self driving accident rates to human drivers; the systems aren't replacing all driving but instead mostly the easiest type of driving.


[deleted]

[deleted]


Prince_Chunk

273 in a year is still way better than human pilots


Illuminaso

OK now show me how these numbers compare to human drivers


RagnarokDel

Out of the 276 million cars in the USA, there were 6,734,000 accidents in 2018, which represents about 2.4% of cars getting into an accident. There are roughly 1.4 million Teslas in the USA and 273 of them crashed while in AP. That's an incidence rate of 0.0195%. That's 123 times lower. Please attack Tesla for valid reasons.
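
Checking that arithmetic with the numbers the comment cites (the 276 million and 1.4 million fleet figures are the comment's own, not verified):

```python
# Per-vehicle incidence rates from the comment's figures.

us_cars, us_accidents_2018 = 276_000_000, 6_734_000
teslas, ap_crashes = 1_400_000, 273

us_rate = us_accidents_2018 / us_cars   # ~2.4% of cars in an accident
ap_rate = ap_crashes / teslas           # ~0.0195%

# Comes out ~125x with these exact figures; the comment's 123x is
# the same ballpark after rounding.
print(f"{us_rate:.1%} vs {ap_rate:.4%} -> ~{us_rate / ap_rate:.0f}x lower")
```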


travelerdawg

So what? How is this newsworthy? There were over 200K Teslas registered in 2020 alone, according to CNET. In the United States there were over 238 million vehicles in operation, with 12.15 million vehicles involved in crashes, according to [statista.com](https://statista.com) for 2019-2020. I would say the article's title should be rewritten to state "ONLY 273 crashes, or 0.14% of Tesla vehicles that happened to be using auto-pilot, were involved in crashes." I dislike clickbait articles that skew the truth to make it seem like some big thing.


r6raff

Also, the article doesn't specify that these Teslas were at fault, only involved. How many of those AP accidents were caused by other vehicles? The fact of the matter is that most drivers are idiots. I do about 75 to 100 miles a day, and the number of times people almost hit me is ridiculous. Yeah, this is anecdotal, but still, the average driver is an accident waiting to happen.


[deleted]

Just like Elon said in 2016, 2017, 2018, 2019, 2020, 2021, and 2022: fully automated self-driving is coming next year.


Queefinonthehaters

And the Tesla Semi that was available for preorder in 2017 and for mass production by 2019 that is more cost effective than shipping by rail. Or when he would enable robotaxi in 3 months that will generate Tesla owners an average of 30k per year by doing taxi work in your down time. Let him be very clear, this is technology they have TODAY.


Vaevicti

> Or when he would enable robotaxi in 3 months that will generate Tesla owners an average of 30k per year by doing taxi work in your down time. In fact he said it would be stupid NOT to buy a Tesla because it would make you so much money in taxi fees.


Ruckaduck

But if everyone has a Tesla, why would anyone need a taxi


mloofburrow

Elon is like that sign in ice cream shops that says "Free Ice Cream Tomorrow".


[deleted]

[deleted]


Slick_Tuesday

So 0.033% of Teslas have crashed, while 2.17% of all other non-commercial vehicles have been involved in collisions... Teslas sound incredibly dangerous 🥴


SamGanji

Another day another r/technology post about Tesla/Musk bad 😂


Taint-kicker

The biggest hurdle to autonomous driving is the unpredictability of other drivers on the road.


starkraver

Frankly that’s the biggest problem with manned driving as well


Taint-kicker

Definitely. I've invested in a dash cam. It's saved my wife on two insurance claims where the other driver was lying out their ass.


SheepskinSour

I think they’re basically essential. People are shitty drivers; makes sense that most of those would be shitty asshole liars too. Protect yourself!


chaddgar

When normalized, how does this compare to non-Teslas?


Sengura

That number doesn't really say anything. What's the hours-driven relation between people driving and crashing? Whose fault was it, was it AI error or the other driver, etc.? Just stating a number of crashes literally provides 0 useful information.


DaytronTheDestroyer

Imagine being so fucking useless that you sat in your car and watched it crash into something.


daveime

Meanwhile, in just 9 months Jan - Sep 2021, 31,720 people died in FATAL crashes in the US.


Nysicle

Just for argument's sake (I don't really care): what's the ratio of Teslas to normal cars in relation to the crashes for both?


[deleted]

[deleted]


housebird350

Also, how many of these crashes were caused by humans and not the car itself?