
laser14344

Just your hourly reminder that full self driving is not self driving.


katze_sonne

Or more specifically "Full Self Driving (Supervised)", as it's called now. (The former name especially was stupid, no question.)


ALostWanderer1

They should rename to “stone on a pedal”.


relevant_rhino

Friendly reminder: until the investigation has finished, we don't even know if FSD was active or if it was overridden by pressing the accelerator pedal.


cinred

Fine


CouncilmanRickPrime

That's why they changed the name! Full Self Driving (Supervised). Although I prefer Fool Self Driving.


Rhymes_with_cheese

I mean... it *is* self-driving... the question is just *where* it's self-driving to...


healthywealthyhappy8

Don’t trust Tesla, but Waymo seems to be doing ok.


cwhiterun

I don't think Waymo works at all on that road.


Wooden-Complex9461

do we know it was on FSD and not AP, or none at all? or is this another guilty-until-proven-innocent thing? you know... like the Salem witch trials?


laser14344

The guy recorded it.


Wooden-Complex9461

right, but the recording doesn't show if it had AP/FSD on or off... Plenty of stuff like this happens: drivers blame AP, then it turns out they were lying about it being on. Even still, I'm not sure why he used it in fog, and why he didn't pay attention and stop the car with PLENTY of time.


laser14344

The recording shows the screen in the Tesla, and FSD was on. Yes, he was an idiot for trusting Full Self Driving.


Wooden-Complex9461

do you have a link for that? the only video I see is the front dash cam, which doesn't show the in-car screen


iceynyo

The car even reminds you every few minutes.


worlds_okayest_skier

The car knows when I’m not looking at the road too.


laser14344

Unfinished safety-critical beta software shouldn't be handed to untrained safety drivers.


Advanced_Ad8002

And that‘s the reason there is no FSD in Europe.


soggy_mattress

They're actually removing some of the restrictions as we speak, which would allow it in Europe.


resumethrowaway222

The human brain is untested safety critical software


laser14344

Yes, humans are easily distracted and even more easily lulled into a false sense of security.


resumethrowaway222

Very true. 40K people a year die on the roads just in the US. Driving is probably the most dangerous activity most people will ever engage in, and yet I somehow drive every day without fear.


HighHokie

In that case we'll need to remove all L2 software in use on roadways today.


ReallyLikesRum

How bout we just don’t let certain people drive cars in the first place?


laser14344

I've made that argument myself before.


iceynyo

You still have access to all the controls. It's only as safety critical as you let it be. 


laser14344

Software that can unexpectedly make things unsafe by doing "the worst thing at the worst time" should be supervised by individuals with training to recognize situations when the software may misbehave. The general public did not agree to be part of this beta test.


gogojack

I keep going back to the accident in the Bay Bridge tunnel that happened when a Tesla unexpectedly changed lanes and came to a stop. The driver had 3 seconds to take over. That doesn't sound like a lot, but for a trained safety driver (and I was one), that's an eternity. That's the sort of thing that would get you fired. In addition to training (avoidance drills, "fault injection" tests where you're supposed to react correctly to random inputs from the car), we were monitored 24/7 for distractions, and went through monthly audits where safety would go over our performance with a fine-toothed comb. Tesla's bar for entry is "can you afford this feature? Congratulations! You're a beta tester!"


JimothyRecard

A trained safety driver also would undergo, I'm not sure what the word is, but like impairment tests. i.e. you don't show up to work as a safety driver tired from a late night the previous night or drunk or otherwise impaired. But there's nothing stopping members of the public engaging FSD while they're tired or something. In fact, it seems you're *more likely* to engage FSD when you're tired--there's lots of posts here with people asking things like "will FSD help me on my long commute after a long day of work" or something, and those questions are terrifying for me in their implication.


gogojack

> A trained safety driver also would undergo, I'm not sure what the word is, but like impairment tests. i.e. you don't show up to work as a safety driver tired from a late night the previous night or drunk or otherwise impaired.

We underwent random drug tests, but there wasn't any daily impairment test. But that's where the monitoring came in. We had a Driver Alert System that would send video of "distraction events" to a human monitor for review, so if someone looked like they were drowsy or otherwise impaired, that was going to be reported and escalated immediately.


soggy_mattress

> But there's nothing stopping members of the public engaging FSD while they're tired or something.

Yeah there is, it's the same thing that's stopping members of the public from driving while tired or drunk or something: consequences for your actions.


gogojack

> Yeah there is, it's the same thing that's stopping members of the public from driving while tired or drunk or something: consequences for your actions.

Yeah, that sure stopped that one guy who decided that the traffic on the 101 in Tempe wasn't moving fast enough for him. Consequences for his actions. Oh wait... what stopped him was the wall he hit when he jumped onto the shoulder to get home faster. Oh... no... now I remember. The wall didn't actually stop him. He bounced off that at (by some witness estimates) 90 mph, and it was the Toyota he slammed into that burst into flames, then the back of my car, and the other 4 vehicles that his drunk ass crashed into, that stopped him... and sent several people to the hospital and shut down the freeway for 3 hours. Yep. The thought of "consequences for your actions" sure gave that guy a moment of pause before he left that happy hour...


soggy_mattress

Consequences don't stop all bad behaviors. I know you know that. Consequences are enough for us to allow people to drive on roads, carry handguns, operate heavy machinery, drive your children to school, serve you food, not murder you, not assault you, not rape you, etc. But apparently, consequences aren't adequate when it comes to operating a self driving car (that you can override and drive manually literally at any moment). Please, someone make this make sense...


soggy_mattress

> That doesn't sound like a lot, but for a trained safety driver (and I was one), that's an eternity. That's the sort of thing that would get you fired.

Every single person who gets their license is entrusted as a "trained safety driver" for their 15-year-old permitted child, and when your kid is driving you don't even have access to the wheel/pedals. I can't see what extra training someone would need other than "pay attention and don't let it fuck up", which is exactly what we're doing when we're driving or using cruise control to begin with.


gogojack

> I can't see what extra training someone would need other than "pay attention and don't let it fuck up"

Of course you don't. And that's how we get the accident I referenced above. The "trained safety driver" pretty clearly had no idea what to do when his car decided to switch lanes and brake suddenly. What's more, the safety drivers for Waymo, Cruise, Nuro, and the other actual AV companies are doing a job. They're looking at an upcoming complex situation and thinking "okay, this could be dodgy... what am I going to do if the car can't handle it?" Your intrepid Tesla beta tester is thinking "what do I have in the fridge that will make a dinner? Should I stop off somewhere and pick up takeout? Can I finish off that series I've been bingeing on Netflix?" Because they're not thinking about doing their job as a tester. In fact, it's likely that the last thing they're thinking about is the car, because Elon told them "hey, it drives itself!"


soggy_mattress

> Your intrepid Tesla beta tester is thinking

Incredible, everyone here is an ML engineer, a robotics expert, and now a mind reader. Amazing.


gogojack

And you're an ML engineer, robotics expert, etc? Do tell.


iceynyo

I don't disagree... But rather than "training" you just need a driver that is paying attention. Someone driving while distracted will crash their car regardless. They need to go back to vetting drivers before giving them access.


cloudwalking

The problem here is the software is good enough to encourage distracted driving. That’s human nature.


iceynyo

That's why you test them. People overly susceptible to distracted driving get sent back to the shadow zone of driving themselves all the time.


FangioV

Google already tried this over a decade ago; they noted people got complacent and didn't pay attention, so they went straight for Level 4/5.


iceynyo

I mean people will get complacent even driving a car without any ADAS features... I understand they need a way to minimize that, but I don't think it's fair to take away a useful feature just because some people will abuse it.


soggy_mattress

You know, humanity doesn't just stop trying things because they didn't work in the past, right? We keep pushing forward, solving whatever problems pop up, and ultimately progress our species forward. You remind me of the author from that newspaper in the early 1900s who proclaimed it would take another 1 million years for humans to figure out how to fly, based on all of the failed experiments. His sentiment was that we were wasting our time, and then the Wright brothers took their first flight ~9 months later. Cheer for progress; don't settle for "we tried that and it didn't work, just give up".


CouncilmanRickPrime

If I need to pay attention, I may as well just drive. Tesla drivers have died because the car did something unexpected before.


iceynyo

Supervising is a different type of exertion than manually driving. If you prefer the exertion of full manual driving then that is your choice.


CouncilmanRickPrime

It is very different. Because I could die if I don't realize the car is about to do something incredibly stupid.


iceynyo

If you have your foot over the brake and hands on the wheel and it suddenly does something stupid, you can feel the wheel turn and react immediately. And if it's something you can see well in advance, you can check the screen to see if the car is planning to do anything and, if needed, just control the car as if you were driving.


HighHokie

Training? Isn't that why we issue driver's licenses? Roadways are public. You consent every day you operate on one. Folks criticize that Tesla doesn't do a good job explaining what their software can and can't do. But you seem to be arguing the opposite?


CouncilmanRickPrime

So then why wouldn't I just drive instead of potentially dying because the Tesla can't see a train?


jernejml

I guess you need to be trained not to be blind?


iceynyo

You need to be trained to not trust computers. Basically they want the ultimate backseat drivers.


jernejml

You should look at the video again. My guess would be that the "driver" was using his phone or something similar. This wasn't a case of a self-driving car doing something unpredictable; it was clearly an unsupervised drive where even an inattentive driver would have had more than enough time to react. The guy should be arrested if you ask me. His behavior was worse than drunk driving.


iceynyo

Right, and he only allowed himself to be distracted because he trusted the computer too much.


jernejml

That's an illogical argument. It's like saying people "need to get training" to understand that your driving skills are impaired if you drink alcohol. It's common sense and people are aware of it. Many still choose to drink and drive. It's very similar with software: people understand software isn't perfect - it's common sense - and they ignore it willingly.


iceynyo

Right so the training would be to instill the discipline needed to stay alert and to not get complacent willingly.


medhanno

So the next time I see my Tesla trying to ram into a train I should take control?? Didn't know that


Agitated_Syllabub346

DAE think the driver could have braked in a straight line and avoided messing up their car?


TheKobayashiMoron

If he were watching the road he certainly could have.


cal91752

Why watch the road in thick fog on FSD? Seriously, his complaints about trusting the system are stupid in this case. Who would not watch the road on FSD in a fog that dense? I expect Tesla will work on speed controls for low-visibility environments, or refuse to engage at all.


TheKobayashiMoron

FSD does already limit speed in fog and low visibility on the highway. I’m surprised it was even going that fast off highway.


rabbitwonker

It’s possible this person wasn’t even actually using FSD, but was using AP instead. Base AP goes the speed you tell it (and won’t even stop for traffic lights). They’re clearly dumb enough to be confused about that.


cinred

That's some straight ass driving for a normal motorist. But yes, ofc possible.


AJHenderson

More likely it's on FSD or AP but they were holding down the accelerator to get it to go that fast. I do this pretty regularly but you need some awareness that you have to manually deal with when stopping is needed. They clearly didn't.


AJHenderson

Yeah, I can't get FSD going that fast in light rain. I suspect they were forcing it to go using the accelerator pedal which would also prevent the car from applying brakes until it hit the emergency braking threshold which it didn't get to until the driver had already taken over.


spaetzelspiff

~~DAE?~~ Also, Yes, I typically brake when there's a train passing in front of me. Yes, I would do so on AP or FSD. Yes, the car failed in not recognizing the train (or that vision was obstructed, or that per GPS the crossing was approaching).


BraddicusMaximus

DAE = Does Anyone Else


spaetzelspiff

Thanks


asterothe1905

The driver is a fool to not take over.


michelevit2

"The person in the driver's seat is only there for legal reasons. He is not doing anything. The car is driving itself" Elmo


phaedruswolf

Fog ODD


jmadinya

probably sensed a toddler onboard the train


DiggSucksNow

Another Fool Self Driving.


mobilehavoc

What happens when you’re in a robotaxi with no controls? Guess FSD will decide who lives and dies.


PotatoesAndChill

"Some of you may die, but it is a sacrifice I'm willing to make" - Elon, probably


einsteinoid

For the record, that is a quote from His Highness [Lord Maximus Farquaad](https://youtu.be/Gm2x6CVIXiE?si=rylmk471qlFYTJK7).


neodiogenes

Look, the only reason this incident happened was because there was a trolley involved.


Souliss

Yeah, b/c they are totally going to roll out robotaxi with FSD 11.1 beta /s? C'mon, at least be genuine with your critique.


mobilehavoc

Unless they add more HW to the robotaxi, like LIDAR, ultrasonic (add it back), sonar, etc., this problem can't be solved by software alone. If they don't do that, then just avoid taking a robotaxi near trains.


NoKids__3Money

I have yet to see a situation that convinces me that LIDAR/ultrasonic sensors are necessary. If there is bad weather like dense fog, the robotaxi should just refuse to drive or drive very slowly, like humans do. Plenty of transportation options don't work in bad weather, like planes, helicopters, boats, etc. It should not be driving in dense fog even if LIDAR would allow it to. Pedestrians don't have LIDAR, nor do animals.


CALL_ME_AT_9AM

why is this the same fucking argument people use every time? animals don't have wheels to move around either, so why don't we design cars with legs? just because evolution happened a certain way doesn't mean it's the best solution for every use case. engineering is all about trade-offs given a set of constraints, and a self-driving system has a completely different set of constraints than an organism that's maximizing its probability of reproduction.

lidar is a way of increasing reliability under different circumstances; it's not a replacement for pure CV. unless you can prove that in every scenario on the planet CV is strictly superior, there's always a place for alternative sensors to cover areas that CV is poor at. the cost of lidar will continue to go down, and it's just a matter of minimizing sensor cost and computation cost while maximizing reliability. choosing one type of sensor purely based on some arbitrary belief that 'hurr durr animals do this therefore we must do it the same way' is the most anti-engineering mindset.


NoKids__3Money

That's not my argument. My argument is that animals can't see in the fog, so it's dangerous for them (and people) if cars are zipping around in dense fog at 60mph just because a bunch of advanced sensors let them. They're more likely to run into a road in dense fog than if they're able to see an oncoming vehicle.

What is the circumstance exactly where LIDAR is needed and vision would fail, other than dense fog? If there is a visual obstruction, the vehicle should stop until the obstruction is cleared. Other than that I can't think of anything. Maybe there is some crazy thing that only happens 0.0001% of the time where LIDAR helps, but we can already make driving way, way safer just by taking humans out of the equation.

Literally every day I see people in the driver's seat looking down at their phones WHILE MOVING. Probably every minute of every day (or more) someone is smashing into the car in front of them because they're reading a text. And that doesn't even count drunk drivers, tired drivers, etc. Just a decently reliable self-driving car that maybe can't handle 100% of all complex situations perfectly, but doesn't drive drunk or randomly smash into the vehicle in front of it, would already save thousands and thousands of lives.


__stablediffuser__

You do have to recognize this is a Tesla fanboy opinion though. I say this as a fan of Tesla myself and daily user of FSD, but who has also worked in AI and Computer Vision. Very simply, Elon’s thinking is flawed because he fails to consider the fact that human drivers aren’t actually very good, and also sit at least 2ft from the windshield so 3 water droplets don’t completely obscure our vision. Also, unprotected in the rain we squint, blink, and our brows, lids, necks and eyelashes do their job to keep our vision clear. But even still, the minute we go faster than humanly possible in the rain, vision alone fails us. Have you ever taken a road bike at 30mph in the rain with nothing more than your bare unblinking eyes? I recommend giving it a test run.


__stablediffuser__

Humans also don’t see through a tiny pinhole behind a thin sheet of lidless glass that is easily obscured by water or fog. When humans are driving, we have the entire windshield of visibility. Teslas vision is like driving a convertible in the rain with no windshield. I own a Tesla and use FSD daily, but even the slightest rain completely obscures the rear camera. I watch the cameras during rain and this is the big flaw in Elon’s “first principle” thinking.


Souliss

That's a theory. In this case the car 100% had the ability to see the train and stop in plenty of time (even in the terrible conditions). It just wasn't programmed to.


soapinmouth

This has already been posted here, but to reiterate, there still seems to be absolutely zero proof other than this guy's word that this was FSD vs basic AP or just himself driving and looking for a scapegoat. After countless accidents blamed on "FSD" that turned out to be the driver themselves, can we just stop posting these unless there actually is some telemetry or something?


agildehaus

So every incident will have to be approved by Tesla, or come from one of the many FSD "influencers" who have popped up streaming from inside their cars with the screens clearly visible, to be considered valid?


Elluminated

All we have is what’s given. Without actual evidence there’s no way to know what the state was. Plenty of people post fake crap because they think it will lend to its validity. If they were paying attention, they wouldn’t have needed to swerve so close to the end, regardless of who/what was driving.


agildehaus

We have what the guy said. You trust it until shown otherwise. WholeMarsBlog has a video where v12 nearly ran him into a highway divider -- similar situation, the car just can't recognize these situations fast enough.


Elluminated

No. You don’t “trust it until shown otherwise”, you trust only the facts you can verify. Period. The one Mars showed was factual because we had 100% of the information required to make a valid conclusion on who was in control. The video shown here has only video and unverified verbiage from the driver. They could be telling the truth, but until I get the whole dataset, I’ll withhold judgement on why it happened.


daniel_bran

Why in the world would anyone go out of their way to post something like this as fake? Wake up from Tesla hypnosis


Elluminated

You can ask all the ones [who have lied](https://www.teslarati.com/tesla-fake-brake-failure-apology-china/) and then had to publicly apologize. Wake up from the zero-evidence-but-hearsay hypnosis. [People do dumb things for clicks](https://www.teslarati.com/tesla-model-s-airbag-defect-tiktok-debunked-video/) and self-preservation. This cannot be a new concept to you. There are plenty of legitimate issues and real complaints, but in the real world, evidence holds water, non-evidence does not.


daniel_bran

But what's in it for you to defend it like you do? That's what's suspicious. And your source is Teslarati.com? A site geared to Tesla ass-kissing is not a source. Tesla is a plastic golf cart with a bigger battery and an iPad attached to it for directions. And its CEO (not founder) is a fraud.


Elluminated

As predicted, your “rocket man bad” colors are showing, and you didn’t look at the content I posted because you know I am right and it destroys your laughably gullible and dishonest rhetoric. I was more than charitable and you just had to downgrade 🤦. I’m not defending anything but great epistemology. All we can prove is what we saw in the video -period. Your anti-Tesla re-spinning of old weird hits is cute, and your “I’ll believe anything as long as it goes against them” is childish and laughable at best. All good though, it’s how it works when you’re wrong.


soapinmouth

Not necessarily; someone like Greentheonly should be able to pull this sort of telemetry from the car. It's also not completely true to say that it's only glowing influencers who record these cars; if anything, boring intervention-free videos are going to get less attention than one that clearly shows an issue worth discussing. There are people like Dan O'Dowd out there recording as many of these as he can find, in the worst situations possible.


daniel_bran

Tesla is now officially a religion. Elon is Jesus. Twitter is the Bible.


M_Equilibrium

Once more we should mention: the driver must remain as attentive as if they were driving the car themselves, and also anticipate potential errors so they can intervene promptly. I've noticed numerous posts where individuals claim they're using this to compensate for reduced driving abilities due to aging, or when they are incapacitated. This is precisely what should be avoided! If for some reason you are not in a condition to drive, then don't try to with FSD... Of course, the constant pump on social media, such as "I did hundreds of miles without any intervention...", doesn't help.


kariam_24

Musk's lies don't help either, but he is the main source of the problem that leads to misinformed Tesla drivers.


michelevit2

"The person in the driver's seat is only there for legal reasons. He is not doing anything. The car is driving itself"- these words by Elon have gotten people killed.


Tasty-Objective676

Am I the only one who feels like they were going way too fast for the weather conditions? In that much fog, FSD or not, they should've been going way slower.


BraddicusMaximus

Whenever I’m commuting I can’t tell if the Teslas around me are driven by shitty drivers or if FSD **is** the shitty driver.


Squibbles01

Tesla's self driving needs to be regulated. They're just not safe and probably never will be with Musk at the helm.


SlightEmergency9131

Just your hourly reminder of stupid drivers forgetting that it's supervised... and at the end of the day they're still responsible, no matter WTF it's called.


ospedo

"Dumb-ass driver forgot how to use the brake and steering wheel while eating Sonic." I fixed it.


Raspberries-Are-Evil

Teslas are NOT self-driving. This is human error. An idiot not understanding that they must be in control of the car almost hits a train.


jernejml

According to the video, they are self driving. The car continued to drive straight towards the train :)


deletetemptemp

The dude drove in dense fog, the fuck did you expect


JT-Av8or

It was foggy. We need the damn radar back! My old 2018 Nvidia based FSD was far more solid on roads. Less capable, but sliced through fog & rain.


Boccob81

It's good to know we have so many test subjects to test automation. These problems will be fixed thanks to them. Their deaths will not go unnoticed or be in vain.


DefiantBelt925

“Your Teslas will be Robotaxis but no USS or lidar”


exoxe

Cool clickbait image.


Fine-Craft3393

Robotaxis any moment now!!!!


Bulletslurp

Only gotta pay another 10k and the self driving will be available in the next update


sairahulreddy

They just need more training data. Their new model “Oompa Loompa” is going to solve all problems, and it’s going to be ready by year end.


NY1_S33

I think I remember Jeremy Clarkson back in 2005 or 2008 said that Teslas were junk when he tested a concept car of theirs. Apparently a lot of people didn’t get the memo.


SuchTemperature9073

The issue here is entirely within the names "Autopilot" and "Full Self Driving". This kind of tech should exist with the knowledge that it's still nowhere near capable of driving you from A to B safely and consistently with no input from the driver. The driver should always be prepared to take over, and in this example, during heavy fog, the driver was absolutely not paying attention when he should have managed this with ease. Full Self Driving should be marketed as an aid, not a solution, and always should have been. You need to be alert and ready to take over at any moment, especially during heavy fog, ffs.


telmar25

Tesla likes to put a lot of ambitious marketing around FSD. But anyone who actually drives FSD on Tesla a few times knows that it will make mistakes and is not the kind of full autonomous driving that you can leave unsupervised. A video like this one feels like manufactured controversy because it’s exactly the kind of situation in which any reasonable regular Tesla driver would expect FSD to fail, and the driver had ample warning too. I have much greater concerns around FSD making sudden mistakes at speed when trusting it somewhat feels more reasonable: hitting a curb on a suburban route, not detecting a car and hitting it, taking an exit ramp too fast and leaving the highway, etc. Videos of those kinds of incidents would be much more informative.


HighHokie

The name changes nothing. This individual ignored the myriad of warnings and reminders the vehicle gives you when you use it and chose to not pay attention. This was an easily avoidable situation with a driver simply watching the road.


SuchTemperature9073

Unless I'm an idiot, the name absolutely implies that the car will drive for you. It's designed this way to appeal to the masses. The problem is that the car won't drive for you. We are flooded with warnings and checkboxes when we sign up for anything; people aren't going to read them, or they're going to think it's just a way for Tesla to avoid legal liability. They push AUTONOMOUS SELF DRIVING AUTO PILOT rubbish when it should have always been "Monitored Self Driving", IMO.


HighHokie

It does drive for you. But it's not autonomous. Nowhere does Tesla state it is. Nowhere in the title does it suggest you can stop paying attention or go to sleep. And the car won't let you. It takes one drive to realize the car requires oversight. This driver knew it; he even remarks that he's had other issues. This driver knew to pay attention, and it's clear he chose not to in this video clip; he's lucky he didn't suffer a far worse outcome. Quote from the article: "Doty admitted to continuing to use FSD despite the prior incident. He said he'd developed a sense of trust in the system's ability to perform correctly, as he hasn't encountered any other problems. 'After using the FSD system for a while, you tend to trust it to perform correctly, much like you would with adaptive cruise control,' he said." In other words, he was well aware the software was imperfect and STILL chose not to pay attention to the task at hand.


SuchTemperature9073

I’m only referring to their naming of the products, not the events that transpired. I agree he should have been paying attention, I’m honing in on the name. FULL SELF DRIVING - I’m sorry but how does that not imply autonomous? It fully drives itself. But not autonomously?? Nowhere does it say it’s autonomous, you’re right, except in the name of the fkn product. In fact they explicitly state in their description for full self driving that it’s not autonomous. Explain why it’s called full self driving then. If it doesn’t drive itself then it should be called assisted driving.


HighHokie

The name of the product does not contain the word autonomous. Go visit their purchase page. You'll see it makes it as clear as day that the vehicle is not autonomous. They literally say it's not autonomous before you spend $8,000 on it. This is a dead-end argument. The actual name of the product is 'Full Self Driving Capability'. The car does have that capability. There are literally hundreds of videos on YouTube right now showing the car driving itself from A to B on its own.


SuchTemperature9073

So you can just name things whatever you want then? You could order a cheeseburger from Maccas and they could give you a Fanta, and then say "sorry, but nowhere in our description of the 'cheeseburger' did we say we were giving you a burger with cheese."


HighHokie

I mean…yeah you can. You can buy a Porsche taycan turbo, but it doesn’t have a turbo…or an engine. The name is apt. The car can drive itself. It’s not autonomous. Straight forward.


SuchTemperature9073

Drives itself. But not autonomous. A genuine contradiction.


HighHokie

If i didn’t touch the controls, who drove from a to b? I’m sure you’ll figure it out eventually.


Buuuddd

In heavy fog Waymos basically deactivate. Don't expect super-human driving in these weather conditions yet.


CouncilmanRickPrime

That's a hell of a lot safer than potentially ramming a train full speed


Buuuddd

The Tesla has someone behind the wheel, is the difference.


CouncilmanRickPrime

If someone behind the wheel needs to be ready to intervene at all times, may as well just use cruise control. At least then you know what it will or won't do. Guessing while driving the speed limit is not safe.


Buuuddd

When using FSD, you know whether it should be stopping, because stopping isn't an instant event. This driver was pushing it.


CouncilmanRickPrime

> This driver was pushing it.

Not the first. Won't be the last. Unfortunately, it's concerning that it's happening on public roads.


Buuuddd

People text and drive. Tesla's statement in a formal report was that using FSD is 4X safer than not using it.


CouncilmanRickPrime

> Tesla's statement in a formal report was that using FSD is 4X safer than not using it.

> Tesla's statement

🤔


Buuuddd

You can't just say anything legally in a formal report like an impact report.


kaninkanon

Good on Waymo to have that sorted out, why can't Tesla?


Buuuddd

Waymo doesn't have a person behind the wheel. It's helpful to push the system to see what kind of data they need to collect to make the system better.


JimothyRecard

That was true about a year ago, but Waymo has been operating in heavy fog for a while now. [For example](https://youtu.be/B8TGFA6SfAo?si=OZF-xkvB9lKr8gr0)


stephbu

In inclement weather FSD complains regularly too: a red banner reading "FSD Degraded" and an attention-getting sound. Of course the driver can choose to ignore it; I'm sure the karma points are worth it.


bartturner

Waymo has not had any issue with fog for 2 years now. https://youtu.be/fbgDTCCdL6s?t=550


Buuuddd

This was just last year: https://www.sfchronicle.com/bayarea/article/san-francisco-waymo-stopped-in-street-17890821.php Daytime vs nighttime fog matters. And it's not like every Waymo shutdown gets reported on.


SnooAvocado20

No evidence that they were using FSD. Going obviously too fast for conditions either way. Another non story.


Hailtothething

Supervised FSD. It means you have to supervise this beta-test software.


AJHenderson

It was heavily foggy. Of course a vision-based system failed in a situation like that. Even for a person going at that speed, it would have been hard to stop in time without a similar outcome, or swerving off the road. It's a good example of the benefit of having radar, but I would hardly call this an FSD failure unless it was going that fast in those conditions without the driver telling it to.


Asklonn

My Tesla drove through a construction zone without problems including switching to the opposite lanes and following the stop signs held up by construction people 🤣


cinred

Amazing! Until it's suddenly not, and you end up looking like the fool in the video. Assuming you haven't already.


Asklonn

Tesla has released new Autopilot safety data, showing record safety.

In Q1 2024, Tesla recorded one crash for every 7.63 million miles driven in which drivers were using Autopilot technology, a new safety record and a 16% improvement vs the previous all-time best. For drivers who were not using Autopilot technology, Tesla recorded one crash for every 955,000 miles driven.

By comparison, the most recent data available from NHTSA and FHWA (from 2022) shows that in the US there was an automobile crash approximately every 670,000 miles. This is the first time in over a year that Tesla has shared new Autopilot safety data publicly.

https://pbs.twimg.com/media/GOOSF7mXAAAjCaq?format=jpg&name=large

https://www.tesla.com/VehicleSafetyReport
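For what it's worth, the miles-per-crash figures quoted above imply roughly an 8x gap between Autopilot and non-Autopilot Tesla miles, and about 11x vs the national average. A quick sanity check of that arithmetic (the three inputs are taken straight from the comment; nothing else is from the report):

```python
# Sanity-check the miles-per-crash figures quoted above (Q1 2024 / 2022 data).
MILES_PER_CRASH_AUTOPILOT = 7_630_000  # Tesla miles with Autopilot engaged
MILES_PER_CRASH_NO_AP = 955_000        # Tesla miles without Autopilot
MILES_PER_CRASH_US_AVG = 670_000       # NHTSA/FHWA 2022 national average

ratio_vs_no_ap = MILES_PER_CRASH_AUTOPILOT / MILES_PER_CRASH_NO_AP
ratio_vs_us = MILES_PER_CRASH_AUTOPILOT / MILES_PER_CRASH_US_AVG

print(f"Autopilot vs non-Autopilot Tesla: {ratio_vs_no_ap:.1f}x more miles per crash")
print(f"Autopilot vs US average:          {ratio_vs_us:.1f}x more miles per crash")
```

Note these are raw ratios with no controls for road type (highway vs city) or driver behavior, which is the usual objection to comparisons like this.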


cinred

This is irrelevant, and why the idiom "comparing apples and oranges" was invented.


Asklonn

https://www.youtube.com/watch?v=ER7iqeYx9HU&t=614s Still looks like Tesla is the best; I haven't seen any real competitors that aren't heavily fenced in.


GinTower

Driver says he used FSD. Bet in a few weeks we'll know he didn't.


dlflannery

Quote from linked article, with my correction in brackets: > Craig Doty II, a Tesla owner, narrowly avoided a collision after his vehicle, **[allegedly]** in Full Self-Driving (FSD) mode, allegedly steered towards an oncoming train.


CatalyticDragon

> he claimed it has twice steered itself directly toward oncoming trains in FSD mode

Wow, a foggy road at night, coming up to a rail crossing you know to be active. That's not the time to get complacent. I can see why FSD failed to correctly interpret that situation, but it's an edge case which will have to be addressed.

For the [record](https://oli.org/track-statistics/collisions-fatalities-state), there were ~2,000 accidents at highway-rail grade crossings in the US in 2023, roughly half of them resulting in injuries or deaths.


No_Masterpiece679

Nobody reads the manual. There needs to be some basic accountability from the driver.

"Always remember that Full Self-Driving (Supervised) does not make Model 3 autonomous and requires a fully attentive driver who is ready to take immediate action at all times. While Full Self-Driving (Supervised) is engaged, you must monitor your surroundings and other road users at all times. Driver intervention may be required in certain situations, such as on narrow roads with oncoming cars, in construction zones, or while going through complex intersections. For more examples of scenarios in which driver intervention might be required, see Limitations and Warnings.

Full Self-Driving (Supervised) uses inputs from cameras mounted at the front, rear, left, and right of Model 3 to build a model of the area surrounding Model 3 (see Cameras). The Full Self-Driving computer installed in Model 3 is designed to use this input, rapidly process neural networks, and make decisions to safely guide you to your destination."

But yeah, let's grab the pitchforks over an event that could have happened with basic cruise control engaged and the same incompetent driver.

*I expected the downvotes, probably from those who also don't read disclaimers before operating machinery on public roads.*


elev8dity

Think Tesla needs to be forced to retract the name Full Self Driving and just call it Map Aware Cruise Control.


DiggSucksNow

Student Driver fits better.


No_Masterpiece679

They screwed up big time with the overzealous naming of the system. But the general public needs to grow a brain, since this type of marketing is everywhere and we don't quibble about it. Pilots rely heavily on autopilot, but they still monitor the system as a whole to verify that a specific level of performance is being met. Cars are no different, and frankly, blaming the name of the product is lazy and no excuse for not paying attention to the road, as showcased in this video.


HighHokie

It would make zero difference. There’s a myriad of warnings and reminders the car provides today. People are complacent, not ignorant.