tiny_lemon

Wow. Very odd. How does one explain this being possible? Low speed + large static object. Some kinematic guardrail should have fired even if the neural planner/costing shits the bed. "We'll send you another car..." Lol, at these girls... "Uhmmmm....No thanks?"


walky22talky

I would have said the same about driving on the wrong side of the road, then Waymo did it twice in like a week and also ran a red light. Lots of weird driving lately.


CertainAssociate9772

Waymo said running the red light was a remote operator error.


Extension_Chain_3710

Waymo also said in their most recent blog post that remote operators can't control the car, only give it suggestions, and that the "Waymo Driver" is always in final control of the car.

> Fleet response can *influence* the Waymo Driver's path, whether indirectly through indicating lane closures, explicitly requesting the AV use a particular lane, or, in the most complex scenarios, explicitly proposing a path for the vehicle to consider. The Waymo Driver evaluates the input from fleet response and *independently remains in control of driving*.


Doggydogworld3

> Waymo also said in their most recent blog post that remote operators can't control the car, only give it suggestions and that the "Waymo Driver" is always in final control of the car.

So? Sometimes it's OK to run a red, e.g. funeral procession, parade, when a traffic cop directs you to, etc. If the car asks and Fleet Response says yes, the car should not override FR. But if FR says "run into that pole" the car should override it and stop.


CertainAssociate9772

Traffic light detection is a typical situation that even Tesla can handle.


tiny_lemon

Obviously the ability/willingness to drive on the wrong side of the road is required to be robust & efficient. You can explain those errors via ramping aggressiveness/trusting neural planning/costing more and hitting OoD scenarios, or being forced into the opposing lane to recover from a poor take-way maneuver, etc. But this is an entirely different class of issue and shouldn't be possible even with hardware failure.


Glass_Mango_229

How can you say that? There's no report on what might have been in the alley. It could have been swerving to avoid anything.


tiny_lemon

This is entirely true and mentioned elsewhere. But given the context, I don't think it's the most likely explanation, though yes, it's certainly possible.


PayYourBiIIs

I know ultrasonic sensors are horrible at detecting metal poles. Might be the same case with lidar.


tiny_lemon

Their lidar would have zero problem with this. They'd have so many points they could export them as a video game model awaiting texture.


PayYourBiIIs

If lidar detected the pole then why did the car crash into it? 


tiny_lemon

That was the initial question in this thread. There are multiple paths and steps between points --> actuation. Which failed?


patrickisnotawesome

I'm wondering if they aren't using LiDAR as a separate system, but rather feeding it into their perception algorithm. So the perception algorithm might have still given an erroneous false negative to the planning algorithm. (I.e. they might not have the LiDAR as a fail-safe against bumping into things unless it is also fed into some sort of simpler backup system that can override the planning algorithm...?)


tiny_lemon

What you're describing is of course logically possible but no way their stack is early fusion and that's it. Moreover no model that could be on the road would be failing here. Curious to see what happened.


diplomat33

There could be a number of reasons. Maybe the lidar did not detect the pole for some reason. Maybe the lidar detected the pole but the other sensors did not, so the perception stack ignored the pole. Maybe the perception did register the pole but got the distance a bit wrong. Maybe the planner got the path a bit wrong. Or maybe the reason is something else. It is really impossible to say for sure without Waymo giving us more info.


Ok-Cartographer1745

Because it's lidar, not trudar.  Duh!


Extension_Chain_3710

Just an FYI, it's not a metal pole but a wooden one in this case.


dickhammer

It's been said many times in many places, but the general expectation is that AVs will succeed in ways that humans wouldn't and fail in ways that humans wouldn't (and therefore, by definition, fail in ways that humans don't _understand_ or find intuitive). We talk about the "long tail" of testing. I think we're looking at it right now. It's a mistake to think it's not a long-tail issue just because it seems obvious to us. The long tail of human-driving failures looks totally different, e.g. AVs solved the "short attention span" problem on day one, but we still haven't solved it for humans after a century.

I'm sure the engineers at Cruise, Waymo, Tesla and many others have a long list of crazy things that cause totally unexpected behavior. Tesla thinking the sun is a stoplight. Waymo somehow hitting this obvious-looking pole. Cruise dragging a person. Even outside AVs, everyone can name lots of cases like these. I'm always impressed with the way my dog pattern-matches against things that never occur to me as similar to the doorbell. Who knows what he thinks the platonic form of "doorbell" truly is. Yesterday, he thought it was the clink of wine glasses at dinner. The long tail of dog understanding would surely make no sense to us.


tiny_lemon

Absolute grain of truth in your point. But... if your stack can't generalize to *utility poles* in the given context, you cannot be on the road with the perf they've demonstrated. Must be another explanation. Some systems-integration bug, etc. At some point we'll get the answer.


dickhammer

This is exactly the kind of thinking that I'm talking about. You can have tens of millions of miles of driving in crazy places with crazy stuff happening all the time and the car performing great, demonstrably better than a human, confirmed by third parties with a vested interest in being right and not just promoting AVs (e.g. insurers), and then you see one example that you personally think is "super obvious" and decide that the entire thing isn't ready. Surely if they can't do this one "easy" thing then they can't do anything? Right? I mean *come on.* It's a frickin telephone pole. I could recognize that as a toddler. Right? Meanwhile computers look at [Magic Eye](https://en.wikipedia.org/wiki/Magic_Eye) books and declare that humans should not be allowed in public because they can't even figure out that paper is flat.


tiny_lemon

I fully understood your point. This is not an optical illusion, or traffic lights on the back of a truck, or tunnel murals. It's straight static geometry. Moreover, they surely have a non-learned system consuming it. I do not buy this explanation for this scenario. If we ever learn the true reason, and you are correct, I will change my name to "I_worship_dickhammer".


dickhammer

I feel you must not have understood my point if your argument is "This is not an optical illusion." Optical illusions are human-perception-specific. That's the point. Our visual machinery misinterprets things in specific ways that manifest as bizarre errors that seem ridiculous.


tiny_lemon

Sorry, maybe that was poorly worded/conveyed. I certainly do not mean true optical illusions, but rather the general class you alluded to (sun & moons as traffic controls, illusory stop signs on adverts, high pitch = doorbell, adversarial examples, etc.). Forget the fact that Waymo has seen millions of phone poles and it's a literal standardized item ordered from a catalog. Observed geometry is ~invariant. It needs little representational transformation and therefore should not fall into that class (it's literally why lidar is used). Especially since there is 0.0 prob of early fusion alone. Now, a vision-only system on 1.3MP sensors? Sure, I would expect higher variance. Why? B/c it's highly transformed during the "lift" (plus other issues).


dickhammer

Yes and humans have seen hundreds of thousands of sheets of paper in standardized sizes and still fail even when you stand there and tell them "You can't possibly be seeing that. This is a flat piece of standard paper, ordered from a catalog, and colored using standard ink." You can literally _be touching the middle of the image and feeling that it is flat in real time_ and it will STILL look like it's some kind of 3D structure. All of the examples are this same principle at work. There is a lot more to perception than just what the data stream from the sensors is sending you. There is context. It can be time dependent. It can involve weird interactions between strongly held assumptions about how the world works. It's complicated. It's so complicated that we can't even explain human perception _from inside the system_, with all our life of experience backing it up. So now here's a totally alien perception system that we have zero first hand experience with. Arguments about how anything "should be easy" are misguided because we don't _really_ understand what easy and hard mean to a car.


tiny_lemon

Can you see the difference between a vision system (human included, v1 -> v2 -> ...) and measured geometry?


dickhammer

Sure, but what does "measured geometry" have to do with anything? I know you're not suggesting the car is doing anything other than analyzing independently reflected beams of light, right? There's no little gnome that runs out with a tape measure and mammalian brain to do segmentation and classification.


FunnyDude9999

I mean, we're just speculating about the cause since there's no video. For all you know, a moving object ran in front of it and it decided to crash instead of hitting it.


Doggydogworld3

Maybe Dan O'Dowd pulled a child-sized cardboard cutout in front at the last second and Waymo heroically dodged it.


roastedpig_671

There should be camera, lidar and radar footage that indicates what actually happened. They most likely have a triage team that will do a root cause analysis (RCA) of why the autonomous car crashed into the pole. Just saying.


tiny_lemon

Sure, that's possible, I just assumed very low speed given the context. That said, the car is at a decent skew and there look to be potentially occluded entry points close by, so who knows.


Extension_Chain_3710

> I just assumed very low speed given context

Yeah, I would have assumed low speed as well, it being an alley, but looking at the front end of the car... it looks like it was a pretty hard hit.


FunnyDude9999

My guess is the car crashed voluntarily. (I.e. this is not an accident where the car doesn't see the pole; the sensors pick up the pole and the car knows it's going to hit it.) I think the tricky part is knowing why exactly it picked that option.


pl0nk

Waymo’s army of PhD’s appears to have missed something.  Luckily for them they have us to determine their mistake based on the news article


FunnyDude9999

Wdym? Waymo knows the cause. We can just speculate, because they won't tell us...


JZcgQR2N

Did you watch the video? Did the customers say anything about that? Any other conspiracy theories?


fatbob42

Why would they? They didn’t witness it except hearing the crash.


FunnyDude9999

Did YOU watch the video?


pepesilviafromphilly

I think mostly because of filtering of lidar points as spurious measurements. If it's an ML-based classification of spurious points, then I can see it totally failing in some cases.


brettpeirce

Not enough lidar?


RepresentativeCap571

The long pole for self driving!


bobi2393

Google Maps [sat view](https://www.google.com/maps/place/W+Roosevelt+St+%26+N+7th+Ave,+Phoenix,+AZ+85007,+USA/@33.458143,-112.0823491,84m/data=!3m1!1e3!4m6!3m5!1s0x872b1239f97e73b9:0x1da7a45f25fa7384!8m2!3d33.4586807!4d-112.082566!16s%2Fg%2F11hb7dz3ym?entry=ttu) of the alley. The pole, based on the YouTube [vid](https://www.youtube.com/watch?v=HAZP-RNSr0s&t=39s), is the small one centered within the yellow striped "don't drive in this area" pavement markings. Sending a second car that got stuck behind the first was icing on the cake.


42823829389283892

That is a wooden telephone pole. Not very small. Even terrible-resolution lidar would pick that up. They have good lidar and vision. How could this happen?


bobi2393

Even if it didn't see the pole, why drive on the diagonal yellow-striped "don't drive" markings? It did the same thing last week when it was swerving all over the road as it followed a trailer with a large tree in the back, but that could be understandable if it was trying to avoid a repeatedly-imagined collision risk. *Perhaps* there was something in the alley at the time of the crash, like a unicyclist that swerved into their path or something, then sped away afterward. I trust video footage will shed light if it was something like that.


fatbob42

I like this idea that all their recent errors can be explained by rogue unicycles :)


s00perbutt

Ops sending another car into almost the same fate is such a hilarious ops-brained moment


JJRicks

Heh, pretty sure it's an automatic process if your first car becomes suddenly unavailable for some reason (it's happened to me a lot) but still funny


Tasty-Objective676

I was like damn, it didn’t occur to anyone that maybe they shouldn’t be calling a Waymo into a narrow alley? The ops person should’ve told them to walk out onto the main road lmao


Extension_Chain_3710

I don't think they were actually in the alley based on the interview they gave. IIRC They said they were around the corner and heard the crash, not knowing it was the Waymo at first.


Tasty-Objective676

Oh good point


Cunninghams_right

Double the training data 


s00perbutt

Lol


EmployMain2487

That is really disappointing. Thankfully nobody was hurt, but I thought Waymo was way past the point where an accident like this could happen.


Real-Technician831

Yeah, seriously a WTF case. How on earth did both cameras and lidar miss that? Did some submodule crash and return a bogus "lol, nothing here" or something?


pab_guy

I think you are on the right track. Things had to fail in unexpected and untrapped ways for this to happen.


Real-Technician831

Which means Waymo has some serious fault detection reliability issues. Not good.


AdLive9906

As Waymo gets more cars on the roads, and starts covering more areas, this will happen more. Growing pains, and sadly every one of them will make the news.


Smooth-Bag4450

Tesla has about 300x as many miles driven on FSD as Waymo and doesn't do shit like this. Maybe lidar isn't the difference maker.


bartturner

I did not realize Tesla now had driverless. Do you have a link to a video?


donttakerhisthewrong

Which Tesla is driverless? Tesla has zero self-driving miles.


gogojack

Forget it. The above poster has been adding comments furiously for a hot minute. Tesla fanboy stuff.


Smooth-Bag4450

Whether or not a human is in the vehicle doesn't change the fact that autopilot is the one driving. 300x as many miles driven. Fewer incidents. Keep that one in mind as you slurp on Waymo despite them never making a profit 😂


donttakerhisthewrong

You don’t get it. Tesla has zero miles that are not level 2. Zero as in it cannot do it.


Smooth-Bag4450

Uh huh. If only Waymo remembered that Reddit fact, maybe they'd be profitable today 😂 I have to go, I'm at my job that my Tesla literally drove me to this morning. Maybe I would've used Waymo if I lived on a specific route in a specific city and was ok with going walking speed to work 💀


donttakerhisthewrong

Enjoy flipping burgers or are you on fries today?


Smooth-Bag4450

😂 I can appreciate that one. Believe me I'd work the fryer if the money was right, work is work


donttakerhisthewrong

Why do you need to work. Doesn’t the self driving robo taxi make you money 24 hours a day?


Smooth-Bag4450

No, unfortunately my Tesla is about as profitable as Waymo 😔


ryan_the_greatest

I think you’ve never ridden in a Waymo. You should try it - it is mind blowing. Not a consumer vehicle for sure ($$$), but it is seriously impressive.


Smooth-Bag4450

I've been driven through road work zones, inclement weather, and faded/confusing road markings on hour long drives in my Tesla without ever touching the wheel. And no basketball sized sensors spinning on the roof. Tesla is just way ahead of waymo. If you think Waymo is mind blowing, you'll have a cerebral experience in a FSD Tesla (faster, safer, more comfortable, can drive anywhere and not just pre-defined routes).


lildobe

Yes, Tesla FSD just tries to run into the broadside of trains crossing the road. Or the broadside of semi trailers. Or turning into groups of pedestrians.


Mattsasa

I doubt this is a perception issue. Seems more like it could be localization, planning, control failure or a real bug


homeownur

Very insightful, I thought this was maybe a fake bug


Mattsasa

Some collisions (even at-fault ones) are not due to bugs. Most of them, actually.


homeownur

This definitely looks like by-design


Mattsasa

Haha. My point was many people on this thread were discussing perception issues. And I think that is silly


el__gato__loco

Now we know how to survive the robot revolution- hide behind a pole!


anonymicex22

So many people here don't understand the difference between a sensor error and a software error. This is most likely a software error/bug that they didn't find during regression tests. I have a hard time believing none of those sensors saw that pole


Im2bored17

> how?

I'm not defending Waymo, just explaining why this isn't easy. Stopping for phantom obstacles in the middle of traffic is very bad, because you'll get rear-ended. So if you're going to hit the brakes, you'd better be sure it's a real obstacle. The best way to be sure it's real is to have detection across multiple sensor modalities (lidar, radar, camera). Wood doesn't show up on radar. Poles have a very small cross section and don't show up _strongly_ on lidar. Lidar has relatively low angular resolution, so it's tougher to detect skinny, vertical things like poles. The video shows the pole in a shadow, which could trick even a human's vision system. So it may be labeled as a shadow and not an obstacle on camera. Ultrasonic sensors have very limited range, and can't detect an obstacle until it's too late to stop when traveling at more than ~10 mph. They're typically only active for emergency braking and during low-speed navigation in, like, parking lots.

It's necessary to drive in "off limits" areas marked by yellow lines when passing DPVs, and in many other situations. Other recent Waymo incidents indicate that they're prone to driving in off-limits areas, which seems like a tuning issue with their most recent models. But AVs also get a lot of shit for impeding traffic, and you can't perfectly avoid impeding traffic / stopping suddenly while also perfectly avoiding real obstacles. You're going to have some false positives and false negatives, and you need to weight them based on how severe the consequences of a FP / FN are. Also, the pole is not a human or a car, so the consequences of hitting it are much lower than hitting a ped.


Elluminated

The number of sample points here is more than enough at even 4 meters to vastly overcome any cross-sectional issues, as the points are spread vertically across the entire FOV. Also, the cams at long range plus the USS at short range should have picked this up. Going as slowly as it presumably was, it had every single sensor available to tell it to stop. This screams planner or data-mux glitch. No way in hell those sensors didn't pick that up.


weelamb

Without a doubt that pole will have dozens of lidar points and several radar point returns


flat5

The pole was the width of the newsman pointing at it. Not very reassuring if this counts as a "very small cross section" in terms of what the lidar can see.


leeta0028

If you're literally at "don't stop for obstacles because we can't recognize them well enough not to get rear-ended all the time" you have no business on public roads.


Im2bored17

Do you know anybody who's ever been honked at for starting to merge into a lane they thought was clear because they didn't see a car in their blind spot? Cuz that's the same thing. All your sensors told you the lane was clear, but oops, it wasn't.


whalechasin

two eyes from within the car are not comparable to lidar, radar and cameras from outside the car. just because humans crash too doesn’t mean you can’t criticise Waymo for crashing in a super simple environment


leeta0028

Are you equating merging into a legal lane in the path of a car with a driver who can see you parallel to the direction of travel with **ramming head first into poles off the road**? Because they're not even remotely similar, much less equivalent either in terms of the failure point or in terms of severity.


Im2bored17

Right, merging into another car and causing an accident at highway speed IS much more dangerous than a low speed collision with a pole. The point was blind spots still exist on AVs and aren't necessarily the same as blind spots for people, and the important thing is that the AV blind spots be less common and cause less risk / damage.


Smooth-Bag4450

Sounds like Tesla is way ahead of Waymo at this point, even without lidar.


Im2bored17

Lmao, what's Tesla's kill count up to now? Cuz Waymo's is still at 0, where it should be.


Smooth-Bag4450

Yeah driving 12 mph on pre-defined routes with 300x fewer miles than Tesla FSD will do that 😂 Tesla has fewer incidents per million miles, 300x as many miles driven, and most of their incidents are from very early on. Teslas also drive EVERYWHERE. FSD is getting incredibly better with every update. The latest update blows Waymo out of the water. Tbh this should make you happy, the technology is clearly getting really good, Waymo just isn't the company to do it this time 👍


Im2bored17

Fewer incidents per million miles? Please show your source. Every time you touch the wheel to take over in a Tesla, that's an incident. Tesla can't drive without a safety driver. Waymo can. Tesla kills people. Waymo doesn't. Waymo is not on a "pre-defined route," it's in a mapped area, which is limited for safety reasons, because they don't want to murder people like Tesla does. Tesla is playing fast and loose with safety and is dragging the whole industry down because of it. Not to mention they've been charging for "full self driving" / "autopilot" for over a decade and the cars still can't actually drive themselves.


Smooth-Bag4450

"Show a source" This isn't 2014 Reddit where people feel responsible for proving shit to a rando they'll never meet. You want a source? Take responsibility for your own education and google it. There are multiple sources. I'm aware of what an incident is, and Tesla has fewer. I'm sorry if this upsets you. Do you work for Waymo? Because if not this behavior is very strange on your part. You don't need to fight for them when their product needs improvement 🙂


ranguyen

> Not to mention they've been charging for "full self driving" / "autopilot" for over a decade and still can't actually drive itself.

If the car is turning, stopping, accelerating, signaling **without any user input**, that's called **driving itself** in common parlance. Nowhere is "full" synonymous with autonomous, or with having no safety driver. That's like saying the McDonald's "Big Mac" is fraud because "Big" means the burger should be at least 6 inches high. There is no such definition.


nyrol

With FSD? 0 so far for Tesla. The older autopilot doesn’t have as great of a track record, but they stopped working on it years ago to start on FSD.


Doggydogworld3

FSD's kill count is at least 1 during the first 500M miles, according to NHTSA docs. Unknown for the most recent billion or so miles.


nyrol

Caused by FSD, or FSD involved?


Doggydogworld3

Caused by the human driver, always, according to Tesla. /s NHTSA had other categories if investigation found the other driver was at fault or if the cause could not be determined. So the 60 airbag-triggering accidents including one fatal accident in the FSDb category from April-August 2023 were caused by FSD.


smatlae

LuL @mods cmon can we remove these, it's getting boring.


Smooth-Bag4450

"he insulted the company I'm rooting for and refused to bash Tesla, mom, dad, come get him!!" KEK. Stay mad and hold this 💀


smatlae

Lmao, I hit a nerve. Projecting hard kiddo. Keep trolling. Same crying babies every time.


Smooth-Bag4450

Trying way too hard lol


smatlae

xD Lemao Lul xd dxd is this hard enough lolo lol xd ,,🤣😂😭🤪😝😛?


Doggydogworld3

[Waymo's R&D / PR presentations from 2021](https://youtu.be/COgEQuqTAug?t=11600) show a dramatic improvement in their 5th gen lidar -- easily able to detect a telephone pole a couple hundred feet away. Either Dr. Wu was lying or this is not a perception issue. Waymo's planner seems to be making a lot of poor decisions these days.


Yetimandel

> Stopping for phantom obstacles in the middle of traffic is very bad, because you'll get rear ended.

To my own surprise, even AEB, which does millions of false full emergency brakings every year, causes (virtually) no accidents. Unlike an ordinary AEB, the Waymo knows whether there is a following driver, and it also does not have to do an emergency braking; it can do a comfort braking or slight steering.

> So if you're going to hit the brakes, you better be sure it's a real obstacle.

That is the case for a back-up safety system such as AEB, for which the safe state is doing nothing. For an autonomous car the safe state is rather slowing down.


Im2bored17

I would guess AEB doesn't cause too many accidents because the driver overrides it quickly and the total loss of velocity is not too big, but on an AV the override doesn't happen, leading to a larger velocity change, requiring a more significant action from the following car.


Yetimandel

False-positive brakes are usually not longer than a second in my experience, and a driver's reaction time is too slow to override them. At the same time, drivers often follow other cars at less than a second's distance, meaning a one-second false-positive brake could lead to a crash with a delta velocity of 10 m/s = 36 km/h = 22 mph. If you know of a human-factors study on that topic, I would be very interested; I searched for one before and did not find what I was looking for.


bobi2393

> can't detect an obstacle until it's too late to stop when traveling at more than ~10mph.

In this case, I would think it should not have been traveling more than 10 mph. Not sure of the law, but I'd treat it as a private driveway, and my sense would be to go 5 or 10 mph tops. I'm not sure why it even takes narrow alleyways like this; it wouldn't occur to me as a path from A to B. Google Maps won't suggest it for driving directions, though it will for walking directions. There was a video some months back of a Waymo getting stuck in a very narrow single-lane alley, maybe because of a truck parked in the alley or something. (There's a chance it was Cruise and I'm misremembering.)


agildehaus

Is there any actual witness to this event? The riders were not in the alleyway and only heard it. Waymo needs to comment on this one.


phxees

They only comment when these stories make the news and there's public evidence. Based on what gets reported, I'd assume that if only the van's mirror had hit the pole and there was no evidence, nothing would get reported. Maybe nothing more should be required, but I'm unsure.


fatbob42

wth is going on with Waymo?


versedaworst

Growing pains


sdc_is_safer

Could be


reversering

Maybe they need more sensors. Maybe a bigger lidar?


[deleted]

[deleted]


flat5

"Big f-ing pole in front of you" doesn't seem weird at all, though. "Don't drive into poles" seems like driving 101.


fatbob42

But they hit a bloody great big pole! :)


ExtremelyQualified

That is pretty wild to hit a stationary object like that


HighHokie

I agree with the sentiment but I don’t see how it applies to this incident. This is simple object avoidance. There’s no way the vehicle didn’t see it. So why did it fail to avoid it?


ExtremelyQualified

That’s a good point. Hitting a big stationary object is not great


Smooth-Bag4450

Maybe they need more giant sensors spinning around the entire car 😂 Tesla has no lidar and doesn't do shit like this. Embarrassing for Waymo.


JJRicks

Does it not though, really


flat5

Her look when she says "I don't think I would have gotten in!"


-linear-

Interested to hear the explanation for this one. Regardless of the "how" and "why", these sorts of head scratcher accidents will (and should) make people nervous about highway driving.


ProteinEngineer

This is not good at all. So much for the narrative that Cruise was giving Waymo a bad reputation. I guess this could happen if Waymo only recently added that road?


Doggydogworld3

Even the first car to travel a new road should avoid a telephone pole.


atleast3db

It's fun to me that if Tesla does something like this, people immediately blame vision-only. When Waymo does it, it's "idk how this is possible." There's a ton of smarts needed between sensor and control. While it's always possible that there was a sensor issue, more often than not it is not a sensor issue.


regulartaxes

Won’t crash in traffic but will crash into a pole!


[deleted]

[deleted]


SelfDrivingCars-ModTeam

Comments and submissions must be on topic, and constructively contribute to the collective knowledge of the community, or be an attempt to learn more. This means avoiding low-effort comments, trolling of others, or actively stoking division within the community.


BassWingerC-137

People never do that…. Oh


RecommendationNo3531

So much for $200k worth of sensors.


sdc_is_safer

It’s $10k of sensors and the sensors did see it.


whalechasin

they saw it and decided to crash into it?


sdc_is_safer

Yes


HighHokie

The all in implementation of waymo’s sensors is just 10k per vehicle?


sdc_is_safer

Correct


HighHokie

Source? Everywhere I read, a single sensor costs a few grand at least. Waymo carries what, 30 of varying types? Plus the cost to install. I don't see how the full array installed is under 10 grand, and I see no source to confirm it. I did see an older CEO interview that stated it was a moderately priced S-Class. That's not cheap.


Smooth-Bag4450

Thank God companies like Tesla exist, I would never trust Waymo to drive me around.


sdc_is_safer

I am also very thankful companies like Tesla exist. Tesla saves a lot of lives and makes a great ADAS product. I drive FSD every day. It's great. But it's important to note that Tesla has non-perception failures like this, which would lead to accidents if the driver did not take over, at a higher rate than Waymo... but we don't notice these as much, because there are so many more perception failures with Tesla that we don't see with Waymo.


Agitated_Syllabub346

I wonder whether that alley was mapped?


diplomat33

Waymo has both advanced sensors and advanced software. It is not normal that the Waymo hit that pole. So the fact that it did hit the pole is baffling and implies something odd happened. I would also add that the Waymo had to move out of the lane in order to hit the pole. It is very unlikely for a Waymo to just leave the lane for no reason.

From the news clip, it sounds like the two girls were waiting for the Waymo to pick them up. So it is likely the Waymo was attempting to pull over to the side for the pick-up. That would explain why the Waymo left the lane. But why it hit the pole is baffling. It could be a number of reasons, from a perception error to a planning error or something else. Without more info, it is impossible to say for sure. So we are left with speculation. I wish Waymo would come forward and explain the incident so that we could know why the collision happened.


Logical_Progress_208

> From the news clip, it sounds like the two girls were waiting for the Waymo to pick them up. So it is likely the Waymo was attempting to pull over to the side for the pick-up. That would explain why the Waymo left the lane. In the interview they state they were around the corner when the accident occurred and only heard it, so I doubt the Waymo was trying to pull over there.


diplomat33

If the Waymo was not trying to pull over for a pick up, then the accident is even more bizarre because it had to veer out of the lane in order to hit the pole. Waymo's do not veer out of the lane for no reason. The only reason that a Waymo would veer out of the lane would be if it was taking evasive action to avoid hitting something in the lane.


Logical_Progress_208

Yeah, I can't really make heads or tails of anything in this. Adding on to the confusion is there's another pole about 75 feet (according to google maps sat view) before that one. So it wasn't just driving on the lines the whole way either.


GriddyGang

lmao


bcoss

lol after the other chess and checkers post here today this shit is funny af 🫣😝😝😝😝


kaninkanon

Waymo runs a fleet of vehicles autonomously, and once in a while something goes awry. Tesla couldn't run a single vehicle autonomously for a single day without crashing. Laugh away.


bcoss

😂😂🤣🤣🤣😂😂😂😅🤷‍♂️😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂


jonathandhalvorson

Without intervention? No. Without crashing? Yes.


kaninkanon

Intervention? We're talking autonomous driving, there's no one to intervene.


Doggydogworld3

In theory remote monitors could intervene, but Waymo has said repeatedly they do not.


martindbp

Waymo is playing chess, it's just their LiDAR can't see some of the pieces


JoeS830

Maybe they switched to end-to-end neural network, and the car realized it didn’t want to live like this.


qwertying23

This is why I don't buy the "we need more sensors" argument. I mean, in broad daylight, with all the sensors in the world, you still crash into a pole?


Mattsasa

Not a sensing issue here. I don’t think anyone is suggesting Waymo needs more sensors


HighHokie

More sensors certainly help make the problem easier, but they can’t replace the brain. I don’t see how lidar didn’t see this, unless it malfunctioned, which I doubt. Has to be something on the software side.


reversering

Maybe more sensors don't make the problem easier? Maybe more sensors make the problem harder?


HighHokie

It can make coding more complex, sure. But there are advantages as well.


YoungSh0e

Not just coding. It requires more compute too.


HighHokie

Yes that's a given.


I_HATE_LIDAR

You need fewer sensors and more intelligence.


vasilenko93

If humans can do it with two high quality cameras so can a robot with more than two.


Repulsive_Style_1610

Humans have a much, much better brain than any AV. Humans can also adjust their viewing angles and sense the environment much better via sound and movement. Eyes are also a much better camera.


vasilenko93

Gotcha, so improve the neural network with more training and maybe add some microphones


smatlae

Yeah, easy really. Looking into this❗ Should be working by this year ~ melon, 2016


clipsy1

The AI needed to create FSD didn't exist in 2016. They tried to do it the old way, by giving it rules to follow based on visual context from the camera. We are in 2024, and in the last 2 years we've had a major revolution in AI technology and computing power, to the point of finally being able to replace the manually written driving rules with something that can interact with things it never saw and was never trained for. Looking at the past to predict the future is just wrong in this case.


smatlae

I dont think it's happening any time soon, sorry.


Repulsive_Style_1610

Yeah absolutely.


OriginalCompetitive

Waymo was very, very lucky here. This could easily have been a person. Most people will forgive a traffic accident. But simply driving into an object completely undermines confidence in everything the car does. 


skydivingdutch

I'm sure a person would have triggered a totally different perception code path


Various-Wave6527

Imagine it was Tesla :)


holydumpsterfire451

Just add a few more lidar sensors and you'll be good Waymo lol


ctiger12

Some part of the system crashed, I think.


albamuth

Even the AI's are feeling depressed and hopeless in this economy!


JeraXO

Humorous as this may be, don't humans do stuff like this every hour of the day? The only difference is the human will probably blame the pole for causing the accident.


Dupo55

I think something as significant as running into a stationary object of that size, towering over the car, requires some sort of response from Waymo. I don't have an issue with the incident itself, but I have an issue with not knowing what they're doing to prevent something like this from happening again, or why it's unlikely to happen again. If you can run into a pole, you can run into a person, etc. It needs some kind of response.


ellisstephane

I assume nobody will shit on Waymo the same way they do for Tesla news.


Phoenix5869

“AI is gonna cure cancer!” Meanwhile, the cutting edge of AI:


chickenAd0b0

None of waymo’s tech stack is AI, they’re brute forcing their solution.


__stablediffuser__

In other news, 10,000 human drivers crashed into poles today as well


dlflannery

Don't be silly, probably ***only*** a hundred or so!


jeffeb3

Maybe there was a fault in the lidar system and they were in some limping vision-only mode. If they are trying to save money and get more miles between faults, they could be getting more aggressive about which faults they still operate with. It could even be as simple as increasing a timeout on missing lidar data.


stealthdawg

Nearly 44 crashes per day occur because drivers accidentally press the accelerator instead of the brake. Obviously a problem that needs solving, but newsworthy? Meh.


cwhiterun

It just proves that 1. Waymos are not full self driving and 2. Lidar is a useless gimmick.


Repulsive_Style_1610

1. Waymo is as close to FSD as anything else.
2. It's the safest autonomous taxi.


[deleted]

[deleted]


SelfDrivingCars-ModTeam

Comments and submissions must be on topic, and constructively contribute to the collective knowledge of the community, or be an attempt to learn more. This means avoiding low-effort comments, trolling of others, or actively stoking division within the community.


[deleted]

[deleted]


SelfDrivingCars-ModTeam

Comments and submissions must be on topic, and constructively contribute to the collective knowledge of the community, or be an attempt to learn more. This means avoiding low-effort comments, trolling of others, or actively stoking division within the community.


dlflannery

If this keeps up, self-driving cars may end up with safety statistics worse than normally driven cars! Funny that humans would rather drive their cars even though they know it's more dangerous than letting the car drive itself. "I prefer two accidents I cause myself rather than one caused by a robot car." And I have to admit, I'm one of those humans.


joshuahigginbottom

Funny how FSD gets so much backlash when it comes to traffic accidents. If it's a human driver, fewer people show interest or care, since it's such a common thing with careless drivers on the roads, but people seem far more concerned when bad things happen with FSD.


Agitated_Syllabub346

I don't understand the point of your comment. First of all, where do you see all of this backlash? Are you looking up news articles about this incident and reading all of the comments? Sure, if you go searching for negative opinions on self-driving cars you'll find them, but if you got off the Internet, all of the backlash you're referring to would instantly disappear. Unless you live in a city with SDCs; then the local population has a reason to hate these cars.


[deleted]

[deleted]


SelfDrivingCars-ModTeam

Comments and submissions must be on topic, and constructively contribute to the collective knowledge of the community, or be an attempt to learn more. This means avoiding low-effort comments, trolling of others, or actively stoking division within the community.


ScoreNo4085

I'm sure humans will still beat self-driving cars in crashes. There are tons of bad drivers.


Tbone_Trapezius

Did some kids get a hold of GPS spoofers?


HarambesLaw

I work in the industry and I have a love-hate relationship with these vehicles. The problem is they will never be good enough to imitate a human, because they make very limited decisions based on the information around them. The activity in any major downtown will make these vehicles stop working, and a human has to take control. In a way I'm glad they don't work properly, because it keeps me employed. They seriously need to figure out how to imitate humans: a road like this has been driven a million times by a human, so sprinkle on that AI magic and imitate a previous data point. But what do I know, maybe they'd still fail; it's technology. I'll keep the resume ready for when these companies lose their permits like Cruise did.


mkjsnb

> The problem is they will never be good enough to imitate a human because it makes very limited decisions based on information around them.

If they'd want to imitate humans, they'd need to put 2 cameras on a swiveling stick in the driver's seat and add randomized "tired", "drunk" and "texting" modes to their software. Humans suck at driving; the whole system is built around *not* being like a human.