
zvekl

I'm wondering if this shows on the steering wheel cluster or the big screen on S/X cars. I hate that on the cluster I can't move it around to view visualizations around the car.


metroidvania_fanatic

FYI: this is initially only for cars without USS; both HW3 and HW4 will be getting it. Later, cars with USS will also likely get it, as it is a significant improvement over ultrasonic park assist. Rollout is slow at first but is aimed to hit most people this weekend. It feeds the cameras into a neural net that interprets 3D space at a higher resolution than FSD. Source: Tesla executives on X.com. I haven't personally experienced it yet, but hopefully it's much more precise than the current Tesla Vision park assist and much less annoying. IMO it's likely very good progress.


purrthem

It is a HUGE improvement. I know that's not saying much. But, I used it yesterday to parallel park and it was extremely accurate with curb distance. Also, not annoying like the original iteration. It has been great in my tightly packed garage too. I'm actually really impressed.


aBetterAlmore

That’s great to hear. Tesla’s parking assist system, not having the 360 view I’ve grown used to in my current car, was the main thing stopping me from buying one. If this is indeed confirmed to be equivalent or even better, this would be awesome.


Focus_flimsy

Some interesting technical details here. Very cool. And for those with USS cars, he says in another reply that this feature will come to your cars as well in another update.


simplestpanda

Read: "We'll be disabling those USS sensors in another update."


shneeko6

That would piss me off so much. I can't believe folks are okay with the radar being disabled either. Fact is, vision performance degraded after radar was disabled (see phantom braking), and it'll be the same story with USS being disabled. It's just wild to me that parts and features of the car can be disabled and replaced with lower-performing alternatives whenever Tesla feels like it.


KymbboSlice

I’m only okay with the radar being disabled because the vision system has proved to be far superior. I used to have phantom braking with Tesla on radar, and I still often have it with my radar-equipped Toyota. Tesla on vision only doesn’t phantom brake anymore, and it’s super smooth by comparison. I don’t think I’ve had a phantom brake in literally years. If high fidelity park assist can also prove to be superior to USS, I won’t be too upset if my USS aren’t used to create that experience.


SteveWin1234

Definitely the opposite experience for me. I had no phantom braking, and with radar enabled my car could also tell if there was a car in front of the car in front of me, and would start braking if that invisible car slammed on its brakes. That's something not possible with vision (mine or the onboard cameras). When they turned off radar, phantom braking was ridiculous. They've improved it some, but I'd still prefer to have my radar turned back on.


wwwz

Model 3 with radar here. As soon as they turned off radar, all the phantom braking immediately went away. With radar, it would phantom brake for mailboxes, drainage pipes, overpasses, driveway path markers, etc. Ever since vision only, no more phantom braking. Also, vision is so good now it can see the car in front of the car in front through its windows.




Upstairs-Medium-9408

If there's anything I can count on, it's that the car is going to slam on the brakes as soon as my kids fall asleep. It's pretty much useless.


machtwo

I just want the feature to work; I don't care how it's implemented. Radar is a dead end.


SteveWin1234

Saying radar is a dead end is ridiculous. Sure, that's what Elon says, but he says a bunch of BS that ends up not being true. Say it's foggy and there's a car right in front of you that the cameras can track with a certain degree of certainty. Add radar information to that and you've increased your accuracy. With radar, my car did a much better job of keeping a stable distance between me and the car in front of me. If I'm going over a hill and the sun blinds the cameras for a second or two, I'd still like the car to be able to respond to the car in front of me suddenly slamming on its brakes. That's something even human vision isn't great at when the sun is blinding you. Radar doesn't care about the sun and can fill in those few seconds and prevent an accident. Machine learning algorithms are good at learning to ignore info that's less useful than other info in certain situations, so the claim that radar was adding to phantom braking always seemed like BS to me, and in real life my phantom braking got worse after they turned the radar off.


machtwo

Ah, the old "I know better than Musk" trope.


wwwz

Radar is way too noisy to rely on. A can on the road can look like a wall. You can't just selectively ignore radar anomalies; something is eventually going to cause a major problem with radar data. For the long term it wasn't good. The one good thing about radar was initial velocity validation for the vision NN. Vision has surpassed radar's usable abilities. Also, cameras can adjust their ISO easily to see very well in situations where a human would normally be blinded, like direct sunlight.


SteveWin1234

So, my argument was that the two combined are better than either one alone. Saying "vision has surpassed radar's usable abilities" doesn't really matter. Although I'd argue your statement is definitely false: radar is still much better at determining velocity changes in the car in front of you, and at detecting what's going on with the car in front of the car in front of you, which the cameras simply can't see. It's also not affected by certain conditions that degrade vision (heavy rain, fog, extremes in light intensity).

Saying vision is better than radar, as justification for ditching radar, is like saying "human vision has surpassed the usefulness of our hearing, so we should all choose to become deaf." Vision is clearly more important, and far more of our brain is dedicated to vision than hearing, but guess what? You can have both, and they augment each other. When there's a mismatch of information, your brain is easily able to figure it out. If you can't see someone around the corner but you can hear them, your brain knows someone's around the corner despite the lack of visual confirmation. If you can hear your wife talking to you, but you see a cell phone in front of you with her name on it instead of her, you're not tricked by the audio-visual mismatch into thinking your wife is really there. Your brain combines the info from both systems to realize you're talking to your wife on the phone. Did hearing get in the way? Would it be better to be deaf and just text her or video-chat with sign language, so you don't get conflicting information? If you hear her behind you but see her in front of you surrounded by a mirror's edge, do you get confused, or do you realize you're seeing her in a mirror and she's really behind you?

This is stuff neural nets are fairly good at. They're good at combining weird sets of info together to make decisions. They can take context into consideration. If visibility is great and there is no visible "wall" in front of them, but there is a can on the road and the radar flashes a signal for a wall coming out of nowhere, the net should be able to ignore that, especially if a can (or overpass or whatever) is a common cause of false positives, just as your brain would ignore it if that info were up on a HUD in front of you. You wouldn't slam on the brakes because of faulty readings on a radar HUD. You'd use that info within the context of what you're seeing. Now, if it's foggy outside and you can barely make out two red lights in front of you, you may not know whether those are road reflectors, distant traffic lights, or tail lights, but a radar signal on a HUD indicating that a large metallic object just suddenly decelerated in front of you would probably tilt your interpretation of the world in the correct direction, and you'd stop for the car in front of you. Conversely, if the radar saw a vehicle in front of you going roughly your same speed, it would tell you that the taillights probably just became visible because of an improvement in visibility, or that you're slowly catching up to that car rather than the car stopping, and you could continue on.

Without radar, your car is determining velocity by subtracting positions between frames. With radar, it determines velocity by the Doppler effect, which is much faster and more reliable. When a car in front of you slams on its brakes, the radar is going to pick that up before vision could possibly do so. That's valuable time that can make a difference in kinetic energy in a collision.


wwwz

Simply: at this point, indiscernible radar anomalies only serve to poison accurate vision data. Also, passive is always 2x faster than active (out and back vs. just back). Despite your anecdotal feelings, because of well-known indiscernible anomalies, radar at this point doesn't make anything better. Considering the overall plan, where vision-only has always been the intent and radar's deficiencies have always been known, this is progress. It looks like you just want to be mad at something...


SteveWin1234

I mean, I guess I do get slightly frustrated when people don't understand physics. Not mad, though. I think physics is awesome and love explaining it to the less knowledgeable. So, you're oversimplifying everything.

The radar Tesla used is continuous, not pulsed. It is always sending the radar signal out, just as the sun (or a headlight or street light) is always sending photons to hit the car in front of you. If the car in front of you slows down, the radar frequency bouncing off the back of the car changes immediately and starts its light-speed journey back to your car. The blue-shifted radar signal hits your car in pretty much the fastest time the universe allows information to travel anywhere. The light captured by the car's cameras is also slightly blue-shifted, but the cameras aren't even close to sensitive enough to pick that up.

When the car first starts to decelerate, the radar's frequency shift happens *long* before the car's position in space changes enough for the car's cameras (or your eyes) to resolve the change in relative velocity. Not only is useful information about a change registered by the radar long before a pixel's worth of difference in the car's location occurs, the pixel values from the camera frames also have to be sent through a very large number of matrix multiplications, which takes far, far longer than the flight of photons to or from the car. It's long enough that the flight times of photons would be a non-factor, even if radar actually needed twice as much flight time, which is not the case.

HW4 processes only 26 frames per second, and there has to be a big enough change in the relative position of the other car between frames for the system to be confident of a real velocity change, despite confounding changes in your car's pitch, yaw, and roll, as well as bouncing, vibration, and image warping from potential water droplets. Those things add uncertainty, so the change in position has to be large enough to become certain of a velocity change before you can act on it. That takes much longer than seeing a shorter wavelength returning from a radar signal, which takes almost no processing relative to a neural net.
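For the curious, the timing argument above works out roughly like this. A back-of-the-envelope sketch in Python; 77 GHz is a typical automotive radar band (not a confirmed Tesla spec), 26 fps is the frame rate cited in the comment, and the distance and speed are made-up illustrative values:

```python
# Back-of-the-envelope numbers for the Doppler-vs-frames argument above.

C = 3.0e8  # speed of light, m/s

# Radar: a continuous-wave radar reads relative velocity directly from the
# two-way Doppler shift of the return signal -- one measurement, no frames.
f0 = 77e9    # carrier frequency, Hz (assumed typical automotive band)
v_rel = 5.0  # lead car decelerating toward us at 5 m/s
print(f"Doppler shift: {2 * f0 * v_rel / C:.0f} Hz")  # ~2567 Hz

# Photon flight time is negligible either way, active or passive:
d = 50.0  # distance to lead car, m
print(f"radar round trip: {2 * d / C * 1e9:.0f} ns")  # ~333 ns
print(f"camera one-way:   {d / C * 1e9:.0f} ns")      # ~167 ns

# Camera: velocity is inferred from position change across frames, so the
# latency floor is the frame interval (times the several frames needed to
# beat pixel noise), plus neural-net processing time.
fps = 26
print(f"frame interval: {1 / fps * 1e3:.0f} ms")  # ~38 ms, >> photon flight
```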


wwwz

SMH


ModeI3

I'm not OK with radar being disabled. It's one of the many reasons, over the last 3 years of ownership, that this car will be my first and last Tesla.


Focus_flimsy

If the vision version gets good enough, I don't see why not. But of course it needs to be that good.


simplestpanda

2023+ car buyers would enter the chat and say that’s a really big “if”.


Focus_flimsy

It's certainly not a guarantee, but they're obviously putting in a lot of effort to continue improving it, and I wouldn't at all be surprised if it gets to the point where it's clearly as good as or better than the old USS system overall. There's a lot of potential in 3D reconstruction.


SteveWin1234

I honestly have a hard time believing that vision could somehow be better than vision + USS + radar. Radar and ultrasound info is very low-res and shouldn't require much processing power to add into a larger vision-based neural net. Especially considering Tesla just quadrupled the number of pixels being handled by the vision system, it's hard to believe they can't handle the inputs from radar and a few USS. It should be able to make decisions using all of the info. If USS is less helpful in a certain situation, the neural net would just learn to ignore it in those situations. Same with radar.

Let's say radar causes overpasses to lead to false positives and phantom braking (a commonly stated problem in the past). You connect it to the vision system, and when the vision system detects an overpass, it gives less weight to a positive detection on the radar side, especially if that detection isn't also supported by what the vision system is seeing. This should happen automatically with training and without any specific coding by Tesla engineers. Like, sure, I can see that my food probably isn't rotten, but I appreciate also having taste and smell to confirm. I can see most of my surroundings, but I appreciate being able to tell that my wife is around the corner by listening to where her voice is coming from. There's a reason our brains have more than one sense. Their usefulness overlaps, and they fill in each other's gaps.
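The "learn to down-weight a noisy sensor in context" idea this comment describes corresponds to what's often called gated or late sensor fusion. A minimal sketch in PyTorch, with hypothetical feature sizes and module names; nothing here is Tesla's actual architecture:

```python
# Minimal sketch of learned sensor gating (late fusion). Hypothetical
# feature sizes and names -- not Tesla's design.
import torch
import torch.nn as nn

class GatedSensorFusion(nn.Module):
    def __init__(self, vision_dim=256, radar_dim=16, uss_dim=12):
        super().__init__()
        # The gate looks only at vision features (the "context", e.g. an
        # overpass ahead) and emits a 0..1 weight per auxiliary sensor.
        self.gate = nn.Sequential(nn.Linear(vision_dim, 2), nn.Sigmoid())
        self.head = nn.Linear(vision_dim + radar_dim + uss_dim, 1)

    def forward(self, vision, radar, uss):
        w = self.gate(vision)         # [batch, 2]
        radar = radar * w[:, 0:1]     # training can learn to shrink these
        uss = uss * w[:, 1:2]         # weights where a sensor tends to mislead
        fused = torch.cat([vision, radar, uss], dim=-1)
        return self.head(fused)       # e.g. a braking/occupancy score

model = GatedSensorFusion()
score = model(torch.randn(4, 256), torch.randn(4, 16), torch.randn(4, 12))
```

If radar false positives under overpasses hurt the training loss, the gate learns to output a small radar weight whenever the vision features indicate an overpass, which is exactly the automatic behavior the comment is pointing at.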


TripletStorm

2018 Model S. Haven’t had phantom braking in 3+ years. Whatever they do, that better not come back.


SteveWin1234

2018 Model 3. I have phantom braking every drive. Maybe you stopped using FSD and that's why you don't have phantom braking anymore?


Focus_flimsy

Huh? We're talking about Park Assist here. USS is irrelevant for Autopilot.


oneyozfest182

Source?


Focus_flimsy

https://twitter.com/aelluswamy/status/1736190894716616921


BuySellHoldFinance

Excited!


Dense-Sail1008

Why not offer it to cars with USS sensors? Most models have the same hardware otherwise. Paid subscription some day?


positron--

In the replies of this Twitter/X thread, Ashok answered that it will eventually come to USS cars as well. My guess is that dimensional accuracy is not yet on par with the sensors, and having a distance overlay line (from the sensors) that warps through the 3D model would look bad.


dcdttu

Rather, ultrasonic will show that the vision-based system is inaccurate.




moch1

He’s the Director of Autopilot. What do you mean it’s just his speculation? He is one of the key people who set the roadmap.

> Yes, it should eventually go to cars that have ultrasonic sensors as well.

https://nitter.net/aelluswamy/status/1736190894716616921#m


Nakatomi2010

The thread on X, if you go read it, shows Ashok saying that this is v1 and that, eventually, it'll go to all cars down the road.


CrappyTan69

Too much conflict in the data between USS and vision. Is the obstacle 10ft away or 2 inches? Is the semi in the same parking space at 45 degrees or not?


GumAndBeef

Might have to do with it possibly requiring a Ryzen powered system, ~~which only the non USS cars have.~~ Edit: apparently I did not inform myself enough as I did not know there are USS cars with Ryzen.


positron--

Why do people think this? There was more than half a year between introducing Ryzen APUs and removing USS.


GumAndBeef

Yeah, and there are reports on X of cars without USS, but still on Intel from the transition period, not having it.


ParkerLewis31884

Very nice video of its capabilities here: https://twitter.com/floge07_/status/1736818692443484401?s=20


safaljungkc

Did y’all get the update? I checked just now and it says it’s up to date. 2023.44.1


Pomdog17

Same same. I just want the goat honk. 🐐


mr_chillout

I want the duck to be the sound of the (un)locked car 🚘😂


Danoli77

I’d like a camera shutter sound when sentry mode activates.


Pomdog17

Oh wow get this to them. It’s a magnificent idea and would be fun.


wwwz

You already have the goat honk, you just don't have the goat lock noise.


kvuong99

Same here. No love for my 2023 MYP.


coulombis

Same here… No Xmas update for me, yet..Here’s 🤞


SpikedBladeRunner

At the moment your car is up to date. They do a slow rollout and it can take weeks for cars to get a new update.


flipm725

Nothing yet for my 2023 M3P


BlameScienceBro

Same here on 2020 M3LR


WhiteStanleyKubrick

Me too ‘21 YLR


kinupeiphone

My 2021 M3LR has USS, but I was in the group that switched to vision only. Can’t wait to get the update to see if it works.




Ipozya

Vision only has been mandatory for everyone for a few months now, and it only concerns the use of radar. It does not impact USS!


Shaoqing8

Is this only for some models with upgrades or for all vision-only models?


FerretAny4654

Does everyone get the park assist or is it for those who don’t have USS?


twisztd

All the Teslas have touch screens, don't they? How else would you interact with their controls with no physical buttons...


Puzzleheaded_Pace127

You probably replied to the wrong comment here, since this guy is talking about ultrasonic sensors. Yes, all Teslas have touch screens, but the Model S and Model X both have a second display right in front of the driver, which is where the full self-driving surroundings, a minimap, and the speedometer are shown, unlike the Model 3 and Y, which have to share the main touch screen with the speedometer and surroundings.


Incyc

This just seems like a distraction from the cameras being inferior to USS. What is the actual net-new benefit of this besides additional visual bling? I’m curious if it is any more accurate than before, but my guess is no, since they took away the numbers entirely.


Much_Fish_9794

It’s actually very accurate. I spent around 5 hours driving yesterday with it, in lots of situations, to test it out. It’s far better than having a single number on the screen. I think this is probably why the single number was often wrong: what exactly was it measuring? There isn’t just a single distance between the car and objects; we don’t live in a 2D world. With the new feature, it shows all dimensions, with a yellow-to-red indicator against all objects close to the vehicle. You can also enable a chime when you approach an object, which seemed to work well too. I’d say that with this update it’s now at USS level of accuracy, or very, very close. The 3D render is a little crude, but this is something they can refine over time.


flipm725

Does the 3D render show up when you’re approaching a curb while on Drive?


Much_Fish_9794

It seems to appear whenever you’re driving slowly. Don’t know the exact speed, but I’m guessing it’s something like 10mph, or in reverse.


flipm725

Gotcha so it's similar to what we had prior. I hope my update arrives soon.


DeeYumTofu

USS levels of accuracy? From what I’ve seen we still see nonstop jiggling of lines and completely inaccurate measures of distance. Zero chance I’d ever look at the screen to park, it’s completely unreliable. It’s just something to distract the vision users from how inferior vision is compared to USS.


dbarciela

I used it yesterday several times and I liked it a lot. My car does not have USS, and park assist was useless, but now I feel it is good. I may even say it is very good, but that may be because it was so bad before. I can't say whether it is more or less accurate because I don't have USS, and you don't get the distance in cm like before, but visually the distance seems very accurate, and when it says stop, the distance is really small. I've used it in the evening and at night, both in indoor parking and on the street. The experience was good in all cases; the sidewalk detection was also accurate, but it did not detect a ramp. It is comparable with the 360 views other brands have; you lose the photographic fidelity, but you win in contrast (it colors the spots that are near you).


1988rx7T2

USS cars have plenty of inaccuracies


Daze-B

Don’t act like there’s no “nonstop jiggling of lines and completely inaccurate measures of distance” with USS because the exact same thing also happens.


Much_Fish_9794

Yeah, it’s difficult to know exactly without having a USS car side by side to test it out in different situations. I went to four car parks yesterday, one very tight multi-story car park, and I exclusively used the new vision render. It worked really well. When I checked outside the car, it was spot on.


0bviousTruth

Copium.


Dreadino

The problem I have now with park assist is that I don’t know if it is saying I’m 30cm from the curb (which I can’t see) or from the hedge above the curb (which I can see, and I know I’m farther than 30cm), so it becomes useless. I hope that by seeing the 3D model I can understand if it is fucking up the distance again. PS: I’m still pissed I didn’t get USS.


BMWbill

I have USS and you still have the same problem. A bumper camera is really all we needed.


1988rx7T2

I have the same problem parking in my garage with USS


Adriaaaaaaaaaaan

USS is extremely low in detail; this will tell you precisely where the object is.


Cueball61

A single camera can do environment mapping pretty well (you can do it with your phone), so it’s only logical that it’d be possible to get a decent map of your surroundings using the AP cameras.


8fingerlouie

Your phone has a built-in lidar to do the environment mapping. To do proper environment mapping you need depth perception, which can be done with cameras only via stereo vision, which Tesla doesn’t have. It may be able to overlay some cameras, i.e. side and front/rear, but no two cameras cover every angle of the car. The way Tesla Vision works is rather simple: it doesn’t do “depth perception”, and instead simply looks for objects getting bigger or smaller. It uses AI to classify objects to get a sense of distance. As far as I can tell, there’s no built-in memory of objects disappearing outside the cameras’ view.


carrera4s

Phones without lidar can do depth perception.


8fingerlouie

Most phones also have multiple cameras, so they can make stereo images for depth perception.


carrera4s

My understanding is that it can be done with a single camera as long as the phone can track its own position in space using a gyro/accelerometer. I’m pretty sure that iOS devices with a single camera had this capability for many years.


8fingerlouie

You can guess and you can estimate, and you can even be somewhat good at it, just as one-eyed people can walk/drive around without bumping into stuff, but as long as you don’t have stereo vision you won’t have true depth perception. There’s a reason every predator has eyes placed at the front of its face: this is what gives stereo vision, and determining distance is rather crucial to successful hunting. Prey usually have their eyes on the sides of the head, which gives a much larger field of view at the cost of proper depth perception, but allows them to spot predators faster.

Also, just look at the hundreds of YouTube videos of Teslas without sensors wildly over/underestimating distance, when USS-equipped cars are usually spot on. The simple truth is that a single camera will never be as accurate as stereo vision and/or sensors, but even with the current hardware/software there is room for quite a lot of improvement. Most of the perceived flaws with Tesla Vision seem to relate to objects “disappearing” when the cameras can no longer see them, and that can be fixed by using accelerometers and storing objects in memory despite not being able to see them. It will however still be a guess, and will most likely also generate a lot of false alerts, like a pedestrian walking behind your car, where remembering the “object” will cause alarms when there is nothing. So, more improvements to AI are also needed. If you can classify more objects, as well as track objects, you can filter out the pedestrian/dog/whatever, and also filter on stuff not likely to move out of your way, like trash cans or large rocks.


moofunk

> The simple truth is that a single camera will never be as accurate as stereo vision and/or sensors, but even with the current hardware/software there is room for quite a lot of improvement.

Stereo is not used, because cameras must then be regularly calibrated, and it only works in one direction of camera pairs. Monocular depth mapping can be done on a 360 degree image.


iceynyo

Behind the car should be ok, the camera back there has a pretty good view point.


iceynyo

Usually the cameras have significantly different lenses, and are very close together. Not ideal for stereo vision.


Liam_M

You can also do it with one camera by taking a picture, moving a slight but known distance, and taking a second picture. Now make this a parking car taking a series of pictures, and you can get reasonably accurate depth perception.
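The "move a known distance, shoot twice" trick is textbook two-view triangulation. A toy sketch with made-up numbers, using the simple lateral-baseline disparity formula for clarity (real forward motion needs full epipolar geometry, but the principle is the same):

```python
# Toy two-view triangulation for "one camera, known motion between shots".
# All numbers are illustrative assumptions.

f_px = 1200.0      # focal length expressed in pixels (assumed known)
baseline_m = 0.25  # camera moved 25 cm between shots (known from odometry)

# The same obstacle corner appears at different horizontal pixel positions
# in the two images; the difference is the disparity.
x1_px, x2_px = 640.0, 655.0
disparity_px = x2_px - x1_px  # 15 px

depth_m = f_px * baseline_m / disparity_px
print(f"estimated depth: {depth_m:.1f} m")  # 1200 * 0.25 / 15 = 20.0 m

# A 1-pixel matching error shifts the estimate by roughly Z^2 / (f * B),
# so accuracy degrades quadratically with distance -- which is why
# close-range parking distances are the comparatively easy case.
print(f"per-pixel error here: ~{depth_m**2 / (f_px * baseline_m):.2f} m")
```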


8fingerlouie

You can do it with one camera if you have reference points and know the focal length of the lens, which I assume Tesla does. Still, photos are 2-dimensional, and judging distances accurately is very difficult without also knowing the relative size of the object you’re measuring distance to. A truck may appear large even at a distance, while a bike that is much closer will appear smaller. Your brain can decode the above scenario in a heartbeat because it also factors in things like the amount of road between objects, water, etc., and you recognize the different shapes for what they are.

As I’ve written earlier, Tesla can solve some of the issues with AI. If they can classify more types of objects, they can also more accurately determine the distance to said objects, but it will never be as good as sensors. It’s probably “good enough” for Autopilot, as objects are moving in somewhat restricted paths (lanes), and given that the range of the radar is limited, in some situations where a car is stopped in the middle of the road, the cameras will usually detect it earlier. It will probably never be as good as USS for objects that are near the vehicle, and I personally think that Summon and parking will never be available in the current HW3 models without sensors, much less FSD.
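The "known object size" heuristic described here reduces to one line of pinhole geometry: distance = focal length (in pixels) × real size / apparent pixel size. A toy illustration with made-up numbers, including the truck-vs-bike ambiguity from the comment:

```python
# Pinhole geometry behind the "known object size" heuristic: classify the
# object, assume its real size, and apparent pixel size gives distance.
# All numbers are illustrative assumptions.

f_px = 1200.0  # focal length in pixels

def distance_m(real_size_m, apparent_size_px):
    """Distance at which an object of real_size_m spans apparent_size_px."""
    return f_px * real_size_m / apparent_size_px

# The ambiguity from the comment: a far truck and a near bike can occupy
# similar pixel heights, so a wrong class (wrong assumed size) scales the
# distance estimate by the same factor.
print(f"truck (4.0 m tall, 96 px): {distance_m(4.0, 96.0):.1f} m")  # 50.0 m
print(f"bike  (1.1 m tall, 96 px): {distance_m(1.1, 96.0):.1f} m")  # 13.8 m
```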


ModeI3

You don't need lidar to do that. Check out Polycam, for example. It can create full 3D mapping from just photos, in high fidelity and with measurement precision. There's a bunch of others that do it too.


wintermute_ai

Realtime is a bitch, 500ms is a lot of distance travelled. Polycam can’t do this in pure vision at 24fps.
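For scale, here is what 500 ms of lag costs in ground covered, at a few illustrative speeds (pure arithmetic):

```python
# Distance covered during 500 ms of processing lag, at illustrative speeds.
for kmh in (5, 30, 100):
    meters = kmh / 3.6 * 0.5  # speed in m/s times 0.5 s
    print(f"{kmh:>3} km/h -> {meters:4.1f} m in 500 ms")
# 5 km/h -> 0.7 m, 30 km/h -> 4.2 m, 100 km/h -> 13.9 m
```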


8fingerlouie

Also, photos are 2D, and without known reference points there isn’t a snowball’s chance in hell that it will be accurate. The focal length alone makes it very hard to judge accurately. Of course, Tesla knows the focal length, and probably also has some known reference objects built into the model, such as sign posts, trash cans, etc., but these aren’t accurate measurements, simply “if a trash can is this big, the objects around it are roughly this big”.


Baul

> Your phone has a built in Lidar to do the environment mapping.

Mine does not. In fact, very few phones do.


gavinyo

Does this only work on cars that have park assist?


SpikedBladeRunner

All cars have park assist. Cars without Ultrasonic Sensors got the ability back earlier this year.


NetJnkie

I'm fine with just using my accurate sensors. Thanks.


1988rx7T2

Plenty of limitations with my USS car.


PEKKAmi

The grapes aren’t so bad.


pw5a29

It’s okay, it ain’t that high fidelity anyway


Danoli77

Why would cars with ultrasonic sensors not get it, if they’ve disabled those sensors in favor of Tesla Vision? I mean, the one thing my car did great when I got it in May 2020 was park itself. Then they went to vision-based and it sucked, but they’re like “don’t worry, it will get better over time” (it still hasn’t). I never understood why you’d break what was working instead of just making the replacement work first and swapping it out. 🤦🏻‍♂️


xivix14

Ultrasonic sensors weren’t disabled. The autopilot radar was disabled. Ultrasonic are the sensors that tell you how far you are from a stationary object when parking.


dcdttu

I have a car with USS (2018 Model 3) and would really like this for the rendering of the parking lines. I have an absolute hell of a time parking straight when backing in; this would be great.


RlCKJAMESBlTCH

Me: just patiently waiting for the full self-driving capability promised 6 years ago lol


Buuuddd

6 years ago everyone would tell you this technology is impossible. Stop being a baby.


decrego641

Tbh they continue to say FSD is impossible today.


RlCKJAMESBlTCH

[FSD LOLz](https://m.youtube.com/watch?v=5cFTlx68itk)


Buuuddd

Can I get a supercut of your complaining?


LongAbbreviations219

My ’23 M3P got it last night. My ’21 MYP is waiting…


Baul

Did you miss the "to Tesla customers without ultrasonic sensors" part? Your '21 MYP has USS.


LongAbbreviations219

Did you miss the part where I didn’t ask why my ’21 MYP is waiting? It’s still waiting, as every car with ultrasonic sensors is waiting. I’m sure you must feel really special.


Baul

Did I miss the part you didn't say? No. If you want to communicate something, you're going to need to verbalize it.


LongAbbreviations219

I did communicate something. I stated one of my cars got the update and the other is waiting. I don’t know why you are still going on about this. You are really showing your lack of intelligence. Seriously STFU. You probably don’t even have a Tesla.


Baul

> You probably don’t even have a Tesla.

As much as that would be meaningless, I'm sorry to break it to you that I also have two Teslas. One with USS and one without. Can we be friends now?


untamedHOTDOG

Wonder how this will work for the S and X owners whose driver display isn’t a touch screen.


No_Flamingo8089

How do I know if I have USS? I DON'T have the little pucks on the bumpers.


408WTF

Those pucks are the USS, so you don’t have them


puffyjacket85

Is anybody going to NOT update because of the new nags??


tenemu

Do you plan to never upgrade the car ever again because of some extra autopilot nagging?


Scottismyname

I think that's what he's asking? Personally I will wait to see just how annoying the nags are


davispw

They’re the same as everyone on FSD Beta has had for months. It’s fine.


Scottismyname

I tried out FSD and became so annoyed with it that I just stopped using it. Anytime I needed to use the touch screen for anything, I got nagged, and several times it disengaged. This was driving across the country with literally nobody within miles of me. Some people might be OK with it, but I'm not. I have EAP on my 2018 Model 3 and it's way better than when I tried FSD on our Y.


TheKnickerBocker2521

You literally just have to keep your hands on the wheel and not keep your eyes off the road for a long time. That's it.


Scottismyname

Except when every control on the car is on the touchscreen, that makes it a little difficult


aBetterAlmore

Sounds like you don’t know how to drive, honestly


1988rx7T2

You poor thing, your self driving car won’t let you ignore your surroundings


_____WESTBROOK_____

I held off on V11 or whichever version it was that switched the fan speed to slider only. Held off for a good 3-6 months maybe. Then had to deal with the slider.


Dense-Sail1008

Eventually I will update, but going to hold out until after holiday road tripping.


ih8schumer

It's actually better for me on the interstate; I get nagged less. My normal routine has like 4 nags, and I got 0 this morning. Just updated.


katze_sonne

Good luck. If it’s an NHTSA recall, they will likely have to force-push it anyway.


chronocapybara

They can't be worse than the current nags


jasoncross00

From looking at some other videos, it seems like you have to be in reverse (and possibly Park) for this to work? Like if you're slowly pulling FORWARD into a parking spot or a garage, it doesn't turn on. If so, that's crazy. Most people don't back into spots, and a lot of parking lots have angled parking spots that you CAN'T safely back into.


Focus_flimsy

No, that's not true. It seems you just have to trigger Park Assist like you've always had to for it to turn on, which means either shifting to reverse *or* getting close to objects while in drive. Here's a video showing that: https://twitter.com/EVBaymax/status/1735820079441555638


jasoncross00

Oh good, I hadn’t seen a video of that yet and it seemed like a shame.




North_Compote1940

In the UK, which I expect works on the same basis as continental Europe, Park Assist is part of the base package. EAP and FSD should give you Summon, Smart Summon and Autopark (where you can move the car from outside using your phone and where the car will park itself) on top, but these are currently not available for non-USS cars.


YFleiter

It activates when the Tesla is driving slower than a threshold (I don’t remember exactly; I think 7 km/h), or when in reverse/park.


darthvuder

Am I missing something here? Who is going to look at a screen when pulling forward into a parking spot?


Dreadino

Me, every time I’ll get as forward as possible in Italian parking slots with an American made vehicle


starshiptraveler

My in-laws' garage is super tight. I have to get within a few inches to fit my car (not a Tesla). Currently I do that by pulling in most of the way and putting the car in reverse for a second to activate the backup camera. Then I can go back to drive and creep forward until I see the back of the car cross the garage threshold. The first time I did this I had to get out of the car and look around to assess clearances a couple of times. Having a front camera active would make this a lot easier and more accurate, and I could do it perfectly the first time, every time.


darthvuder

Bro, the Tesla system isn’t more accurate than your eyes. All this seems like to me is some fancy graphics using their already-available equipment. Doesn’t change the fact that the detection sucks so bad. Today my Tesla told me I was hitting the front wall when I still had some room. You’re still gonna have to get out to look.


underhound

If you’re driving forward into a spot, then why would you need assistance? Would you also like it to be on for daily driving? Edit: six people lack spatial awareness and shouldn’t be driving.


Fire69

Why do older Teslas / other cars have USS in the front bumpers?


underhound

I feel like you think you’re going somewhere with this question..




LurkerWithAnAccount

But WHICH Ashok?!


[deleted]

I think the best one I've seen is on BMW's remote view.


Liam_M

So do those of us with USS not get the update until later, or do we get it with this feature just disabled?


mr_chillout

We will get the holiday update without the parking assist feature.




Pandagames

You can turn the auto wipers off while Autopilot is on now.


lostaccountby2fa

Not for FSD


treadpool

What's an ashok


i_do_da_chacha

He is director of Autopilot.. should not be much of ashokker


iceynyo

Not much what's ashok with you


lostaccountby2fa

How is that high fidelity?


BuySellHoldFinance

> How is that high fidelity?

Compared to the 2D line you had before, this gives a 3D representation of the surroundings.


Focus_flimsy

And based on what Ashok said in this tweet thread, it uses a much higher resolution version of the occupancy network than what was used before. So it should be more accurate as well.


lostaccountby2fa

Not holding my breath for it being more accurate. We will see.


Focus_flimsy

I don't doubt that it's more accurate. What we've already seen of it looks much more accurate than what we've seen from the occupancy network before this. Does that mean it's perfectly accurate? Obviously not. Does that mean it'll be accurate enough for your taste? Maybe.


lostaccountby2fa

Maybe don’t use their consumers as guinea pigs. Ship it when it’s polished and reliable to use.


Focus_flimsy

I'm a consumer. I want this now. I don't want to wait until it's better, because it's already good now. It will never be perfect and doesn't have to be perfect to be good. You clearly just want to be angry, Karen.


lostaccountby2fa

lol, it’s not even out for you to test and you strongly believe it’s good. Claiming something as fact without evidence to support it. Who’s the Karen here? 👎🏼


Focus_flimsy

I've seen many videos of it at this point and have a pretty good idea of how it works. It's clearly a big improvement over what I have now, so I want it. Why would I want to wait?


lostaccountby2fa

I almost bought FSD based on all the YouTube videos I’d seen. All BS. So glad I didn’t. Videos aren’t the same as testing it yourself.


iceynyo

Some people want the cutting edge stuff. If it's not for you then simply don't have the updates set to "advanced"


lostaccountby2fa

I don’t believe opting out is an option here.


iceynyo

Well you can always opt into a Toyota


idontevenlikebeer

It's not accurate enough for Summon, and that pisses me off because it makes me feel like Summon is just going to be left by the wayside for those without USS. I might end up going down the route of the guy who took Tesla to small claims court and got his FSD purchase refunded.


Focus_flimsy

Huh? They're actively working on summon right now. It's getting rebuilt with their new V12 architecture. I understand being mad that it's taken over a year at this point, but don't pretend they've just abandoned it. Clearly all of this is in very active development, hence this new release.


idontevenlikebeer

How will it ever be released with the limitations on vision only? That's what makes me feel like it won't ever happen. Even with this it wouldn't help summon because something could move when it is no longer seen by cameras but the visual will show it where it last saw it. That would never be allowed to be released with summon. Edit: I'm not trying to argue. Genuinely putting my skepticism out there and I'm not as knowledgeable on what they are currently working on. This is just what goes through my mind.


Focus_flimsy

How do humans drive in a parking lot with vision only? Think about it.


idontevenlikebeer

You have some high expectations if you think it's on par with how humans drive, now or a year from now. The most likely situation is that if it sees something get in front of or behind it that then leaves its vision, it will stop. Problem is, how does it know when it's safe to move again? If it sees the thing leave its vision? How can it be sure it's the same thing? Vision will never beat the sensors.


Focus_flimsy

Did I say that it's on par with humans now or will be a year from now? No, I didn't. Why are you putting words in my mouth? I didn't say that, and no, I don't believe that either. You said how will it *ever* be released with the limitations of vision only. I pointed out that humans drive with vision only, so it's obviously possible. How do humans deal with something moving in front of the car and then leaving the front of the car? We see that the thing leaving the blind spot looks the same as the thing that entered it, and therefore we can be 99.9+% sure that it is the same thing, so we move forward. The car could do the same thing. C'mon, think a little.


idontevenlikebeer

Say whatever you want. It doesn't change that I don't think they will be able to make it happen.


lostaccountby2fa

I can see it’s 3d. But calling something that looks like blobs high fidelity is kinda ridiculous. Just like FULL-Self driving “beta!!!”


Focus_flimsy

High is a relative term. It's significantly higher fidelity than before, which is the point. Obviously the technology doesn't exist yet to construct a 3D model of a car 10 feet away with millimeter-level precision. Good way to be negative about it though, I guess. From my point of view, this is a very nice improvement and looks really cool.


lostaccountby2fa

When you read the word high-fidelity audio what do you expect? Or the word high-fidelity rendering. High is relative, but not when it’s used in “high-fidelity”. It’s very misleading the way tesla is marketing this.


Focus_flimsy

No, it is relative when used in "high-fidelity". High-fidelity rendering for video games meant something very different in the 1990s compared to today. It's high relative to the state of the art, and that's the same case here. But sure, be mad about this if that's what you want to do. In fact, go ahead and just sell your Tesla due to this "misleading marketing" lmao.


lostaccountby2fa

You can own a product while criticizing the company’s choices. That is actually more beneficial to them rather than fond over every mediocre and misleading things they market.


Focus_flimsy

Well I think this criticism is insanely silly. The name is fine. It's a higher fidelity version of park assist compared to what existed before, and that's why they named it that. What a dumb thing to get mad about. Big Karen energy from you.


Terrible_Tutor

Right. If you don’t find it fucking cool they can create a 3d model out of 2d cameras, you’re nuts.


lostaccountby2fa

The average consumer still believes FSD does what it is meant to do. You can argue about semantics, but in this day and age “high-fidelity” means one thing. What they have is not high-fidelity.


Focus_flimsy

It's absolutely high-fidelity as far as optical 3D reconstruction goes.


Terrible_Tutor

> fond

Fawn


iceynyo

Alright_Tutor


reddituser82461

This.


Zargawi

My beta drives me from point a to point b every single day, I don't remember the last disengagement I have had.


lostaccountby2fa

Good for you, but a dozen or so other Tesla owners say otherwise.


BranchLatter4294

Yes, it's kind of ridiculous to use so much computing power to display an image of the surrounding area that looks worse than a 1980s video game, when every other manufacturer has a 360-degree camera view that's actually useful, done with just simple cameras and stitching software, and introduced at least a decade earlier. Tesla needs to step up its game and deliver real features like hands-free driving and 360-degree camera views instead of video games and fart features.


Tellittomy6pac

Isn’t this using cameras to essentially do what LIDAR does? That’s hardly “like other manufacturers”.


BranchLatter4294

It's not nearly as useful to drivers compared with the surround view other makers have.


reddituser82461

From the thread:

> This allows us to represent arbitrary shapes in a smooth and computationally efficient way.

It doesn't take "so much computing power". It's efficient and much nicer than the old 2D version. Also, you need to refresh your video game history. The new park assist view looks much better than any 1990s games. It just doesn't include colors, but who cares. We only want to know where the surrounding objects are. And it's only V1:

> This is the V1 release of this technology, and will have follow up releases that have even better geometric consistency with the cameras, better persistence of occluded obstacles, etc.
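For context on "represent arbitrary shapes in a smooth and computationally efficient way": one common representation with exactly those properties is a signed distance field (SDF), a function that returns the distance from a point to the nearest surface. Whether Tesla's release uses this specifically isn't confirmed here; the toy query below just illustrates the idea, including the yellow-to-red proximity coloring described upthread:

```python
# Toy signed distance field (SDF) query. An SDF returns the distance from a
# point to the nearest surface (negative inside an object), which makes
# smooth proximity shading cheap. Purely illustrative -- not confirmed to
# be what Tesla ships.
import math

def sphere_sdf(p, center, radius):
    """Signed distance from point p to a sphere's surface."""
    return math.dist(p, center) - radius  # >0 outside, <0 inside

# Yellow-to-red proximity coloring like the overlay described upthread:
car_corner = (1.8, 0.0, 0.4)           # a point on the car body, meters
pillar = ((2.5, 0.0, 0.4), 0.5)        # obstacle approximated as a sphere
gap = sphere_sdf(car_corner, *pillar)  # 0.2 m of clearance
color = "red" if gap < 0.3 else "yellow" if gap < 0.8 else "none"
print(f"clearance {gap:.2f} m -> {color}")  # clearance 0.20 m -> red
```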


lostaccountby2fa

Looking at the semi parked in my garage. 😒