Why Did Tesla Remove Radar and Ultrasonic Sensors? | Andrej Karpathy and Lex Fridman
By Lex Clips
Topics Covered
- Extra Sensors Bloat Systems
- Lidar Adds Unnecessary Entropy
- Vision Necessary and Sufficient
- HD Maps Unsustainable Crutch
Full Transcript
Tesla last year removed radar from the sensor suite, and now it has just announced that it's going to remove all ultrasonic sensors, relying solely on vision, so camera only. Does that make the perception problem harder or easier?

I would almost reframe the question in some way. The thing is, basically, you would think that additional sensors...

By the way, can I just interrupt? I wonder if a language model will ever do that. If you prompt it, "Let me reframe your question." That would be epic. "This is the wrong problem."

Sorry, yeah. It's a little bit of the wrong question, because basically you would think that these sensors are an asset to you. But if you fully consider the entire product in its entirety, these sensors are actually potentially a liability, because these sensors aren't free. They don't just appear on your car: you need to have an entire supply chain, you have people procuring them, there can be problems with them, they may need replacement, they are part of the manufacturing process and can hold back the line in production. You need to source them, you need to maintain them, you have to have teams that write the firmware, all of it, and then you also have to incorporate them, fuse them into the system in some way. So it actually bloats a lot of the organization. I think Elon is really good at simplifying: the best part is no part. He always tries to throw away things that are not essential, because he understands the entropy in organizations and in an approach. In this case the cost is high, and you're potentially not seeing it if you're just a computer vision engineer trying to improve your network and asking whether the sensor is more useful or less useful. The thing is, once you consider the full cost of a sensor, it actually is potentially a liability, and you need to be really sure that it's giving you extremely useful information. In this case, we looked at using it versus not using it, and the delta was not massive, so it's not that useful.
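One concrete place this hidden cost shows up is in the data itself: once more than one revision of a sensor has shipped, every dataset query, labeling pipeline, and training split has to branch on sensor type. Here is a toy sketch of that situation, with a hypothetical schema and made-up numbers (not Tesla's actual data engine), using Python's built-in SQLite bindings:

```python
import sqlite3

# Toy sketch (hypothetical schema and values): once a second sensor
# revision ships, queries and statistics must branch on sensor_type.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE clips (
        id          INTEGER PRIMARY KEY,
        sensor_type TEXT,   -- e.g. 'radar_v1', 'radar_v2', 'vision_only'
        range_m     REAL    -- reported range to the lead vehicle
    )
""")
conn.executemany(
    "INSERT INTO clips (sensor_type, range_m) VALUES (?, ?)",
    [("radar_v1", 41.0), ("radar_v1", 40.0),
     ("radar_v2", 44.0), ("radar_v2", 45.0),
     ("vision_only", 40.5)],
)

# Each hardware revision is effectively its own data distribution,
# so aggregate statistics end up conditioned on sensor_type.
per_sensor = list(conn.execute(
    "SELECT sensor_type, AVG(range_m), COUNT(*) "
    "FROM clips GROUP BY sensor_type ORDER BY sensor_type"
))
for sensor, mean_range, n in per_sensor:
    print(f"{sensor}: mean range {mean_range} m over {n} clips")
```

Every additional `sensor_type` value is one more distribution the models and the statistics have to be conditioned on, which is the noise and entropy being described here.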
Does having more sensors also bloat the data engine? Is it a distraction?

Yes. These sensors can change over time: for example, you can have one type of radar and then another type of radar, and they change over time, and suddenly I need to worry about that. Suddenly you have a column in your SQLite database telling you what sensor type it was, and they all have different distributions, and they contribute noise and entropy into everything, and they bloat stuff. Organizationally, it has also been really fascinating to me how distracting this can be. If all you want to get to work is vision, all the resources are on it, and you're building out a data engine and actually making forward progress, because vision is the sensor with the most bandwidth and the most constraints on the world, and you're investing fully into it, and you can make it extremely good. You only have a finite amount of spend, of focus, across the different facets of the system.

This kind of reminds me of Rich Sutton's "bitter lesson": simplifying the system just seems to always be the right solution in the long run, though of course you don't know what the long run is.

Yes. In that case it was for RL, but it seems to apply generally across all systems that do computation.

So what do you think about the lidar-as-a-crutch debate,
the battle between point clouds and pixels?

I think this debate is always slightly confusing to me, because it seems like the actual debate should be about whether you have the fleet or not. That's the really important thing for whether you can achieve a really well-functioning AI system at this scale.

So, data collection systems.

Yeah. Whether you have a fleet or not is significantly more important than whether you have lidar or not; lidar is just another sensor. Similar to the radar discussion, it basically doesn't offer extra information, it's extremely costly, and it has all kinds of problems: you have to worry about it, you have to calibrate it, etc. It creates bloat and entropy, so you have to be really sure that you need this sensor. In this case, I basically don't think you need it, and honestly I will make a stronger statement: I think some of the other companies who are using it are probably going to drop it. You have to consider the sensor in full: can you build a big fleet that collects a lot of data, and can you integrate that data and that sensor into a data engine that's able to quickly find the different parts of the data that continuously improve whatever model you're using?

Another way to look at it is that vision is necessary, in the sense that the world is designed for human visual consumption, so you need vision. And vision is also sufficient, because it has all the information that you need for driving, and humans obviously use vision to drive. So it's both necessary and sufficient, and you want to focus your resources on it. You have to be really sure if you're going to bring in other sensors; you could add sensors to infinity, and at some point you need to draw the line. I think in this case you have to really consider the full cost of any one sensor that you're adopting, and whether you really need it, and I think the answer in this case is no.

So what do you think about the idea that the other companies are building high-resolution maps and heavily constraining the geographic regions in which they operate? Is that approach, in your view, not going to scale over time to the entirety of the United States?

I think, as you mentioned, they pre-map all the environments, they need to refresh the map, and they have a perfect centimeter-level-accuracy map of everywhere they're going to drive. It's crazy. We've been talking about autonomy actually changing the world; we're talking about the deployment, on a global scale, of autonomous systems for transportation. If you need to maintain a centimeter-accurate map of Earth, or of many cities, and keep it updated, that's a huge dependency that you're taking on.

A huge dependency.

It's a massive, massive dependency. And now you need to ask yourself: do you really need it? Humans don't need it. It's very useful to have a low-level map of, say, the connectivity of your roads, so that you know there's a fork coming up. When you drive in an environment, you sort of have that high-level understanding; it's like a small Google map. Tesla uses Google-Maps-like, similar-resolution information in the system, but it will not pre-map environments to centimeter-level accuracy. That's a crutch, it's a distraction, it costs entropy, it diffuses the team, it dilutes the team, and you're not focusing on what's actually necessary, which is the computer vision problem.
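The "data engine" invoked throughout this conversation can be pictured as a loop: ingest clips from the fleet, mine the cases the current model handles worst, label them, retrain, repeat. Below is a minimal toy sketch of that loop; all class names, function names, and numbers are hypothetical stand-ins (the real system involves fleet-side triggers, human and auto-labeling, and large-scale retraining):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Clip:
    clip_id: int
    model_confidence: float  # current model's confidence on this clip
    labeled: bool = False

@dataclass
class DataEngine:
    dataset: List[Clip] = field(default_factory=list)

    def ingest(self, clips: List[Clip]) -> None:
        # Fleet uploads new driving clips.
        self.dataset.extend(clips)

    def mine_hard_cases(self, threshold: float = 0.5) -> List[Clip]:
        # "Quickly find different parts of the data": here, simply the
        # clips where the current model is least confident.
        return [c for c in self.dataset if c.model_confidence < threshold]

    def label_and_retrain(self, hard_cases: List[Clip]) -> None:
        for c in hard_cases:
            c.labeled = True  # stand-in for human/auto labeling
            # Stand-in for retraining: assume confidence improves
            # on the newly labeled cases.
            c.model_confidence = min(1.0, c.model_confidence + 0.5)

# One turn of the loop: ingest, mine, label, "retrain".
engine = DataEngine()
engine.ingest([Clip(1, 0.9), Clip(2, 0.25), Clip(3, 0.375)])
hard = engine.mine_hard_cases()
engine.label_and_retrain(hard)
```

The point of the fleet argument is the `ingest` step: without a large fleet continuously feeding this loop, the choice of sensor at the front barely matters.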