> This is a result of Tesla’s strategy to rely upon Tesla Vision to replace all sensors – which they’ve successfully used most recently to replace the ultrasonic parking sensors (USS) on their vehicles. Tesla believes that vision is the solution to achieving all self-driving capabilities, and this includes the elimination of extraneous sensors such as USS and radar.
Why are they so hell-bent on using nothing but video?
A combination of cost savings and “first principles” thinking.
They tried to save $5 per car by not including a rain sensor - a few million dollars in savings at their scale. At the same time, they spend time and resources engineering a “smart” camera-based solution to auto wipers. And they continue to spend effort on it, because it still doesn’t work.
Obviously it's not the $5 but the research that matters. We are beta-testing their vision-based rain detection algorithm for them, for free (in fact, we are paying $5 to test it).
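To be clear, nobody outside Tesla knows what their rain detection actually looks like. But as a toy illustration of the general idea (and of why it's hard), here is a minimal sketch in Python with OpenCV; the baseline constant, the threshold, and the function name are all invented for the example. Raindrops on the glass defocus parts of the image, so one crude signal is a drop in high-frequency detail:

```python
# Toy sketch, NOT Tesla's algorithm: treat a drop in image sharpness
# (variance of the Laplacian) as a crude "water on the glass" signal.
import cv2
import numpy as np

DRY_SHARPNESS = 150.0  # hypothetical dry-windshield baseline; needs per-camera tuning

def looks_rainy(frame_bgr: np.ndarray, ratio: float = 0.6) -> bool:
    """Flag rain if the frame is much blurrier than the dry baseline."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
    return sharpness < ratio * DRY_SHARPNESS
```

Even this toy version shows the failure modes: sharpness also drops in fog, at night, and on featureless roads, so a shippable detector needs far more than one heuristic - which is presumably where all that ongoing engineering effort goes.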
The weirdest part from the outside: reliably detecting that it's raining should have an obvious positive impact on the driving side as well - for instance, to better interpret tire slip, or to anticipate worse precision from some of the sensors.
You do not want to start making handling adjustments based on rain on the windshield, when the road could already be slick from prior rainfall or other conditions.
I like this take a lot and think it's spot on. It feels very much like one of those startup decisions where the team builds its own tool that duplicates an equivalent commercial offering and considers it cheaper, without accounting for the time to develop, maintain, and improve it.
It feels to me like another one of Elon's stubborn whims. Sometimes these result in great innovations, like Tesla's giant single-piece castings ('gigacasting'). Other times, they result in deciding to rename Twitter "X" and forcing a team of engineers to spend over a year meticulously combing through the codebase to remove all references to "twitter.com".
> Tesla now focusing more on developing self-driving vehicles than on pushing for huge growth in EV sales volume, which many investors had been counting on.
Yep, don't count on Tesla to replace fossil fuel cars. They had a good head start, but hopefully other manufacturers will now take Tesla's place in filling the growing demand.
In all likelihood, auto manufacturers won't really be responsible for replacing fossil fuel cars. Designing and building the cars is straightforward enough; what's needed are continued leaps in energy storage and infrastructure to make that change possible.
> Tesla now focusing more on developing self-driving vehicles than on pushing for huge growth in EV sales volume, which many investors had been counting on.
Oh, come on. Yes, Musk announced that Tesla would unveil a self-driving car with no steering wheel in August 2024. Of course, he said the same in 2016, in 2018, in 2020...[1]
The reality is that Tesla, while signed up for the California DMV's autonomous vehicle test program, didn't report any miles driven last year.
It actually says that they were attempting to gigacast their new vehicle as a single piece rather than three, but that proved difficult, so they are going back to gigacasting three separate pieces.
But your point stands: they are still gigacasting, just for now they are not doing the very ambitious "gigacast in a single bound" plan.
This reminds me of Steve Jobs and his absolute refusal to let Flash run on the iPhone. I think we're better off now for that, but we had a few rough years because of it.
Maybe, or maybe it's like his refusal to allow native apps, forcing everything to be a web app, and Tesla will realize they are better off just having more sensors and more data.
I kinda agree with Steve Jobs on both of those calls. If your "mobile app" is just a wrapper around a web page, I'd rather just have a bookmark shortcut on my home screen. What's the value add?
I don't think either of those is very similar to the wiper thing. One is using the leverage of owning a major OS to force third parties to do something better for the end user, in a context where the stakes are pretty low (i.e., no one is going to die). Even if it works out in the long run, the wiper only benefits Tesla, and in the meantime it is a serious safety risk for the end user.
Frame damage is actually pretty straightforward to repair on cars that have a true frame (i.e., aren't unibody).
The problem with repair costs in modern cars is all the sensors, cameras, complex paint jobs, airbags, etc. Those expenses add up quickly; combine them with the added cost of body panels and trim for certain brands, and costs get really insane. It shouldn't be a surprise that car insurance premiums keep growing much faster than inflation.
Definitely depends on the type of damage. Cracks and rust aren't a big deal at all, but a twisted frame from a corner collision is a royal pain.
In the context of repair costs, though, I'd still rather deal with a car with a damaged frame than with a modern car whose sensors, body panels, etc. all need replacing.
I think most of the time, if you're in an accident serious enough to damage the gigacast to the point it needs replacing, that accident would've totalled most other cars as well.
Not doubting you, but how does it take a year to remove Twitter references?
Surely Twitter has some internal code search tool like Sourcegraph which would let them easily search for all variations of "twitter", right? /twi?t*e?r/gi
I would like you to do this for one of your larger projects at work - just change its name with a simple regex like this - and see how many unit tests fail.
"But I'll just fix those!", you might say, and you might. Except your project is likely not as big as twitter, and fixing every problem is likely to become a much bigger problem.
Plus, you now have to catch all the stuff unit tests don't cover, which also includes design elements a person has to look at.
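To make it concrete, take the tongue-in-cheek pattern from upthread and watch what it does. A toy example in Python (the sample strings are invented):

```python
# Toy example: the loose pattern from upthread rewrites far more than intended.
import re

pattern = re.compile(r"twi?t*e?r", re.IGNORECASE)

samples = [
    "import twitter_client",            # internal module name: renaming breaks imports
    "https://api.twitter.com/2/tweets", # external URL that must keep working
    "ticker = 'TWTR'",                  # stock ticker in a test fixture
    "birds twitter in the morning",     # ordinary English in docs and comments
]
for s in samples:
    print(pattern.sub("x", s))
# All four get rewritten, including the ones that must stay untouched.
```

And that's before the genuinely hard parts - external hostnames, OAuth callbacks, data already sitting in production databases - which is presumably where the year actually goes.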
You’re right about Elon, but I’ve thought about it, and it’s flawed reasoning. Humans don’t do it “with vision”; we do it with five senses and the human brain. The computer is severely limited compared to the human brain, so when it comes to things like LiDAR, the computer needs all the help it can get. And for rain sensing, the camera doesn’t see the windshield the way a person does!
Vision and hearing are the main senses we use when driving, with touch a distant third. Using taste, smell, or proprioception means something is probably wrong with your vehicle.
Personally, proprioception ranks ahead of hearing.
Vision is first, but proprioception is how I assess nearly everything else: braking speed, acceleration, throttle input, cornering, traction, hydroplaning, tire pressure (even in specific tires), alignment issues (down to the specific wheel), balance issues, tie rod issues, transmission issues, engine issues, wind, payload, towing, etc.
I could probably tell you within 100 lbs how much payload is in the vehicle, and roughly where, simply by feeling how the car drives.
I literally cannot play racing games because I can’t sense the vehicle. Flip side: I used to think professional racers were crazy for being able to call in subtle issues to their crew chief. Turns out when you drive a specific car enough, you learn how it ought to feel.
I thought about smell, but vehicles do malfunction, and smell can be an important indicator. “Something is wrong with your vehicle” is a real scenario that happens pretty often.
I feel we must be using smell all the time to be able to detect when a burning odour begins to arise. It’s just that we’re not aware of it when nothing unusual happens.
Driving without proprioception is probably almost impossible. That's how you know where your hands are, how the steering wheel is turned, where your feet are, how to move your hand to the indicator lever, etc.
That is a reasonable argument that it is likely theoretically possible. But it’s absolute madness to jump from “theoretically possible” to production without actually inventing and testing the technology. What is going on over there at Tesla? I suspect an emperor’s-new-clothes situation where employees are too terrified to state basic facts, like “we tested 20 ideas but none worked; we haven’t figured out how to get this to work yet.” Not the kind of culture I want designing a two-ton vehicle I’m trusting my family’s lives to.
> I suspect an emperor’s-new-clothes situation where employees are too terrified to state basic facts, like “we tested 20 ideas but none worked
Or maybe the emperor is an "idea #16 is good enough for me" type of person?
Giving Musk the benefit of the doubt, his thinking is probably more along the lines of:
"I'd like to solve this with vision. Sure, we can ship a $5 part to do it the tried-and-true way, but if we do that then we're probably never gonna solve it with vision, because there will no longer be a pressing need to do so. So I'm gonna force us to not ship our cars with a proper rain sensor in the hope that it will continually push my teams to get this shit done and do the improbable".
There’s no doubt to give him the benefit of… even in your version, they still decided to ship a basic feature that doesn’t work in a production automobile. Other car manufacturers push teams hard to develop new tech too, but they also test the heck out of it so it’s 100% functional before putting it into production.
Piëch was famous for outlandish demands on engineers at VW, but his demands took the form of desired test results: the Phaeton had to be capable of being driven all day at 186 mph in an exterior temperature of 122 °F whilst maintaining the interior at 72 °F. And they did it…
Fair point: that does seem like a likely explanation. I am just horrified that it would lead to putting broken systems on a mass-produced automobile.
That sounds so cool when a billionaire vision-guy says it. Too bad the reality is that "pressing need" isn't magic and sometimes just isn't enough. Like this time.
This is true in a sense, but humans incorporate so much more information. Like if I drive past a sprinkler, I know there will be a short burst of water hitting the window. But this requires a fairly complex assessment of the surrounding environment.
But it can be done better with more sensors. Why handicap the technology? It seems to me like the compromise isn’t worth it here.
My phone can take my face scan in a room that’s pitch black, because it uses infrared sensors and a projection of dots. It’s able to check who I am, regardless of the natural light in the environment. If that technology was limited to “only a camera” it would be way worse.
A camera is cheaper than LiDAR. But it’s also less capable.
As best I can tell, other cars just use a dedicated infrared sensor to detect the water. It seems like a weird corner to cut: Tesla apparently dropped those in favor of having the other cameras attempt to fill in, but that doesn't seem to be working.
For seeing rain, it's mostly that the cameras are offset farther back from the windshield. As for variable-focus vision: people over 50 barely have it, and it isn't really important for driving.
> Tesla likes to avoid solutions that only solve a single problem – such as adding a rain sensor. It is an additional manufacturing complication, adds additional cost to vehicles, and segments Tesla’s vehicles between model years.
I don't know anyone who thinks the vision-based parking "radar" actually works well. Indeed, I've heard of several incidents of people bumping into things.
It’s… ok. Much better since the update. Normal parking sensors aren’t infallible either. My colleague scraped his BMW i4 last week and it has proper sensors.
I think they falsely believe humans only need one sense to navigate: sight. But we actually use our hearing, smell, and touch too. So multiple types of sensors are actually better than only one.
Even if all the other senses were gone, the human eyes are hooked up to the human brain. Elon's references to cameras being eyes always gloss over the image-processing part.
Also more basic things like whether your feet are touching the pedals, that you're gripping the steering wheel at all, and how much force those things are taking.
Touch is extremely important when driving a vehicle. Otherwise you won't know which controls you're using or how much force you're exerting on the pedals.
Unless that smell is oil. Or gasoline. Or burning rubber. Or petrichor. Or just plain old smoke.
Or the touch is the feeling of road surface changing, or the steering wheel getting harder to turn / having no effect at all, or the rush of air if a door flies open...
That's a limitation of ours. We need touch and indicators to gauge how effective we are, but a machine can precisely hold 23% throttle or keep the steering wheel at 19 degrees without needing to gauge how hard it is pushing the actuators. Touch is nice, but hardly crucial for driving.
Fully agree - also it’s a bit weird on Tesla’s part to think that there could be no possible way to make things better than a human in some regards. If humans had radar, it would probably make us more capable, not less.
Our eyes are also attached to an unfathomably efficient supercomputer. The most advanced one in the universe, in fact, by a gigantic margin compared to anything humans have made.
Because they think they can build a cheaper car that way.
Same reason they haven’t even offered an instrument display in front of the driver as an option, and replaced the indicator stalks with buttons (which, from reports, work acceptably well in California, not so much in, say, the UK).
Underlying it all is Tesla’s belief that the driver is a legacy feature that is going away Real Soon Now. Frankly, it seems like they’re 90% of the way there (with all that implies about how long it will take to get the rest of the way).
There are multiple videos of Teslas driving themselves into obstacles, including trains, because vision alone cannot reliably identify obstacles.