Mark Rober just set up one of the most interesting self-driving tests of 2025, and he did it by imitating Looney Tunes. The former NASA engineer and current YouTube mad scientist recreated the classic gag where Wile E. Coyote paints a tunnel onto a wall to fool the Road Runner. Only this time, the test […]
13.6 million views in three days. That’s impressive.
I really enjoyed this video, but one thing bugs me a bit. Why does he specifically call attention to Tesla as a brand and not Luminar? If he’s trying to be scientific about the technology, it should be “lidar vs imaging”. Or if he does want to call out brands, “Luminar vs Tesla”. Not that I’m defending Tesla, but this just seems weird to me when his videos are typically more educational.
If Kleenex were the only ones doing facial tissue, then this could be, “toilet paper vs. Kleenex”, and you’d be wondering “why isn’t this Charmin vs Kleenex?” while Charmin happened to be the TP brand they chose because they had access to it.
Tesla is the only one doing camera-only self driving, so there’s no point in delineating the two. Lidar you can expect from any other brand, so it’s a token choice in this instance, especially for an engineering entertainment video.
In an interview after the video’s release, he mentions he only used the Tesla because it’s the only one he’s aware of that relies solely on cameras. Luminar wasn’t named as a car brand or anything, really, as this was not intended as a promotion for them.
He even said during the interview that he loves his Tesla and will more than likely buy another. Which, tbh, was a bit disappointing. But I do think it highlights that most of the criticism here is largely unfounded fanboy antics.
I’m sure he knows Tesla gets the most engagement. And I also believe Tesla is the only company that doesn’t use lidar and Musk has been very vocal about lidar being unnecessary for self driving.
Are there any other car brands that are relying on only cameras for their assisted driving features?
If you mean full self-driving, then I don’t know. Subaru has lane-assist and adaptive cruise control that uses only cameras, though. They use two cameras for binocular vision, so it might not fall for this; I’m not sure. Would be a good test.
I probably should have said “only cameras”. Subaru, like the rest, backs the cameras up with radar to avoid situations like in the video.
Edit: welp, apparently not in the front
I think I accidentally deleted my post so sorry if this is a duplicate.
I can poke around in my wife’s Outback to verify again, but as far as I’m aware, Subaru doesn’t have any forward radars. Having a set of properly calibrated stereo cameras works amazingly well, though. Whatever Tesla is attempting, while still kinda impressive, isn’t nearly as polished, judging by the number of phantom braking events and stuff like this I see complained about online.
Blind spot I believe is radar, and backward is a combination of sonar and radar if I’m not mistaken.
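For anyone wondering why a stereo pair changes the picture: with two calibrated cameras, depth falls straight out of the pixel offset (disparity) between the views, so a flat painted wall reads as one plane at a fixed distance instead of a road receding toward the horizon. Rough sketch below with made-up focal length and baseline, not Subaru’s actual numbers:

```python
# Rough stereo-depth sketch (illustrative numbers, not any manufacturer's parameters).
# Pinhole model, rectified pair: depth Z = f * B / d

focal_px = 1400.0   # focal length in pixels (assumed)
baseline_m = 0.35   # distance between the two cameras in metres (assumed)

def depth_from_disparity(disparity_px: float) -> float:
    """Distance to a point given its left/right pixel offset."""
    return focal_px * baseline_m / disparity_px

# A real road recedes: disparity shrinks smoothly toward the horizon.
# A painted wall is flat: every pixel sits at roughly the same disparity,
# so the whole "road" suddenly reads as one vertical plane ~20 m away.
for disparity in (24.5, 12.2, 6.1):          # near, mid, far road points
    print(f"road-like: {depth_from_disparity(disparity):5.1f} m")
print(f"wall-like:  {depth_from_disparity(24.5):5.1f} m everywhere")
```

A single camera has to infer all of that from appearance alone, which is exactly what a painted wall exploits.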
Well, then it would be interesting to see if Subaru fails these tests too. Especially the visual impairment ones.
Replying to you since you seem to be quick to read these, and I wanted to mention the vision limitation test after I understood it better.
In my experience with the Outback, it should either work just fine, or, if visibility is too bad for it to work reliably, it won’t let you engage it (or it warns you and turns itself off if conditions deteriorate while engaged).
If the speed difference between the car and the object is over 32 mph (at least for the 2018 model year, if I’m remembering the number in the manual correctly), I believe it will fail because it doesn’t have enough time to identify the object. It will do its damnedest to stop, and should be able to scrub off a solid amount of speed, but there will still be some sort of impact, just due to pretty clearly spelled-out system limitations.
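Back-of-the-envelope on that: the detection range, delay, and braking numbers below are assumptions, not anything from the EyeSight manual, but they show why a modest closing speed stops in time while a big one still ends in a hit.

```python
# Back-of-the-envelope only; detection range, delay, and deceleration are assumptions.
from math import sqrt

detect_range_m = 35.0       # distance at which the object is identified (assumed)
reaction_s = 1.0            # identification + brake actuation delay (assumed)
brake_decel = 0.8 * 9.81    # hard braking, roughly 0.8 g (assumed)

def impact_speed_mph(closing_mph: float) -> float:
    v0 = closing_mph * 0.44704                       # closing speed in m/s
    braking_dist = detect_range_m - v0 * reaction_s  # distance left when brakes bite
    v_sq = v0**2 - 2 * brake_decel * max(braking_dist, 0.0)
    return sqrt(v_sq) / 0.44704 if v_sq > 0 else 0.0

for mph in (25, 45):
    print(f"{mph} mph closing -> hits at about {impact_speed_mph(mph):.0f} mph")
# 25 mph: stops in time.  45 mph: scrubs a lot of speed but still hits.
```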
My experience with two different brands is that ACC is unable to detect static objects. Its only strength is detecting moving vehicles in front of you. So even if you come to a stoplight with cars already waiting there, it will just continue going. Other systems must then engage emergency braking if they detect something, but at regular speeds that would be too late.
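That tracks with how radar ACC is usually described: the radar reports each target’s relative speed via Doppler, and anything whose ground speed works out to roughly zero tends to be discarded as clutter (signs, bridges, parked cars). Purely illustrative logic below, not any manufacturer’s actual code:

```python
# Illustrative only: why radar ACC tends to ignore stopped traffic.
# Real systems are far more involved; this just shows the filtering idea.
from dataclasses import dataclass

@dataclass
class RadarTarget:
    range_m: float
    relative_speed_mps: float   # measured via Doppler, relative to our car

def acc_follow_target(targets, ego_speed_mps, min_ground_speed_mps=2.0):
    """Pick the nearest target that is actually moving over the ground."""
    moving = [
        t for t in targets
        # ground speed = our speed + relative speed; near-zero looks like
        # a sign, bridge, or parked car, so classic ACC drops it
        if abs(ego_speed_mps + t.relative_speed_mps) > min_ground_speed_mps
    ]
    return min(moving, key=lambda t: t.range_m, default=None)

ego = 25.0  # ~56 mph
stopped_queue = RadarTarget(range_m=60.0, relative_speed_mps=-25.0)  # cars at a red light
slow_truck = RadarTarget(range_m=90.0, relative_speed_mps=-10.0)     # still moving

print(acc_follow_target([stopped_queue, slow_truck], ego))
# -> the truck; the stopped queue is filtered out, so only AEB can save you
```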
Because there’s little reason to think different lidar systems would perform much differently on these tests and Tesla is the big name that uses exclusively imaging for self driving.
Wouldn’t many humans drive into such a wall too?
I read the article but haven’t seen the video.
It’s not about the wall. That’s just for shits and giggles (and clickbait). The real problem is that the Tesla also ran over a mannequin in rain and fog, while the Lexus with lidar did not.
They show a still photo in the video, where the engineer comments on that. He highlights the wrinkles in the wall image, imperfections and shadows that a human can see. The way he told it, it was hard to miss for a human.
It’s very clear it’s a wall. Kinda like old looney tunes.
Eddie Valiant’s road-lines-into-a-wall trick was prescient, too.
He didn’t even use Tesla’s Full Self-Driving. He used the ancient Autopilot software, and even with that he apparently manually disengaged it before impact. Seems pretty disingenuous to me.
and even with that he apparently manually disengaged it before impact
Source?
For example this and many more articles
The first kid test they did with both Autopilot and Full Self-Driving (or whatever you call it). Was that different for the later tests?
Both use the same sensors. It’s not like Tesla has some hidden better cameras when you use FSD.
My point is that FSD is a much, much more advanced piece of software. It’s wrong to label the video “self driving” when you are not using FSD. Autopilot is just adaptive cruise control that keeps the car in lane.
Autopilot is just adaptive cruise control that keeps the car in lane.
Anyone who watches the video in question knows this statement is misleading. Autopilot also stops when it detects an obstacle in the way (well, it’s supposed to, but the video demonstrates otherwise). Furthermore, decades-old adaptive cruise from other brands will stop too, because even they have classic radar or laser range-finding.
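The range-finding part is just time-of-flight arithmetic, which is why radar and lidar get distance essentially for free; a quick, brand-agnostic illustration:

```python
# Time-of-flight ranging, the principle behind both radar and lidar:
# distance = c * round_trip_time / 2
C = 299_792_458.0  # speed of light, m/s

def tof_distance_m(round_trip_s: float) -> float:
    return C * round_trip_s / 2

# A return after ~200 nanoseconds means an object ~30 m ahead.
print(f"{tof_distance_m(200e-9):.1f} m")
```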
If even the most basic go/no-go-plus-steer operation based on computer vision can’t detect and stop before obstacles, why trust an even more complicated solution? If they don’t back-port some apparent detection upgrade from FSD to the basic case, that demonstrates even further neglect anyway.
The whole point that everyone is dancing around is that Tesla gambled that cheaping out by using only cameras would be fine, but it cannot even match decades-old technology for the basic case.
Did they test it against decades old adaptive cruise? No, that’s been solved, but they did test it against that technology’s next generation, and it ran circles around vision not backed by a human brain.
Autopilot hasn’t received any updates for years. Tesla is only focusing on FSD. This makes your point invalid.
Autopilot hasn’t received any updates for years.
Like I said, demonstrates neglect.
Ok so every car manufacturer is also demonstrating negligence because they can’t even get their updates working in the first place.
If you close your eyes, it doesn’t matter whether you’re wearing glasses or not.
If the car’s sensors could not pick up the wall, it doesn’t matter what software version it is running.
Tesla is only using vision. Software makes ALL the difference. If you don’t have a brain it doesn’t matter if you have eyes or not.
It’s wrong to label a Tesla or any of its software as ‘full self driving’.
Quite clearly Mark demonstrated that the safety systems are engaged in whatever mode he had it in; otherwise the vehicle would never stop for the obstacle in front of it.
@lemmyingly @yesmeisyes Tesla’s safety systems only do emergency stops for certain stationary objects (cars, bicyclists, pedestrians). The real test would be to see if FSD would actively plan to drive through that wall. You can see that even AP wasn’t enabled in most of the test, so it’s not a test of FSD.