Mark Rober just set up one of the most interesting self-driving tests of 2025, and he did it by imitating Looney Tunes. The former NASA engineer and current YouTube mad scientist recreated the classic gag where Wile E. Coyote paints a tunnel onto a wall to fool the Road Runner.
Only this time, the test subject wasn’t a cartoon bird… it was a self-driving Tesla Model Y.
The result? A full-speed, 40 MPH impact straight into the wall. Watch the video and tell us what you think!
The rain test was far more concerning because it’s a much more realistic scenario. Both a normal person and the lidar would have seen the kid and stopped, but the cameras and image processing just aren’t good enough to make out a person in the rain. That’s bad. The test portrays it as a person in the middle of a straight road, but I don’t see why the same thing wouldn’t happen at a crosswalk or anywhere else pedestrians are often in the path of a vehicle. If an autonomous system cannot reliably make out pedestrians in the rain, that alone should be enough to prevent these vehicles from being legal.
Who owns the White House right now?
Anyone with half a brain could tell you plain cameras are a non-starter. This is nearly a Juicero-level blunder. Tesla is neither a serious car company nor a serious tech company. If markets were rational, this would have been the end for Tesla.
“Wile E Coyote was here”
This is like the crash on a San Francisco bridge, where a Tesla went into a tunnel and wasn’t sure what to do when it went from bright daylight into darkness. In that case the Tesla suddenly merged lanes, then immediately stopped and caused a multi-car pile-up.
You’d think they’d have cameras with higher dynamic range and faster auto-exposure in their cars by now. Nope, still penny-pinching.
If only Elon hadn’t insisted on not using LIDAR or anything other than plain visible-light cameras.
Even then, the cameras aren’t really up to snuff for the things you actually need cameras for, like signs, lane markings, traffic lights, etc.
Mark Rober is about to be listed as FBI public enemy #1 :(
There’s a very simple solution to autonomous driving vehicles plowing into walls, cars, or people:
Congress will pass a law that makes NOBODY liable – as long as a human wasn’t involved in the decision making process during the incident.
This will be backed by car makers, software providers, and insurance companies, who will lobby hard for it. After all, no SINGLE person or company made the decision to swerve into oncoming traffic. Surely they can’t be held liable. 🤷🏻♂️
Once that happens, Level 4 driving will come standard and likely be the default mode on most cars. Best of luck everyone else!
Kids already have experience playing hopscotch, so we can just have them jump between the roofs of moving cars in order to cross the street! It will be so much more efficient, and they can pretend they are action heroes. The ones who survive will make great athletes too.
Once that happens, Level 4 driving will come standard
Uhhhh absolutely not. They would abandon it first.
There is no way insurance companies would go for that. What is far more likely is that policies simply won’t cover accidents due to autonomous systems. I’m honestly surprised they would cover them now.
If it’s a feature of the car when you bought it, and the insurance company insured the car, then anything the car does by design must be covered. The only way an insurance company gets out of this is by making the insured sign a statement that using the feature voids their policy, the same way they can with rideshare apps if you don’t disclose that you are driving for a rideshare. They can also refuse to insure the car unless the feature is disabled. I can see insurance companies in the future demanding that features be disabled before they’ll insure a car. They could demand that the giant screens go blank, or that the displayed content be simplified, while the car is in motion, too.
What is far more likely is that policies simply won’t cover accidents due to autonomous systems.
If the risk is that insurance companies won’t pay for accidents and put people on the hook for hundreds of thousands of dollars in medical bills, then people won’t use autonomous systems.
This cannot go both ways. Either car makers are legally responsible for their AI systems, or insurance companies are legally responsible to pay for those damages. Somebody has to foot the bill, and if it’s the general public, they will avoid the risk.
Not sure how it plays out for Tesla, but for Waymo, accidents per mile driven are WAY below human drivers. Insurance companies would LOVE to charge a surcharge for automated-driving coverage while paying out on fewer incidents.
If no one is liable, then it’s tempting to deliberately confuse them into crashing.
deliberately confuse them into crashing
Won’t the people doing that be committing attempted murder?
Self-driving cars don’t need to have anyone on board.
Ask the KIA boys how much they care about murder charges.
To be fair, the Road Runner it was following somehow successfully ran into the painting.
To be fair, I’d be surprised if half the humans driving didn’t do the same.
this. watching the video, I had some trouble telling the difference. sure, from some angles it is obvious, but from others it is not.
That said, other cars, with more types of sensors, would probably have “seen” the obstruction on the road.
Tesla is trash.
Fuck Elon Musk.
To any artists in Austin, TX: you have your work cut out for you. Godspeed.
New Trump-Hitman challenge unlocked
can someone do this to Trump’s Tesla please
In Silicon Valley there is an episode where a bunch of phones explode because of a software problem. A lot like the pager attack Trump got a trophy for. And Musk could take any of these cars, “self drive” them to wherever, and “update” their discharge parameters or something, then boom. The trucks are 10k lbs too; bet you could take a small building down with one without much fuss. They are pretty fast. Scary shit. Musk is a huge problem. Watch all the government envoys end up in his Swasticars, and then he can take people out Russian-style. Oops, an accident, again.
Wonderful news.
Can this be solved with just cameras, or would this need additional hardware? I know they removed LIDAR, but I thought that would only be effective at short range, and would not be too helpful at 65 km/h.
Can this be solved with just cameras
Theoretically yes, but in reality, not with current technology.
but I thought that would only be effective at short range
LIDAR actually has quite a long range. You can look up some of the images LIDAR creates; they’re pretty comprehensive.
Teslas never had LIDAR. They did have ultrasonic sensors and radar before they went to this vision-only crap.
If for some bizarre reason you want to stick to cameras only, you could use two cameras and calculate the distance to various points based on the differences between the two images. That’s called stereoscopy, and it is precisely what gives our brains depth perception. The issue is that this process is computationally expensive, so I’d guess it would be cheaper to go back to LIDAR.
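For anyone curious what that looks like in practice, here is a minimal sketch of the idea using OpenCV’s basic block matcher. The focal length, baseline, and file names are made-up placeholders; a real system would use calibrated, rectified cameras and a far more robust matcher.

# Minimal stereo-depth sketch (illustrative only, not any carmaker's actual pipeline):
# from a rectified left/right grayscale pair, compute a disparity map, then
# convert disparity to metric depth via depth = focal_length * baseline / disparity.
import numpy as np
import cv2

FOCAL_PX = 700.0    # focal length in pixels -- placeholder, would come from calibration
BASELINE_M = 0.12   # spacing between the two cameras in meters -- placeholder

def stereo_depth(left_gray, right_gray):
    """Return a per-pixel depth map in meters from a rectified grayscale pair."""
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    # StereoBM returns fixed-point disparities scaled by 16
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan            # mark unmatched pixels as invalid
    return FOCAL_PX * BASELINE_M / disparity      # larger disparity -> closer object

if __name__ == "__main__":
    left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # placeholder image paths
    right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
    depth = stereo_depth(left, right)
    print("median scene depth (m):", np.nanmedian(depth))

Even this toy version has to run a window search around every pixel, which is roughly where the “computationally expensive” complaint comes from.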
Theoretically, yes. A human would be smart enough not to drive right into a painted wall, using only their eyeballs combined with their intelligence and sense of self-preservation. A smart enough vision system should be able to do the same.
Using something like LIDAR to directly sense obstacles would be a lot more practical and reliable. LIDAR certainly has enough distance (airplanes use it too), though I don’t know about the systems Tesla used specifically.
LIDAR certainly has enough distance (airplanes use it too)
As I understand it, this is uncommon and mostly used for topographic mapping.
Most commercial aircraft use a radar altimeter, augmented with a GPS-based terrain database, for their ground proximity warning system (EGPWS, “Enhanced Ground Proximity Warning System”).
I could be wrong though, I’m not a pilot.
Good question. I don’t know if they’ll succeed, but they have a point: humans do it with just vision, so why can’t AI do at least as well? We’ll see. I’m happy someone is trying a different approach. Maybe LIDAR is necessary, but until someone succeeds we won’t know the best approach, so let’s be happy there’s at least one competing attempt.
I gave it a try once and it was pretty amazing, but clearly not ready. Tesla is fantastic at “normal” driving, but the trial gave me a real appreciation for how driving is all edge cases. At this point I’m no longer confident that anyone will solve the problem adequately for general use.
Plus there will be accidents. No matter how optimistic you may be, it will never be perfect. Are they ready for the liability and reputation hit? Can any company survive that, even if they are demonstrably better than human?
It works pretty well as a highway assist. I never use it on city streets because it’s so slow and hesitant, which is arguably worse.