Tesla drives into Wile E. Coyote fake road wall in camera vs. Lidar test (electrek.co)
desixavryn 22 hours ago [-]
I am a massive fan of Mark Roeper. Unfortunately he completely f**d up this one. He tested using Autopilot, not the latest FSD on HW4, which is worlds apart in capabilities. It is possible that the latest FSD would also crash, but that would be a valid test of FSD capabilities. Testing using Autopilot and calling it "FSD crashes" is a HUGE misrepresentation of facts. I am hoping Mark will post an update to the video.
vasco 22 hours ago [-]
If anything is a misrepresentation, it's both the name "autopilot" and the name "full self driving", for two things that are neither autopiloting nor fully self driving.
razemio 21 hours ago [-]
I think I already had this discussion on HN. I am not sure why most people think autopilot means hands off the wheel without looking. There is literally no autopilot that does not require constant attention from a human. This is true for planes, ships and cars. Hell, this is true even for most drones.

FSD is very much correctly named. It does not say you can go to sleep. It just means the vehicle is capable of full self driving, which is true for most conditions and something most cars are not capable of. How would you have named it?

PostOnce 21 hours ago [-]
> I am not sure why most people think autopilot is hands of the wheel without looking.

Well, because they named it autopilot.

Autopilot means it pilots... automatically. Automatic pilot. Not Manual Pilot, not with intervention, automatically.

They could have named it "driver assist", "partial-pilot", "advanced cruise control", "automatic lanekeeping" or anything else, but they named it Autopilot. That's Tesla's fault.

razemio 20 hours ago [-]
Why is it named like this on ships or planes then? You most certainly need to intervene when other vehicles are approaching. That just happens less often in those environments; the name autopilot certainly fits all the same.

If you go back in history, Tesla was one of the first manufacturers to provide something more than just a lane assist. It is fair in my opinion to search for a different name that distinguishes you from the competition.

cwillu 19 hours ago [-]
For the same reason ships have starboard and port, and cars have driver's side and passenger's side. The terminology is different, the surrounding culture is different, and ignoring that will result in tragedy.
iknowstuff 18 hours ago [-]
Tesla established the name autopilot in vehicles. They decided it's exactly like planes and ships. There is no other concept of autopilot in cars.
cwillu 14 hours ago [-]
That's not how words work. They chose that word because it's evocative of exactly what most people think while being just vague enough to give them regulatory deniability.
gruez 21 hours ago [-]
>Well, because they named it autopilot.

Do commercial pilots lean back and whip out their phones when they turn on "autopilot"?

Yoric 21 hours ago [-]
A few years ago, I sat in an Italian airplane with the cockpit door open. The pilot and copilot spent most of the trip speaking to each other, using their hands to emphasize their words (if you've ever seen two Italians arguing passionately, you'll see what I mean).

As far as I could tell, they only had their hands on the instruments during take off and landing.

It was a bit weird :)

MegaButts 21 hours ago [-]
Commercial pilots have approximately 100x as much training as your typical driver.
MadnessASAP 21 hours ago [-]
Yes, there's some caveats to that but the simple answer is yes.
watwut 21 hours ago [-]
Commercial pilots have that point hammered into their heads. Tesla did everything in their power to convince buyers that "driver is there only for legal purposes" and they can chill instead of driving.
Hamuko 21 hours ago [-]
No but they have years of training.
iknowstuff 18 hours ago [-]
Teslas monitor drivers’ attention via the in-cabin camera.

ffs this is such a silly argument

sorenjan 21 hours ago [-]
It's not unreasonable for people to think that "autopilot" means something that automatically pilots a vehicle. According to a dictionary, automatic means "having the capability of starting, operating, moving, etc., independently". Whether or not that's how actual autopilots in airplanes and ships work is irrelevant; most people aren't pilots or captains. Tesla knew what they were doing when they chose "autopilot" instead of "lane assist" or similar, like their competitors did. It sounds more advanced that way, in line with how their CEO has been promising full autonomous driving "next year" for a decade now.

It's also worth noting that the recent ship collision in the North Sea is thought to have happened because the autopilot was left on without proper human oversight, so even trained professionals in high-stakes environments make that mistake.

razemio 20 hours ago [-]
Automatic means something different. Here is the definition of what an autopilot is:

The autopilot controls the watercraft or spacecraft without the need for constant manual control by a human. Autopilots do not replace the human operator, rather the autopilot assists the operator in controlling the vehicle or aircraft, allowing the operator to focus on broader aspects of the operation. (Wikipedia)

I do not like Tesla but to this day I don't get the outrage about their autopilot capabilities. It 100% fits the definition of what an autopilot does.

sorenjan 20 hours ago [-]
Like I mentioned in the first post, I don't think the definition of what autopilot means in terms of ships or airplanes is relevant. Tesla chose the name because it makes people think the auto part refers to automatic, and then assume the car can drive automatically.

Combine this with how they released a video[0] with the description "The person in the driver's seat is only there for legal reasons. He is not doing anything. The car is driving itself." at the same time. This video was about their FSD and not autopilot, but I think it's reasonable to assume most people won't know the difference. It's deceptive on purpose.

[0] https://vimeo.com/188105076

cwillu 19 hours ago [-]
The _only_ relevant question is what the average person understands “autopilot” to mean in the context of a car. This isn't a question that can be declared answered by pulling out a dictionary.
digitalPhonix 16 hours ago [-]
> There literally is no autopilot, which does not require constant attention from a human. This is true for planes, ships and cars.

Can’t comment on ships, but autopilots in planes definitely match SAE level 3:

- Hands off

- Do not require constant attention

- Will alarm and indicate to the user when attention is required

i.e. "when the feature requests, you must drive [fly]" from the SAE nomenclature: https://www.sae.org/blog/sae-j3016-update

And this is autopilot that’s been in commercial aviation for decades (so long that it’s started filtering down to general aviation).

razemio 13 hours ago [-]
This isn't true. The autopilot in a plane does nothing but keep its course. In this regard it does less than cruise control. On some planes it will not even disengage if you forget it is turned on. TCAS will warn you ahead of time that something needs your attention. This isn't failsafe. You cannot (in real life, should not) do something else, like reading a book, while the autopilot is in action.

There is a study which links autopilot to 19 crashes and 175 minor incidents (which have been reported) since 1983. If you take those numbers and put them into perspective against how often planes crash in general, it is not minor. Same goes for ships.

https://www.faa.gov/sites/faa.gov/files/data_research/resear...

digitalPhonix 1 hours ago [-]
The checkride required to get an IFR rating quite literally requires you to “read a book” (brief the approach plate to the flight inspector) while the autopilot conducts the initial parts of a precision approach.

A 20 year old avionics suite (GTN 450) does much more than cruise control - you input a flight plan including an approach, it will fly the flight plan, capture the approach signals (VOR/localiser/whatever - which is far more complex than “keeping course”) all the way down to approach minimums.

The report you linked is specifically about general aviation - not about commercial (part 121) aviation. GA lags behind by decades (it still uses carburetted engines burning leaded fuel).

In any case, it's talking about error cases for autopilots, not the operational domain for when they should or shouldn't be used. The key takeaway was improved training for identifying autopilot malfunctions, not reducing or eliminating the usage of autopilots in certain scenarios.

> You can not (irl should not) do something else

I’m curious, what autopilots have you used and what capabilities did they have?

oxfordmale 21 hours ago [-]
Mercedes has gained approval to test Level 4 autonomous driving. Level 4 is considered fully autonomous driving, although the vehicle retains a traditional cockpit, and the driver can request control at any time. If a Level 4 system fails or cannot proceed, it is required to pull the vehicle over and bring it to a complete stop under its own control.

I would argue that it is getting very close to what people think autopilot can do. A car that, under certain circumstances, can drive for you and doesn't kill you if you don't pay constant attention.

iknowstuff 21 hours ago [-]
Ah yes the Mercedes L4

https://m.youtube.com/watch?v=h3WiY_4kgkE

The one which needs a leading vehicle to follow below 40mph on some stretches of freeways? Try to look up videos of it from owners that are not press or mercedes reps.

digitalPhonix 15 hours ago [-]
That’s clearly marketed as Level 3. As in, level 3 is publicly available and usable on public roads right now.

From: https://www.mbusa.com/en/owners/manuals/drive-pilot

> Raising the bar in autonomous driving technology, Mercedes-Benz is the first automobile manufacturer in the US to achieve a Level 3 certification based on a 0-5 scale from the Society of Automotive Engineers (SAE). Under specific conditions, our technology allows drivers to take their hands off the steering wheel, eyes off the road — and take in their surroundings

(Specific conditions being the lead car and speed limits you noted, but that’s not what the person you’re replying to is talking about)

Additionally to their level 3 system, they’ve been granted permission to test Level 4 systems not for public use/availability, on a prototype vehicle:

https://www.wardsauto.com/autonomous-adas/mercedes-benz-gain...

razemio 20 hours ago [-]
Not sure about other countries but in Germany L4 is only active on freeways up to certain speeds. The video you linked is not showing this. At least the parts where I skipped through.
iknowstuff 18 hours ago [-]
With a lead car, on select sections without any significant bends, in perfect weather, and with a speed limit*
watwut 21 hours ago [-]
They even take legal responsibility for crashes in those circumstances. Because they trust it to work in the specified circumstances.
crooked-v 21 hours ago [-]
> It does not say you can go for a sleep.

That means, by definition, it's not "full" self driving.

razemio 21 hours ago [-]
Why? It is capable of full self driving. That has been proven numerous times. It is, however, not capable of doing this all the time. I ask again: give a better name that describes the function FSD currently offers. It is most certainly not a lane assist.
stavros 18 hours ago [-]
My 1990 Fiat is also capable of full self driving for short periods of time, if I put a rock on the gas pedal.

If "full" doesn't mean "completely autonomous", what would you call a system that can drive a car in exactly all the circumstances a human can?

razemio 13 hours ago [-]
The name does not matter. In the terms it has to say L5.

Joking aside... 95% of drives have one or no manual disengagement. The disengagement does not have to be critical. I would always disengage to park because I am faster. Is this not enough to call something FSD?

https://teslafsdtracker.com/

cwillu 19 hours ago [-]
If I'm on a road trip with somebody, and they're driving, I expect to be able to sleep in the passenger seat. “Full” implies that level of competence, which includes knowing its own limitations and therefore when to pull over.

I can think of half a dozen terms off the top of my head that would better describe what it's doing, but they aren't quite as punchy, and critically, they won't give the impression that the car is capable of functioning without oversight. Being able to obliquely claim-without-claiming is very much the point of the current terminology.

   * trip control
   * trip assist
   * co-pilot
   * auto steer and throttle
   * partial self-driving (i.e., what it's actually doing!)
   * hands-free steering
razemio 13 hours ago [-]
Why does "full" mean to you that you never interact with the car? That would mean L5 driving. They never claimed that. Even the Mercedes system, which is L4, only works under certain conditions.

All the names you have given suggest that it is not capable of managing a complete trip, but it is.

If you do not need to do anything for 95% (often 100%) of the trip, which is true for most FSD drives, would this not qualify as FSD for you? If not, then we can end this discussion and just have different opinions.

cwillu 9 hours ago [-]
Correct. And indeed, L5 is called “Level 5: Full driving automation”. The only reason the word “full” is even in there is because of previous abuses of the notion of “self-driving” cars that couldn't. “Full-no-we-really-mean-it-this-time self-driving”
ModernMech 6 hours ago [-]
"Full self driving beta (supervised)" always felt to me like report_draft2_(final)--revised.docx
kibibu 16 hours ago [-]
> I ask again, give a better name that describes the function FSD currently offers

"Mostly Self-Driving"

razemio 13 hours ago [-]
Haha :). I would have liked this name.
kibibu 13 hours ago [-]
You could go with "Full Sorta Driving" or "Full Somewhat Driving" and you don't even have to change the initialism.
ModernMech 6 hours ago [-]
"It mostly self drives.... mostly" - Elon Musk as Newt from Aliens
akmarinov 21 hours ago [-]
> It just means, the vehicle is capable of full self driving, which is true for most conditions and something most cars are not capable of.

But that’s not true.

It’s not capable of fully driving itself, hence why the supervised part was added

razemio 20 hours ago [-]
It most certainly is. Is it reliable? No, not at all. However, there are many YouTube channels showing certain routes where this car is capable of driving itself from A to B without the need for manual intervention. Including parking.
cwillu 19 hours ago [-]
“Reliable” is implicit in “fully self-driving”, otherwise I could call my car self-driving when it manages to stay in its lane after I let go of the steering wheel.
razemio 12 hours ago [-]
I just checked the current numbers on FSD Tracker. It is at 95% now for the last 30 days. That's a number I would call reliable under supervision. It is way higher than it used to be. It was at 73% when I checked it the last time.

https://teslafsdtracker.com/

razemio 13 hours ago [-]
Again: present a name which fits and describes the capabilities of FSD. It can, under certain circumstances, drive a route completely on its own. This is not a lane assist, not (advanced) Cruise Control, not HDA, and not Super Cruise.
akmarinov 9 hours ago [-]
Naming is hard, but FULL self driving definitely ain’t it

Otherwise what would you name a L5 system that 100% drives autonomously all the time without any input?

cwillu 9 hours ago [-]
Exactly. They knew exactly what they were doing when they named it.
SAI_Peregrinus 49 minutes ago [-]
Partial Self Driving.
cwillu 10 hours ago [-]

   * trip control
   * trip assist
   * co-pilot
   * auto steer and throttle
   * partial self-driving (i.e., what it's actually doing!)
   * hands-free steering
ryandvm 6 hours ago [-]
I think you're ignoring that most car drivers are not versed in the tools and jargon of the air and shipping industries. It really isn't relevant what "autopilot" means in various professional contexts, what matters is what somebody with a high school education (or less) thinks that "autopilot" means.
jbs789 21 hours ago [-]
This is an interesting point. Maybe the problem is that most people don't drive boats or planes, so they are not familiar with the experience in those contexts. I think you're right: from the boat standpoint, "autopilot" means you set a heading and the sails/rudder are adjusted to get there.
quink 22 hours ago [-]
Here's what the official Tesla website has to say about FSD vs. Autopilot:

> In addition to the functionality and features of Autopilot and Enhanced Autopilot, Full Self-Driving capability also includes:

> > Traffic and Stop Sign Control (Beta): Identifies stop signs and traffic lights and automatically slows your vehicle to a stop on approach, with your active supervision.

> > Upcoming: Autosteer on city streets

Since I don't see a stop sign, or a traffic light, I cannot imagine how that makes any difference or can in any way be considered a complete f*k up, or how that's a "HUGE misrepresentation of facts". These things, listed here copied verbatim from the website of the manufacturer, are completely irrelevant to what was being tested here. It's like arguing that a crash test is invalid because the crash test dummy had a red shirt instead of a yellow one.

quink 22 hours ago [-]
Furthermore:

> Active Safety Features

> > Active safety features come standard on all Tesla vehicles made after September 2014 for elevated protection at all times. These features are made possible by our Autopilot hardware and software system [...]

No mention of FSD anywhere in that section. Tesla fanboys, pack it in.

TheAlchemist 21 hours ago [-]
The good old argument about the "latest" version on the "latest" hardware... As Tesla influencers have been saying for the past 5 years 'it will blow your mind'.

In the very first phrase he says "I'm in my Tesla on Autopilot"...

modeless 20 hours ago [-]
The video title says "self driving car". This is clearly dishonest when they intentionally did not test the feature named "self driving" in the car shown in the video thumbnail, nor disclose the fact that such a feature does exist and is significantly better than what they tested.

This video is a real missed opportunity. I would love to see how FSD would handle this and I hope someone else takes the opportunity to test it. In fact, testing FSD is such a trivially obvious idea that the fact that it's not even mentioned in the video makes me suspicious that they did test it and the results didn't match the narrative they wanted.

sschueller 22 hours ago [-]
If he doesn't know the difference how is the average car buyer that sees Elon sell such features supposed to know?
bryanlarsen 22 hours ago [-]
It's $8000 for FSD. You're going to know whether you bought it or not.
jayd16 22 hours ago [-]
You won't know you haven't, which is the whole point.
Hamuko 21 hours ago [-]
What if I buy second hand? I heard there's quite a few Teslas available on the used market.
zaptrem 22 hours ago [-]
To be clear, this is like buying a car that has traffic aware cruise control available as an option, but turning it down, then insisting the TACC is broken and dangerous because it doesn’t work on your car.
tredre3 22 hours ago [-]
That's a poor analogy, because what Mark was testing was emergency braking and collision avoidance, which are part of Autopilot.

https://www.tesla.com/support/autopilot#active-safety-featur...

zaptrem 21 hours ago [-]
The title of the video says "Self Driving Car", which can and does mislead viewers into thinking it's a test of Tesla's "supervised Full Self Driving" product, since they do not sell other products that use that term. FSD at one point had a significantly more advanced video-to-voxel model as part of the perception stack that possibly could have detected this wall (though I believe their planner is now end to end and only gets video input, so I'd be really interested to see if it fails here).
stavros 21 hours ago [-]
To be fair, the Tesla did definitely demonstrate full emergency breaking.
crooked-v 21 hours ago [-]
ba-dum tish
root_axis 19 hours ago [-]
So are you suggesting the 8k upgrade can detect some dangerous obstacles that the standard version can't? I doubt that.
iknowstuff 22 hours ago [-]
It wasnt even on autopilot when he crashed it. At best, he was testing automatic collision avoidance while pressing the accelerator, not autonomy.
mbreese 21 hours ago [-]
You see how this is worse, right? If automatic collision avoidance doesn’t work, why would you expect FSD to do better? (Or more to the point - why would a prospective buyer think so?)

And if collision avoidance doesn’t work better, then why isn’t FSD enabled on all cars — in order to fulfill this basic safety function that should work on all Teslas. Either way you look at it, this isn’t good. Expecting owners to buy an $8K upgrade to get automatic stopping to work is a bit much. Step one - get automatic stopping to work, then we can talk about me spending more money for FSD.

(And yes, I’m still bitter that my radar sensor was disabled).

iknowstuff 21 hours ago [-]
Tesla's automatic collision avoidance is the best on the market lol, you can trivially google it
SR2Z 14 hours ago [-]
Why don't you actually "trivially Google it" and see what comes up? Spoiler: it's not Tesla.

Normal, non-luxury cars from virtually every other manufacturer come STANDARD with radar. It's hooked into AEB and cruise control. Even new Civics come with radar AEB.

A radar-based system generally offers instantaneous speed and depth information. The best Teslas can do with cameras is guess at distance and then take finite differences to estimate speed - i.e. they make an estimate from an estimate.
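To see why that matters, here is a toy Python sketch of finite-differencing noisy depth estimates; all numbers are invented purely for illustration:

    import random

    DT = 0.05            # 20 Hz frame rate (assumed)
    TRUE_SPEED = -15.0   # m/s, closing on an obstacle 60 m away
    DEPTH_NOISE = 1.0    # +/- 1 m error per vision depth estimate (assumed)

    def camera_depth(t: float) -> float:
        # Ground-truth distance plus vision estimation error.
        return 60.0 + TRUE_SPEED * t + random.uniform(-DEPTH_NOISE, DEPTH_NOISE)

    # Finite difference: the errors of both samples add, then get divided
    # by a small dt, so a 1 m depth error alone becomes a 20 m/s speed error.
    vision_speed = (camera_depth(DT) - camera_depth(0.0)) / DT
    radar_speed = TRUE_SPEED  # radar reads radial velocity directly via Doppler
    print(f"vision: {vision_speed:+.1f} m/s, radar: {radar_speed:+.1f} m/s")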

The entire rest of the self-driving industry has known this forever - it's why Waymo, Cruise, Zoox, et al. (who have actually tackled the long-tail cases of autonomy) not only use radar/LIDAR, but slap multiple of them on their vehicles.

Even a human being cannot reliably brake based solely on visual information.

nickthegreek 20 hours ago [-]
This video literally shows you otherwise.
iknowstuff 18 hours ago [-]
The other car is not for sale. The video is sponsored by the lidar manufacturer.
dzhiurgis 18 hours ago [-]
Bro, he faked the video. How gullible do you have to be...
nickthegreek 17 hours ago [-]
Is this a vibe you have or a fact that you know? I'm for sure open to that possibility; please provide a link to the breakdown.
dzhiurgis 11 hours ago [-]
He disabled autopilot right before the crash. Before the crash you can see autopilot is on, with the rainbow road visualization; then it cuts to his reaction; then at the crash it's disabled - https://youtu.be/IQJL3htsDyQ?t=942

Caveat: Autopilot is known to disable itself right before a crash (if it senses it), though. I doubt this is the case here because he would obviously have mentioned it.

Whatarethese 3 hours ago [-]
It was. It disengaged when it knew it couldn't prevent the crash. You can see it in AP before the crashes.
Prickle 14 hours ago [-]
He's testing emergency braking. Surely it's not some pay-to-receive service? It should be always on, regardless of Autopilot or FSD.

It comes as a default feature with many new vehicles these days.

Are you trying to say that Tesla's FSD emergency braking and Autopilot emergency braking are different? It's emergency braking. None of that should matter.

And if it does matter, we are dealing with the possibility that Tesla is selling a deliberately worse product at a lower price point, in exchange for risking the lives of drivers and passengers.

There really is no meaningful difference here, because the result SHOULD be the same regardless of which feature was enabled or disabled.

Whatarethese 3 hours ago [-]
AP should be able to handle this and could if you used a lidar equipped Tesla from prior years before they went to pure vision.
ModernMech 2 hours ago [-]
They never had LiDAR; they did have RADAR but removed that.
XorNot 22 hours ago [-]
Why would FSD be any more capable? Like why would anyone expect that? I get how this could happen, but this isn't advanced navigation it's basic collision avoidance of things in front of the car, something I'd expect autopilot to do at a bare minimum.
hokumguru 22 hours ago [-]
They use completely different machine learning models for each of the two features
nickthegreek 20 hours ago [-]
So braking that won't hit children is soft-locked behind a paywall? I don't think that is a great argument for the prospective buyer.
iknowstuff 22 hours ago [-]
Entirely different software stack.
Prickle 14 hours ago [-]
That should also be entirely irrelevant.

It's emergency braking. There shouldn't be differing results based on whether you paid an extra $8,000 USD or not.

FSD does not advertise better emergency braking as a feature. (Last I checked, anyway.)

iknowstuff 2 hours ago [-]
The title shouldn't say self driving car then, should it.

Most if not all vehicles for sale will plow right through a styrofoam wall like that, regardless of what's painted on it.

XorNot 22 hours ago [-]
Like I said, I get how this could happen. But for a company claiming they've "almost" solved FSD, it is wild not to have deployed basic collision avoidance - a core, fundamental capability of FSD - into the hardware of a car they claim is completely capable of FSD with that hardware.
iknowstuff 21 hours ago [-]
No, collision avoidance has to put a greater degree of trust in the actual driver of the vehicle, the human, and only step in if it’s absolutely certain the human is about to hit something they did not intend to hit.

For example, when Tesla detects that you are about to accelerate into a human or a wall, they slow down the acceleration so that you have time to react, but they don’t stop it altogether.

That’s very different from FSD’s decision-making process.

jiggawatts 22 hours ago [-]
It has better depth-from-vision estimation that not only uses stereo vision, but can combine multiple frames.

In theory, it can handle the painting on a wall scenario. In theory.

I’d like to see it tested!

This wasn’t it.

losvedir 22 hours ago [-]
For a while, FSD was a totally different system from Autopilot. FSD was the giant new neural net training approach from end to end for all kinds of driving, while Autopilot was the massive explicitly tuned C++ code base for highway driving from the early days. The goal was to merge them, but I haven't followed it closely lately, so I don't know if they ever did.
timeon 21 hours ago [-]
I'm afraid HW4 is still not lidar-like.
thunky 19 hours ago [-]
> He tested using Autopilot, not the latest FSD on HW4, which is worlds apart in capabilities

Unless HW4 adds lidar then it doesn't matter. The video shows lidar outperforming cameras.

2OEH8eoCRo0 7 hours ago [-]
I'm also a massive fan of people and spell their name wrong.
dzhiurgis 17 hours ago [-]
He tested AEB; you can see autopilot is disabled right after the cut: https://youtu.be/IQJL3htsDyQ?t=942

This is faked as hell.

benatkin 22 hours ago [-]
He didn't completely f up; it was a valid test, and a misrepresentation of branding, not of facts. The connotation of the name autopilot overlaps significantly with FSD. Nobody forced Tesla to name it that rather than enhanced cruise control or something.

Testing a wider release rather than a more exclusive one? Meh.

db48x 21 hours ago [-]
Why do you say that? The autopilot of an airplane doesn’t dodge obstacles either.
rtkwe 21 hours ago [-]
TCAS on a wide variety of Airbus chassis can actually automatically fly the collision avoidance directives.

https://skybrary.aero/articles/autopilotflight-director-apfd...

Granted, it's muuuuch easier in planes because 1) they're already being routed to avoid each other by humans when the route is set up, 2) they continually announce their position with a high degree of accuracy, 3) generally speaking only two planes will be that close to each other at a time in the air, so deconflicting doesn't cause huge ripple effects, and 4) the sky is really big and also gives them a third dimension to play with, where streets are 2-dimensional.

db48x 15 hours ago [-]
I was a little bit imprecise. Because the article was talking about a stationary obstacle, literally a wall, I was still talking about stationary objects when I said that airplanes don't avoid obstacles. The autopilot of an airplane will not avoid any kind of stationary obstacle, not even a mountain. It can certainly be done; several militaries use terrain-following guidance in both airplanes and missiles. But there's nothing about the word "autopilot" that should cause people to think that the car will avoid stationary obstacles. An autopilot follows a route and that's it.

Some aircraft can avoid other aircraft under some circumstances but those aren’t stationary obstacles and unlike random walls that people build, or mountains, they are all tagged with transponders.

rtkwe 7 hours ago [-]
Except terrain-following radar also exists for military aircraft. Civilian planes don't have it because it's not something they want or need to do, not because the capability doesn't exist.

If the plane is anywhere near terrain on autopilot things have gone horribly wrong and a person should be flying anyways.

https://en.m.wikipedia.org/wiki/Terrain-following_radar

digitalPhonix 15 hours ago [-]
> The autopilot of an airplane will not avoid any kind of stationary obstacle, not even a mountain

Well a core part of the flight is colliding with the ground at the end.

But also the hazard of a stationary object doesn’t exist - crew receive an IFR clearance and fly that route; that route won’t have stationary obstacles.

Aviation’s autopilots aren’t “eyes-off” in VFR so that’s a meaningless comparison.

crooked-v 21 hours ago [-]
All the other drive assistance systems out there with proximity radar would have detected the wall. None of them claim a grandiose name like 'autopilot'.
gruez 21 hours ago [-]
>All the other drive assistance systems out there with proximity radar would have detected the wall

All? Of the 30 cars tested, 7 got "Poor" grades for IIHS's "front crash prevention rating". I spot checked a few of the failing cars and they all supposedly use radar.

https://www.iihs.org/news/detail/automakers-make-big-strides...

SR2Z 14 hours ago [-]
From your link:

> The poor-rated vehicles also struggled in the tests with the passenger car target. Most failed to slow enough in the 37 mph test with the target centered to qualify for additional AEB testing. However, in most trials with the passenger car and semitrailer, they delivered timely forward collision alerts.

Detecting a motorcycle is significantly harder than detecting a semitrailer, which is what this kind of wall would look like to a radar.

While walls like this are a contrived example, I distinctly remember a case from a few years ago where Autopilot failed to detect a semitrailer, drove under it at full speed, and decapitated the driver.

modeless 21 hours ago [-]
You are mistaken. Automotive radar does not detect stationary walls because it cannot tell the difference between a metal sign near/above the roadway and a wall blocking the roadway. Automotive radars are useful only for moving obstacles.
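A minimal sketch of the logic involved, with made-up fields and thresholds: every radar return carries a Doppler velocity, and a wall, an overhead sign, and a parked car all close at exactly your own speed, so a system that drops "stationary" returns to avoid phantom braking drops all three alike:

    from dataclasses import dataclass

    @dataclass
    class RadarReturn:
        range_m: float
        relative_speed_mps: float  # Doppler velocity relative to our car

    EGO_SPEED = 27.0  # m/s, our own speed (~60 mph)

    def is_stationary(ret: RadarReturn, tol: float = 1.0) -> bool:
        # A stationary object closes on us at exactly our own speed.
        return abs(ret.relative_speed_mps + EGO_SPEED) < tol

    returns = [
        RadarReturn(range_m=80.0, relative_speed_mps=-27.0),  # sign? wall?
        RadarReturn(range_m=50.0, relative_speed_mps=-5.0),   # slower car ahead
    ]
    # Dropping stationary returns avoids braking for every bridge and sign,
    # at the cost of also ignoring a wall across the road.
    tracked = [r for r in returns if not is_stationary(r)]
    print(f"tracked {len(tracked)} of {len(returns)} returns")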
desixavryn 22 hours ago [-]
Still, watching the video people will take away that Tesla FSD is not safe (whereas only Autopilot is relatively unsafe). Nothing could be further from the truth, considering I have been using some form of FSD/Autopilot since it started. One very interesting thing happens when you drive with FSD: because the FSD in-cabin camera is watching you extremely closely, it is literally like having two drivers driving.

I insist that my wife and kids drive with FSD, because then I know that FSD is forcing them to watch the road. Think about that! Use the latest FSD on HW4 for a while (I do 90% of my drives with FSD) and make an informed decision. IMO, not using FSD is killing/hurting people.

dgrin91 18 hours ago [-]
I watched the video; the Wile E. Coyote fake wall stuff is a gimmick meant to draw kids in. That, however, is par for the course for his videos; they are designed to hook kids into engineering with silly things and secretly teach real engineering before getting to the punch line.

In this case the real engineering lesson is that Tesla's choice of relying on visual cameras alone has fundamental issues. Namely, visually blocking elements such as heavy rain, fog, or even blinding lights pretty much cannot be overcome by camera-only sensors.

(though I guess one "solution" would be for the system to say I can't see enough to drive, so I'm not going to, but then no one would buy the system)

nashashmi 15 hours ago [-]
It is also a promotion for lidar tech, telling future engineers to be cautious of camera-vision-only driving systems. I agree with Mark, but not because of the tests he intentionally created to make lidar look better.

I really do hope camera-only tech will do better than this. But I also hope that lidar technology will eventually make it better. Right now, lidar needs much heavier computing power to be reasonably useful.

mordymoop 22 hours ago [-]
I don't live near a lot of Wile E. Coyote fake road walls. I get the sense it's more of a Midwest thing.
viraptor 22 hours ago [-]
But that wall may come to you. There are a few large truck trailers I've seen completely covered on one side with a picture that could be interpreted as a road / sidewalk / smaller car / whatever.
rocauc 21 hours ago [-]
In the 2019 fatal Tesla Autopilot crash, the Tesla failed to identify a white tractor trailer crossing the highway: https://www.washingtonpost.com/technology/interactive/2023/t...
ModernMech 5 hours ago [-]
And to be clear, this is the second time a Tesla failed to see a tractor trailer crossing the road and it killed someone by way of decapitation, the first being in 2016:

https://www.theguardian.com/technology/2016/jul/01/tesla-dri...

Notably, Tesla's response to the 2016 accident was to remove RADAR sensors from the Model 3. Had they instead taken the incident seriously and added sensors such as LiDAR that could have detected these obstacles (something any engineer would have done), the 2019 accident and this "cartoon road" failure would have been avoided. For this reason I believe the 2019 accident should be considered gross professional negligence on the part of Tesla and Elon Musk.

crooked-v 22 hours ago [-]
Also, any wall at a T-intersection with a mural that looks vaguely landscape-y.
taneq 22 hours ago [-]
I dunno, I generally try to avoid driving off the road into a landscape.

That said, I very well might drive into a high-res billboard of the exact scenery behind the billboard, if I wasn’t expecting it on a long straight country road. The in-car view in that video looks pretty damn convincing, sure you’d know something was off but I wouldn’t bet on 100% of drivers spotting the deception.

Maybe next video he can show that Autopilot won’t detect a land mine, or an excavation just after the crest of a hill.

crooked-v 21 hours ago [-]
The point of driver assistance systems is to be better at this stuff than humans, and most actually are because they don't leave it up to just cameras and "we'll figure out how to make it work as good as radar/lidar eventually, I promise".
bryanlarsen 21 hours ago [-]
> and most actually are

From the testing I've seen, there are no Automatic Emergency Braking systems on the market that are anywhere close to being as good as a human that's paying attention.

phil9909 20 hours ago [-]
The important catch here is "paying attention". I guess the way Tesla is marketing their "Full Self Driving" technology leads to people paying less attention. If you market your technology as an "Emergency Braking System", people will be less likely to stop paying attention.

Also, an emergency braking system doesn't have to be better than humans in general. It is sufficient to beat humans in certain facets like reaction time (a kid jumping onto the road) or tough lighting. LiDAR has its own weaknesses, I guess. But if both the driver and the LiDAR can hit the brakes, we get the best of both worlds.
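To make "best of both worlds" concrete, here is a back-of-envelope sketch; the miss rates are invented purely for illustration, and independence of the two failures is the big assumption:

    # Hypothetical, illustrative failure rates; not measured data.
    p_driver_misses = 0.10   # distracted, slow reaction
    p_system_misses = 0.05   # sensor limitation, edge case

    # If either one braking is enough and failures are independent,
    # the combined miss rate multiplies:
    p_both_miss = p_driver_misses * p_system_misses
    print(f"combined miss rate: {p_both_miss:.3f}")   # 0.005, vs 0.10 alone

    # Marketing that lulls the driver into not paying attention couples
    # the two failures and erases most of this benefit.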

davidcbc 20 hours ago [-]
The one used for comparison in the actual video certainly is in some situations. Lidar can see things that eyes cannot. They aren't perfect by any means and not necessarily better than humans in all situations, but they are clearly a net benefit
dzhiurgis 18 hours ago [-]
> they are clearly a net benefit

Do you have any data to prove it?

davidcbc 16 hours ago [-]
gruez 21 hours ago [-]
>There's a few large truck trailers I've seen completely covered on a side with a picture which could be interpreted as a road / sidewalk / smaller car / whatever.

Truck trailers are typically 3-4 ft off the ground, and have obvious borders.

thrill 18 hours ago [-]
They're obvious to you and me - are they obvious to a camera-only Tesla "autopilot"?
viraptor 21 hours ago [-]
Are you prepared to trust that the "obvious border" is enough? Would you fully trust that there will be no issue?
gruez 21 hours ago [-]
I'm prepared to trust that given the obvious border, the fact that the trailer is 3-4 feet off the ground (thereby creating obvious visual separation), and the fact that I've never seen a deceptively painted trailer, this isn't worth worrying about. I'd be far more worried about it not being able to recognize off-ramps, or "activists" vandalizing it as a political statement.
mikequinlan 22 hours ago [-]
>Midwest thing.

Southwest I think.

timeon 21 hours ago [-]
It also failed on the kid mannequin. Do you have many kids around?
Mo3 22 hours ago [-]
Aside from all the other obvious reasons not to get a Tesla these days, this is #1 imo. Camera feeds and a neural network are not enough for self driving, no matter how much training they do. Never ever.
nickthegreek 22 hours ago [-]
At the very least they seem to have downsides that can be easily overcome with a lidar system or a combination of sensors. That alone is enough to help me decide, when it's my family that would be the passengers.
crooked-v 22 hours ago [-]
And every other modern auto-braking safety system, except for Subaru for some reason, incorporates at least basic proximity radar.
FeloniousHam 6 hours ago [-]
Have you actually used FSD for, say, a week?

I'm a fan and everyday user of FSD. It's not perfect, but it's immensely useful, and frankly amazing. It works. It drives me to work and back every day, with maybe 1 or 2 interventions each way.

I've never met any actual Tesla driver who was confused by the marketing (they're all tech guys). Some like it, some don't think it's worth the money, but everyone understands what it is.

(I'm not arguing against more detection hardware, but engineering is about trade offs.)

bingofuel 3 hours ago [-]
But not every Tesla owner is a tech guy. Not every Tesla driver lives in perfect sunny conditions. And in this case, a misinformed (or not informed at all) buyer's "engineering trade off" is death.
hnburnsy 15 hours ago [-]
Agreed, but I believe that camera feeds and infrastructure changes could be enough.
bpodgursky 22 hours ago [-]
I am not claiming that Tesla FSD is at this point, but it is obviously possible to use cameras and neural networks to drive a car, because that is literally how humans drive.
ryandvm 5 hours ago [-]
The problem with that argument is that a Tesla isn't nearly as clever as a human. I have never thought a white tractor trailer disappeared as it crossed in front of me against a bright sky. I know that I should drive a little bit slower on Halloween evening because there are kids running around. I know that I need to be a little more cautious driving past the playground or dog park. I have object permanence.

As it is, AI just isn't smart enough to rely on only vision and they ought to be taking advantage of every economically viable sensor to make up for the difference. Hell, I thought the goal was to do better than humans?

stnmtn 2 hours ago [-]
I get your point, but do you think, over the course of a decade, the average human driver or the average car with Tesla FSD is more likely to have an accident where the fault is their own?
seszett 22 hours ago [-]
Do Tesla cars have stereo vision, though?

I don't think they build a 3D model of the world around them at all, while humans do (not only based on our stereo vision) and largely rely on that to drive.

sorenjan 20 hours ago [-]
They do build a 3D model: https://youtu.be/6x-Xb_uT7ts?t=129

Although I think it's interesting that even in his demo there are cars popping in and out of existence in the 3D visualization of the environment. I don't think that makes much sense, if a car is observed and then occluded the logical conclusion should be that it's probably still there, maybe add an increasing uncertainty to its location, instead of assuming it disappeared.
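What's described here is essentially the "predict without update" step of a Kalman-style tracker. A minimal sketch, with invented values:

    from dataclasses import dataclass

    @dataclass
    class Track:
        position: float      # along-road position, m
        velocity: float      # m/s
        uncertainty: float   # std-dev of position estimate, m
        missed_frames: int = 0

    def predict(track: Track, dt: float, process_noise: float = 0.5) -> None:
        # Coast the occluded track forward and grow its uncertainty.
        track.position += track.velocity * dt
        track.uncertainty += process_noise * dt
        track.missed_frames += 1

    def should_drop(track: Track, max_missed: int = 60) -> bool:
        # Drop only after a long occlusion, not the instant the car vanishes.
        return track.missed_frames > max_missed

    car = Track(position=30.0, velocity=10.0, uncertainty=0.5)
    for _ in range(20):        # one second of occlusion at 20 Hz
        predict(car, dt=0.05)
    print(f"still tracked at ~{car.position:.0f} m, +/- {car.uncertainty:.1f} m")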

bryanlarsen 20 hours ago [-]
Tesla switched from a 3D model with the popping glitchiness you describe to a 4D model without the popping several months ago.
SR2Z 14 hours ago [-]
Sure, but during the entire time of Autopilot/FSD so far, Teslas have been crashing in ridiculous ways and killing their drivers.

The "but they've patched that failure!" excuse has ALSO been around, and while the cars are definitely getting better, they're still also definitely killing people in ridiculous ways.

ModernMech 22 minutes ago [-]
Probably because they're not actually patching anything, but playing a shell game.

If they were serious about patching issues, they would have added LiDAR already. They're trying to software-patch their way out of the fact that their perception stack is essentially blind to huge classes of objects drivers encounter every day.

andsoitis 22 hours ago [-]
Apparently, Tesla has stated that they do not use paired cameras for ranging or depth perception, which is the standard approach for stereoscopic vision.

Instead of stereoscopic vision, Tesla's neural net processes the data from the cameras to create a 3D view of the surroundings.

https://electrek.co/2021/07/07/hacker-tesla-full-self-drivin...

iknowstuff 22 hours ago [-]
They do have 2-3 front-facing cameras, and their end-to-end network necessarily builds the same understanding of the world around it.
vel0city 22 hours ago [-]
Humans have more senses than just vision. Also, a few cameras don't completely cover the range of human vision.
intrasight 22 hours ago [-]
Yes, but I don't think you could argue against the point that the visual sense is probably 95% of it. But even so, it could be decades before computers achieve the sensory capabilities of the human visual system. Why not, during those decades, add some lidar to the sensory mix? Just because it didn't evolve in humans doesn't mean it's not a good thing. Bats and dolphins use the biological analogue very effectively.
bcrl 21 hours ago [-]
Vision isn't the only problem. Can computers do a decent job of theory-of-mind of other drivers? What about seeing a tire bouncing down off an overpass that's visible to the driver? Software developers love saying that it's the 95% that matters, but in safety critical systems, that simplification simply does not work.
intrasight 18 hours ago [-]
> Software developers love saying that it's the 95% that matters

Perhaps if they're early in their career. But any who have done challenging projects know that it's the last 5% that's the hardest and that often matters the most.

iknowstuff 21 hours ago [-]
Their cameras have better dynamic range than human eyes, not to mention the neural net can consume multiple exposures simultaneously, from surround video.
vel0city 17 hours ago [-]
> Their cameras have better dynamic range than humans

This is obviously untrue if you've ever watched videos from sentry mode.

iknowstuff 16 hours ago [-]
Sentry mode videos are SDR.
aplummer 19 hours ago [-]
Also humans suck at driving compared to what we expect of machines
bpodgursky 22 hours ago [-]
? Do you use your sense of smell to drive a car? Are deaf people allowed to drive cars?

No, the cameras we have now, or at least the data processing, are probably not there yet, but it's absurd to claim it's "never" possible. It's obviously possible, whether in a year or in fifteen years; all the basic hardware is advancing fast.

vel0city 17 hours ago [-]
> but it's absurd to claim it's "never" possible.

I've never made such a claim. I use self driving features in my car a good bit. They use more than just cameras though.

> Are deaf people allowed to drive cars?

Sure, but I'd say someone unable to hear things would be disadvantaged at driving and potentially less safe. But you've also completely ignored the sense of feeling: understanding the feedback you feel through the wheel. Ignoring the feel of the wheels on the road. Ignoring the g-forces as the car moves. Ignoring the body sway when it's windy.

You're also ignoring as others have mentioned the theory of mind of how other things actually behave around the streets. We're still not 100% sure these automated systems can actually comprehend these things.

All things a computer could sense, no doubt. But it's far more than just a few cameras.

bpodgursky 14 hours ago [-]
> Camera feeds and a neural network are not enough for self driving, no matter how much they're training. Never ever.

This is the OP claim I am saying is absurd.

tredre3 22 hours ago [-]
> it's absurd to claim it's "never" possible

Absolutely correct. The problem is that Elon has been saying that for ten years and we're not really closer to it being true.

So, in the meantime, would it kill him to admit that he was wrong about lidar?

manquer 22 hours ago [-]
The actual problem statement is: can such a system do so safely? If it is just to drive a car, all you need are some stepper motors and a Raspberry Pi.

The goal is to reduce accidents and fatalities, not eliminate jobs.

If LiDAR has 1/10th the fatality rate of a camera setup and costs less than 10x the value of a human life (as used in legal proceedings), then it is still the only viable option.

rtkwe 21 hours ago [-]
Neural nets generally don't have the same level of interconnection and neuron count as the human brain, because that's tougher to train. I agree in principle that the human brain should be possible to replicate in a computer, but I'm not sure we can do it with the current design of single-direction connections between nodes. I highly doubt we'll be able to scale it down to a machine that's reasonable to fit in a car any time soon, and that's what Tesla is promising they can do with the current hardware package installed in cars (never mind that they also promised it would be possible with the last major revision, had to walk that back, and will have to install upgraded electronics in people's vehicles; unless they're just going to strand all the HW3 owners who paid for FSD but whose hardware is too wimpy to handle it, if that hasn't changed since the robotaxi event).
nickthegreek 20 hours ago [-]
Don’t we want it to be like way better than humans at driving? The dream pitch was that we wouldn’t have tens of thousands of preventable deaths every year. So install the damn lidar. At some point it is coming down to penny pinching and that means that preventable deaths number will not sink to the promise.
Aloisius 22 hours ago [-]
I think artificial neural networks were implied, and there is a world of difference between how biological and artificial neural networks function.

That said, I'm not sure that's what the barrier is. I think humans would have trouble driving with a fixed low resolution camera too.

ModernMech 5 hours ago [-]
Eyes are not cameras and brains are not neural networks. Until we have artificial techniques that can match the performance and functionality of this amazing biological equipment we have, our cars must rely on other sensors like LiDAR to make up for the deficiencies in our artificial perception networks that are not overcome by current techniques.
jayd16 21 hours ago [-]
A human is not just an NN, but I do wonder how well a human could drive given only the Tesla camera feed. Seems like it would surely be worse than being behind the wheel.
Detrytus 22 hours ago [-]
1. The human "neural network" is a few orders of magnitude more powerful than the best neural network available right now.

2. This might be an obvious question, but: humans have two eyes that are a few centimeters apart so that we can use stereoscopic vision to estimate distance. Does Tesla FSD even attempt to do something similar?
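For reference, stereoscopic depth follows directly from the disparity between the two views: Z = f * B / d. A quick sketch with made-up camera numbers shows why a narrow baseline limits useful range:

    FOCAL_LENGTH_PX = 1400.0  # focal length in pixels (assumed)
    BASELINE_M = 0.10         # separation between the two cameras (assumed)

    def depth_from_disparity(disparity_px: float) -> float:
        # Z = f * B / d: depth of a feature shifted disparity_px between views.
        if disparity_px <= 0:
            raise ValueError("feature must appear shifted between the views")
        return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

    # With a 10 cm baseline, 2 px of disparity is already 70 m away, so a
    # 1 px measurement error swings the estimate by tens of meters:
    for d in (20.0, 5.0, 2.0, 1.0):
        print(f"disparity {d:4.1f} px -> depth {depth_from_disparity(d):6.1f} m")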

bpodgursky 21 hours ago [-]
It is legal and safe to drive with 1 working eye.

Humans have many ways of modeling distances in the real world that do not rely on stereoscopic depth.

beepbooptheory 22 hours ago [-]
Hubert Dreyfus rolling in his grave..
lowmagnet 19 hours ago [-]
I do wonder if this is provable via information theory.
sMarsIntruder 22 hours ago [-]
You started with a bias and ended with another one.
dist-epoch 21 hours ago [-]
> Camera feeds and a neural network are not enough for self driving

I guess we should ban humans driving then.

sMarsIntruder 13 hours ago [-]
I'm noticing how comments with common sense get easily downvoted.
desixavryn 21 hours ago [-]
Have you used the latest FSD on HW4 recently? If not, please try it out for a few days and then come back to correct your comment :-)
windward 9 hours ago [-]
Honest question: why does HW4 matter? Hasn't FSD been sold as a service that all cars with older hardware can use?
Hamuko 21 hours ago [-]
Have they managed to fix the "parking sensors"?
desixavryn 21 hours ago [-]
Using vision only on my HW4 Model Y, it seems to work fine.
modeless 21 hours ago [-]
Adding to the weirdness of this video, it appears Mark Rober faked his footage to make it look like he was using a Google Pixel to record screen video, but he was actually using an iPhone as can be seen in the screen reflection. And he put the "G" logo in the wrong orientation in the faked shot.

Also it's weird that he's acting like he's so special for having seen the inside of Space Mountain as if it's some kind of secret. Millions have seen it all lit up. Back when the PeopleMover/Rocket Rods attractions were running it was a common sight, as the track ran through Space Mountain and sometimes it would be under maintenance with the lights on. And of course in emergency situations they turn the lights on as well.

Another one: he claims they use thin curtains to project ghosts on in the Haunted Mansion which is true, but while he's talking about it he shows footage of a part of the ride that uses Pepper's ghost which is a completely different (and more interesting) technique. Some of the ghosts he shows while talking about it could not be achieved with the curtain method.

Come to think of it, Pepper's ghost could fool lidar. Maybe that's why he didn't talk about it even though it would have been more interesting. It would have been inconvenient for his sponsor. Someone setting up a reflective surface across a road is probably about as likely in the real world as a Wile E. Coyote-style fake road wall.

hnburnsy 15 hours ago [-]
Mark posted a video on X showing him getting up to speed, engaging autopilot 4 seconds before the wall, and autopilot disengaging 1 second before hitting the wall.

https://x.com/MarkRober/status/1901449395327094898

>Here is the raw footage of my Tesla going through the wall. Not sure why it disengages 17 frames before hitting the wall but my feet weren’t touching the brake or gas.

The playing field here was significantly slanted: a production Tesla, driven by Mark, engaging 10-year-old autopilot technology, versus a non-production test vehicle, not driven by Mark, using what I would assume is the latest in LiDAR technology from Luminar.

Volvo sells the EX90 with a Luminar LiDAR sensor (though it looks like it's not active). Why wasn't it used, with Mark driving?

sitkack 22 hours ago [-]
We already knew this to be true from the clusters of Tesla fatalities around certain Bay Area off-ramps.
93po 20 hours ago [-]
This is factually just not true, at all. Like, outrageously not true. Why would you just completely make something up like this?
sitkack 18 hours ago [-]
What are you talking about?

1) Numerous reports of Teslas not seeing tractor trailers or fire trucks.

2) Numerous reports, even on this site, of Teslas under lane assistance repeatedly and predictably behaving erratically on CA off-ramps.

Any optical only system will suffer from optical illusions, this cannot be avoided.

sitkack 4 hours ago [-]
93po 1 hours ago [-]
Literally at the top of the page:

> Tesla Deaths is a record of Tesla accidents that involved a driver, occupant, cyclist, motorcyclist, or pedestrian death, whether or not the Tesla or its driver were at fault

Even the FSD stat says "FSD related" and not "FSD at fault".

Yes, Teslas are involved in accidents that cause death like literally every other car on the road. However to claim they're somehow uniquely dangerous, when in fact the stats show that they're uniquely safe, is ridiculous.

sMarsIntruder 22 hours ago [-]
This statement goes against every report and analysis of basic autopilot, not even FSD.

Just like the video, which, if you'd take some time to watch it, you'd see is just basic autopilot.

The data says other things, but if we want to deny gravity, I'm OK with it.

davidcbc 20 hours ago [-]
What data?

If you link to Tesla statements that's marketing not data.

sMarsIntruder 13 hours ago [-]
davidcbc 5 hours ago [-]
Yes, it is, and it is a misleading comparison. It compares Tesla autopilot, which can only be used in certain circumstances, to the US average, which includes scenarios where autopilot wouldn't be used.
sitkack 21 hours ago [-]
Chill bro, you can have FSD when it comes out.
anotherboffin 22 hours ago [-]
Oh but no worries, FSD is a “solved problem” and should be done in 18 months or so…
devnullbrain 22 hours ago [-]
Oh dear, your timeline casts doubt on the ability of a Tesla to self-drive from LA to New York before the end of 2017.
sMarsIntruder 22 hours ago [-]
In fact, he didn’t use FSD.
timeon 21 hours ago [-]
Neither did the other car, but that one did not fail.
ravenstine 22 hours ago [-]
Remember, you gotta break some eggs to make an omelette; every time something crashes, explodes, or kills – that's a good thing! /s
deedubaya 22 hours ago [-]
Where can I buy the alternative lidar based car?
randerson 22 hours ago [-]
That looked like a Lexus with its insignia blacked out, so I'm guessing it was a custom build. But if you want a car that comes with LIDAR, look at the Volvo EX90.
hnburnsy 15 hours ago [-]
Not in use yet on the EX90, just collecting data...

https://www.volvocars.com/us/cars/ex90-electric/features/

>Lidar equipped on early production vehicles will begin in data collection mode and over time the technological capabilities of the lidar will expand to include additional sensing scenarios as part of our Safe Space Technology, including the ability to detect objects in darkness at highway speeds up to 820 feet ahead.

crooked-v 22 hours ago [-]
GM is working on it, but there's no indication yet when the lidar version will be released. Currently their Super Cruise uses radar and cameras, plus pre-scanned lidar maps of roads that it compares everything against in real time.
hnburnsy 15 hours ago [-]
Volvo EX90 has the sensor, but it is not active.
wmf 20 hours ago [-]
Polestar 3 is the only car with lidar in the US that I know of.
hnburnsy 15 hours ago [-]
Not yet...

https://www.polestar.com/us/polestar-3/safety/#lidar

>LiDAR upgrade subject to availability based on production cycles. Deliveries to start mid-2025.

jayd16 21 hours ago [-]
If you get a 2020 (or 2019?) Model 3 you get the proximity radar as well.
bspinner 21 hours ago [-]
Since it could be misleading without this info: radar has been removed from newer model years.
chvid 22 hours ago [-]
In China.
wnevets 22 hours ago [-]
Teslas also drive into tractor trailers because they think they're clouds.
tzs 17 hours ago [-]
The test would have been more interesting if it had included a couple more production cars. From what I've read, only a couple of cars are available with that expensive Luminar LiDAR.

I'd like to see how more ordinary automatic braking systems would fare, which I think usually use radar and a camera.

hnburnsy 15 hours ago [-]
>The test would have been more interesting if it had included a couple more or so production cars. From what I've read only a couple of cars are available with that expensive Luminar LiDAR

There are none you can buy in the US that have LiDAR active, but certainly Luminar could have gotten one from its partner Volvo and had Mark drive that instead of the professionally driven test rig that was used.

rocauc 21 hours ago [-]
I wonder how long until techniques like Depth Anything (https://depth-anything-v2.github.io/) provide parity with human depth perception. In Mark Rober's tests, I'm not sure even a human would have passed the fog scenario, however.
hnburnsy 15 hours ago [-]
Fair questions from a Tesla Fan Boy...

https://x.com/SawyerMerritt/status/1901481711789109259

>I have some questions:

  1) Your video "Can You Fool a Self-Driving Car?" uses Luminar’s latest tech but not Tesla’s latest FSD software. Why?

  2) Autopilot was turned on at 42 MPH in your YouTube video but you turned it on at 39/40 MPH in your clip above. Why? Multiple takes?

  3) In the clip above, Autopilot was activated MUCH closer to the wall than in the YouTube video clip. Why?

  4) In your video above, you turned on Autopilot 3.8 seconds before hitting the wall, but it appears you gave Luminar a much longer head start with their tech "activated." Why? Am I wrong in my assumption?

  5) Why was putting a child dummy/doll behind the wall a useful thing to do? What car would possibly see or react to a kid through a wall after crashing into that wall?
eldaisfish 22 hours ago [-]
Here is the original Mark Rober video - https://www.youtube.com/watch?v=IQJL3htsDyQ

I dislike the fact that Mark's videos appear to increasingly borrow from the Mr Beast style, which is very distracting. There's also the fact that half the video has nothing to do with cars in the first place.

The main result here is not surprising - Tesla's vehicles are plagued by a litany of poor engineering decisions, many at a fundamental level. Not using Lidar for depth detection is beyond stupid.

nickthegreek 22 hours ago [-]
I found the Disney side of the video extremely interesting and entertaining. It would have been nice to pad each topic out individually with enough content to stand on its own while meeting the criteria he needs to appease the algorithm.
jbaber 22 hours ago [-]
I know what you mean. But he's trying his best to be the new Mr. Wizard and youtube has a very long list of demands.
pavel_lishin 22 hours ago [-]
> I dislike the fact that Mark's videos appear to increasingly borrow from the Mr Beast style, which is very distracting.

Yeah. My daughter likes him, but this latest thumbnail made me roll my eyes.

jnwatson 21 hours ago [-]
Don't hate the player, hate the game. This is what attracts the eyeballs.
pavel_lishin 21 hours ago [-]
I think I'm allowed to dislike the game, and to stop watching his videos.
2OEH8eoCRo0 22 hours ago [-]
The first half explains lidar in layman's terms.
ricardobeat 21 hours ago [-]
Shouldn’t they have tested a human driver too? I have the feeling a majority of drivers would also go right through it if unaware of the setup, as it’s such an inconceivable scenario.
crooked-v 21 hours ago [-]
The whole point of driver assistance systems is to be better at this stuff than human drivers, and most of them, with much less grandiose marketing, would have "seen" the wall in time to emergency-brake.
riehwvfbk 21 hours ago [-]
And the whole point of this "test" was to go "hurrrr Elon is dumb" and get the "smart" people to click.
nickthegreek 20 hours ago [-]
The thing that made Elon look dumb was his car not doing what his competitors' cars can, after dismissing and removing the tech that enables this safety.
elaus 20 hours ago [-]
Sorry, but did you even watch the video? I found the tone to be quite neutral, and he gives Tesla credit where credit is due. Besides, the video isn’t even about Tesla or “Elon”; it’s just one of multiple applications of LIDAR discussed in this video.
rtkwe 21 hours ago [-]
A human would (or at least should) slow down going through the water and super-dense fog that fooled the Tesla. From the shot behind, I could spot the kid through the water blasts; I don't know how it looked from inside the car.

edit: on rewatch you can pretty clearly see the kid through the rain.

TrackerFF 22 hours ago [-]
But seriously, how much more would Tesla have to spend on each unit if they were to include a LIDAR sensor? Why not optical and LIDAR?
bspinner 20 hours ago [-]
Or reintroduce proximity radar at least.
CaffeineLD50 18 hours ago [-]
Nothing new here in this well-produced video championing LIDAR.

Tesla now suffers from a toxic brand, which is way worse for its future than missing lidar.

sMarsIntruder 13 hours ago [-]
This is probably the right answer to all the other arguments I read above and below.

Just an easy target for haters now and then.

riehwvfbk 21 hours ago [-]
> The last scenario of a Wile E. Coyote-style wall with a fake road painted on it was obviously not realistic

...shows a photorealistic road on said wall. Last I checked, human drivers didn't have organs capable of LIDAR. Most would have crashed into this ridiculous obstacle.

silok 12 hours ago [-]
I think this is a valid point; the Tesla is not necessarily worse than humans. This whole test setup is built around the advantages of lidar, which of course will detect the obstacle.

Human driving through painted wall:

https://www.youtube.com/watch?v=0NRerHC7g-0

dzhiurgis 18 hours ago [-]
He actually disabled the autopilot; you can clearly see the system is off: https://youtu.be/IQJL3htsDyQ?t=942
hnburnsy 15 hours ago [-]
Mark posted a video on X showing him getting up to speed, engaging autopilot 4 seconds before the wall, and autopilot disengaging 1 second before hitting the wall.

https://x.com/MarkRober/status/1901449395327094898

Did he do the same with LiDAR only turning it on 4 seconds before an impact?
