Tesla Says Autopilot Makes Its Cars Safer. Crash Victims Say It Kills (nytimes.com)
34 points by fortran77 on July 6, 2021 | hide | past | favorite | 87 comments


So he changed lanes without checking for traffic, crashed, the child wasn't wearing a seat belt and died, and now they're suing.


Apparently he was forced to change lanes. But the reason they're suing seems to be that the Tesla autopilot did not slow down at all and the Tesla's driver was distracted.

In my opinion, this is yet another case of people being lulled into a false sense of security that the car drives itself, when in reality the Tesla's driver should have been attentive and should have reacted.

"The police faulted the Tesla driver — not the car — for his inattention and his driving at an unsafe speed"


> In my opinion, this is yet another case of people being lulled into a false sense of security that the car drives itself, when in reality the Tesla's driver should have been attentive and should have reacted.

In every similar story I fail to imagine how the driver can possibly stay in a state of "active attention" 100% of the time while not actively doing anything. That's not how most of the human race works.

At this point it's a lot easier to just actively drive, instead of having to make a sudden split-second decision while dulled by inaction. I feel like anything short of fully autonomous driving just doesn't make sense.


It's also why airplane pilots do much more than they theoretically would have to – under normal conditions, a modern autopilot should be able to handle the whole flight without human supervision.

But then the human won't react fast enough when they're needed to handle an unforeseen problem, so they're kept in the loop and active, just in case.


I wonder how many flaps you could possibly check on your Tesla during your commute (I'm being sarcastic).

Also, I could imagine that an airplane is, from a totally theoretical and simplistic point of view, much easier for a computer to manage, if only because it has a lot of space to maneuver and constantly knows what other airplanes are doing.

But driving a car? How can a computer possibly handle a scenario where edge cases make up a good 85% of each trip, unless you drive in the desert?


It doesn't have to be a 1:1 replacement. The computer has some advantages that can make up for its lack of brains: lower latency, no distraction, better sensors, better math when it comes to motion ("objects in the mirror may be closer than they appear") and more training than a human will ever experience.


I totally agree. But also, critically, an auto-pilot is precisely expected to have "a brain", it's implicit in the name.

Maybe we should just agree that autopilot, in its common interpretation, is currently just a sci-fi concept, and communicate more honestly what a computer inside a car can and can't do.

Apparently, dodging weird obstacles is something it struggles with. Which is a problem, since all driving is, essentially, dodging weird obstacles.


I agree with you. So maybe the system could expect the driver to actually drive.

I've never driven a Tesla, but I've seen a similar thing in a Citroën I rented once. On the highway, with cruise control, the car could practically drive itself. It would follow the lane, slow down if cars in front got too close, accelerate back to the set speed when they went away, etc.

Once or twice I lifted my hands off the wheel completely, and pretty quickly it would start complaining. So with this system, the driver actually had to stay engaged.


I'm pretty sure a Tesla also starts alerting and alarming if you take your hands off the wheel


How is it being "forced to change lanes" when traffic in front of you slows down? The correct reaction is to hit the brakes yourself and switch to another lane only if it is clear, but that was not the case here.


He indicated for at least 3 seconds first. That's not really a panic swerve. He might have simply misjudged the existence or speed of the Tesla.


> a truck in front of them slowed

Doesn't sound like "forced to change lanes" to me.


> So he changed lanes without checking for traffic

The article says no such thing.

You could just as easily interpret the events as: "He put on his turn signal, which Autopilot ignored."

A human driver should have been driving defensively and realized they needed to give the person space.

A human driver usually would have hit the brake as soon as they saw a turn signal.


> The article says no such thing.

If you change lanes on a freeway, and a vehicle is coming up behind you at normal freeway speeds, the only way for you to be surprised by it is if you didn't properly check for traffic.

> You could just as easily interpret the events as: "He put on his turn signal, which Autopilot ignored."

It's both. He didn't check properly and Autopilot ignored the turn signal.

> A human driver should have been driving defensively and realized they needed to give the person space.

I suppose, but if you're in a crash and your argument is that the other driver should have been defensive then you're basically admitting you caused it. (in situations with two drivers and no mechanical failure)

> A human driver usually would have hit the brake as soon as they saw a turn signal.

"Usually" I'd agree is true, but lots of drivers would expect to pass and then let them change lanes behind.


It's not just that the Tesla driver ignored the turn signal; they also ignored traffic conditions. Left-lane traffic had slowed down enough that the Tesla is seen passing 2 cars on their right before hitting the truck. California law puts responsibility on all drivers to drive at a "reasonable or prudent" speed given the conditions. Maintaining a speed where you're quickly passing cars on the left and right — and not slowing down at all despite a vehicle signaling for 3 seconds and then being fully in your lane for about 1 second — is arguably not "reasonable or prudent".

https://leginfo.legislature.ca.gov/faces/codes_displaySectio...


I don't think the speed at which they were passing other cars was unreasonable. My best estimate is slightly above 50 for the white car, versus the Tesla going 69-70.

Not slowing down for the signal is bad behavior but I don't know if you're obligated to slow down for a signal and it's not really a driving speed issue. Someone signalling isn't a road condition.


It's better to assume he did check properly. The problem is the car behind didn't slow down AT ALL, that would be VERY surprising, leading to the last second attempt to swerve.


It's possible, but if you enter a lane in a way that requires the car behind you to brake, that's just as bad as not checking. And just letting off the accelerator right after seeing the turn signal would not have slowed the Tesla down enough to avoid a collision, that situation needed braking.

Also, a driver on the freeway continuing to go the same speed should not be "VERY surprising"!


You're right, it does seem like the truck likely would've been forcing the car behind to brake. And that's bad behavior. I wonder what speed the truck was actually going at. Letting off the accelerator probably would've only slowed it down to 60mph.

On the other hand the Tesla probably shouldn't have been going that fast when the speeds of cars in both adjacent lanes are going that much slower. I'd expect a human driver would typically feel more wary in such a situation, wary for that exact situation of a truck changing lanes.


Does a turn signal give you right of way in the US? It's usually an indication that you will switch lanes, but is there any rule that forces the drivers in the lane being switched into to accommodate you?


No, it doesn't give you right of way in any state in the US. But it doesn't stop drivers in the Bay Area from thinking it does.


Not from the US so don't know what the rules say, but I do agree with the parent's comment.

"Driving defensively" doesn't mean "give them the right of way". It means that if they put on the blinker, they're probably thinking of changing lanes even though it may not be safe to do so (and don't have the right of way).

It's the type of situation where it's better to be safe than right.


I have said, and will continue to say, that the greatest sin of Tesla is that using autopilot requires good judgement from the drivers, and that is an unreasonable expectation.


While I don't have any criticism of your opinion, I've noticed that people who say "I'm always saying ..." usually seem to have a simplistic and rigid understanding of what they're talking about. It's a way to mark yourself as probably a bit of a crank, unwilling to have any interesting conversation about the topic.


That's a fair observation. I may be a bit of a crank :-D


>Driving safely requires good judgement and that is an unreasonable expectation.


Autopilot that requires constant attention of the pilot is not really an autopilot, is it?


If you have ever flown an airplane with autopilot you would realize you are completely wrong.


Most people have never flown an airplane, let alone one with autopilot.

To the layman, autopilot and self-driving are synonymous (as epitomised by this scene from The Darwin Awards: https://youtu.be/PNF7Ru1mMeo)


That might be true, but that is not the user's expectation and marketing at Tesla knows this isn't the user's expectation.


You're right, and they should never have named their driving assistance tech autopilot because it gives a false impression of its capabilities.


>An autopilot is a system used to control the trajectory of an aircraft, marine craft or spacecraft without requiring constant manual control by a human operator. Autopilots do not replace human operators. Instead, the autopilot assists the operator's control of the vehicle, allowing the operator to focus on broader aspects of operations (for example, monitoring the trajectory, weather and on-board systems)

https://en.wikipedia.org/wiki/Autopilot


So the Autopilot in a Tesla is designed to allow the driver to check the weather and plan her destination?

Aircraft don't often cluster together meters apart and travel in the same directions. If they did, Autopilot would certainly be used differently. This is why I think it is irresponsible to use the term for automobiles.


>for example,


Yes, a terrible example, for the reasons above.


I mean, have you seen some of the people on the road?


They're literally calling it "Full Self-Driving Capability" though (with the upgraded version).


hardware


The story and the video evidence assert that his turn signal was on for a full 4 seconds before making the lane change


I don't know how driving in the US works, but here in Germany a turn signal doesn't give you any rights to change lanes if there is oncoming traffic, no matter how long it's been on. The driver changing lanes is most likely at least partially at fault here.


The NYT story cites Tesla court filings about the police's determination — "The police faulted the Tesla driver — not the car — for his inattention and his driving at an unsafe speed," Mr. McCarthy wrote. IANAL, but given no other info about the investigators' conclusions, I wouldn't necessarily assume that the truck driver didn't have *some* fault, and California isn't a state where it's either/or when it comes to blame — i.e. both drivers can share partial blame and recover damages.

California has a couple of laws [0] that mandate "reasonable or prudent" speed, e.g.

> No person shall drive a vehicle upon a highway at a speed greater than is reasonable or prudent having due regard for weather, visibility, the traffic on, and the surface and width of, the highway, and in no event at a speed which endangers the safety of persons or property.

The truck met the requirement [1] for signaling a change (100 feet beforehand). Is it "reasonable or prudent" for the Tesla driver to maintain speed (nevermind accelerate at the last second) with the truck slowly entering the lane? Especially with the left lane traffic having slowed down enough that, in the span of the 5 second clip, the Tesla is seen passing 2 cars on their right before hitting the truck?

[0] https://leginfo.legislature.ca.gov/faces/codes_displaySectio...

[1] https://leginfo.legislature.ca.gov/faces/codes_displaySectio...


Legally, yes. But anybody who has driven a truck, or even a utility vehicle with blind spots, knows to give way when a truck activates its blinkers. Honestly, even before I knew how bad they were, I was careful, because my driving instructor told me a dozen times about truck blind spots.


In Germany you are not allowed to pass on the right, so racing into a car merging from the left would be right out.


Indeed, sitting in a taxi on US highways was a very scary experience with cars overtaking on both sides.


Overtaking is a big word; maybe you meant passing? With the keep-your-lane doctrine, that is to be expected.

Here in Switzerland we are allowed to PASS traffic on the right-hand side too (since Jan. 1, 2021). But you are NOT allowed to overtake, as in actively changing lanes.


I admit that I'm not terribly familiar with the correct vocabulary even in my native language. Thanks for pointing that out. I meant passing.


The truck touches the white line at 2 seconds, crossed into lane at 3 seconds, fully in the lane at 4 seconds.


Did you even read the article?

It never said he changed lanes without checking for traffic. And the police put the blame squarely on the Tesla driver.


As the Tesla was going at a constant speed, the person switching lanes must not have checked for it.


In many countries passing on the right at high speed is illegal. If traffic in the left lane was coming to a stop, then it is irresponsible for the cars on the left to pass at 65 mph.


Passing cars that stand still without slowing down doesn't have to be illegal in order to be stupid. There's clearly enough blame to share in this situation.


> In many countries passing on the right at high speed is illegal.

Are those countries that also don't let you stay in the left lane?

But even if it's illegal to pass on the right, it's still a terrible idea to change lanes to the right without looking behind you!


I did. Since he moved into the path of a car that AIUI had right of way and wasn't showing any sign of yielding, even after several seconds, I assume that he didn't notice, and that he didn't notice because he didn't really look.

(Do vehicles already in a lane have right of way in the US, relative to vehicles who might cross into that lane from another?)


That's not true at all. If you watch the video, you can see the lane was clear for basically 5 car lengths (1 second per car length), and he absolutely had right of way. He gave ample signaling time and was gradually merging into the lane, which would give any normal driver plenty of time to respond. At the last second he realized the Tesla wasn't slowing down AT ALL, and he attempted to swerve back.


> you can see the lane was clear to basically 5 car lengths (1 second per car length)

5 car lengths is less than a second at freeway speeds. By the time the truck started changing lanes, you can look at points on the ground moving and see that it was about a second ahead of the Tesla. Changing lanes that close, without speeding up into the new lane, was not safe, autopilot or not.


I was referring to the rule of seconds. For example, below 40mph it's recommended to leave 2 seconds between you and the car in front. Above 40mph, it's recommended to leave 3 to 4 seconds between you and the car in front.

The car length in this context is just the length the car travels per second.


> The car length in this context is just the length the car travels per second.

Is that how that term is used?

Either way, that truck was far under that suggested amount of space when it crossed into the Tesla's lane. It was around one second, well under two, let alone more. Measuring against the ground, of course, not relative speed.

At the speed limit, vehicles are going about 100 feet per second. At the start of the video they appear to be about 3 dashed lines apart (120 ft?) and by the time the truck is about to switch lanes they're two dashed lines apart (80 ft?)
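The arithmetic behind these estimates is quick to check. A small sketch, assuming the common US marking cycle of roughly 40 ft per stripe-plus-gap (an assumption; the spacing varies by state):

```python
def mph_to_fps(mph: float) -> float:
    """Convert miles per hour to feet per second (5280 ft/mi, 3600 s/h)."""
    return mph * 5280 / 3600

DASH_CYCLE_FT = 40  # assumed stripe-plus-gap spacing; varies by state

# At ~68 mph the vehicles cover roughly 100 feet every second:
print(round(mph_to_fps(68)))  # 100

# Three dash cycles of separation is ~120 ft, i.e. about 1.2 s of gap:
print(round(3 * DASH_CYCLE_FT / mph_to_fps(68), 1))  # 1.2
```

So both readings of the video are consistent: "three dashed lines" and "a bit over one second" describe the same gap.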


The video was 5 seconds long, the truck was already signalling and slowly merging into the other lane at the beginning of that 5 seconds. It takes a human driver 1.5 seconds to react, there was 3 whole seconds for a human driver to slow down properly.
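As a rough sanity check of that timing claim, here is a back-of-the-envelope sketch using the speed estimates given elsewhere in the thread (Tesla ~69 mph, truck ~50 mph) rather than any official figures:

```python
def mph_to_fps(mph: float) -> float:
    """Convert miles per hour to feet per second."""
    return mph * 5280 / 3600

tesla_fps = mph_to_fps(69)         # ~101 ft/s
closing_fps = mph_to_fps(69 - 50)  # ~28 ft/s closing speed (thread estimate)

reaction_s = 1.5  # commonly cited driver reaction time
window_s = 3.0    # time available, per the comment above

# Ground covered while the driver merely reacts:
reaction_ft = tesla_fps * reaction_s
print(round(reaction_ft))  # 152 ft

# Deceleration needed to shed the closing speed in the remaining time:
decel = closing_fps / (window_s - reaction_s)  # ft/s^2
print(round(decel / 32.2, 2))  # ~0.58 g: hard but achievable braking
```

Under these assumed numbers, an attentive driver braking after 1.5 s would still have had enough time to avoid the collision, though it would have taken firm braking.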


There definitely was time for the Tesla to react. But I'm discussing the truck driver's actions. They signaled for a reasonable amount of time, then cut in front of the Tesla despite it not slowing down. And I'm questioning your claim of a 5-second gap, because when you follow the ground movement there wasn't that much space between the vehicles, especially once the truck actually crossed between lanes.

Honestly the truck driver would have been better off not signaling, or changing lanes immediately upon signaling, because then there would have been much more space and they probably wouldn't have slowed down so much.


He was rear-ended.


If you read to the end, you'll see that you're really wrong. Try reading first, then commenting.

> The video saved by the car [the Tesla] Mr. Yalung was driving shows it passing vehicles on the right and left. Four seconds before impact, Mr. Maldonado turned on his blinker. It flashed four times while his Explorer was in its original lane. A fifth flash came as his truck was straddling the lanes. In court documents, Mr. Maldonado said he had noticed the Tesla approaching rapidly in his rearview mirror and tried to swerve back.


That's victim blaming. Yes, the driver should have taken better precautions - the driver is responsible for ensuring all passengers wear seatbelts, and for looking behind - but it's the Tesla driver that crashed into the truck. Whether they had enough time to react, or whether the truck cut them off, is probably apparent in the footage; that's up to a judge to decide.

Anyway, autopilot is not to blame; if anything comes out of the lawsuit, it's that Autopilot should not be called that, because it lulls drivers into a false sense of security. In the end, the driver is still responsible, and the family should put the blame on the driver of the Tesla, the person actually responsible. But in court they might not win, since the driver of the truck may have made an unexpected maneuver, and the death of the child may be blamed on the truck driver because the child wasn't wearing a seatbelt.


Are seat belts mandatory in California for passengers?

In New Zealand the second quickest way to get a ticket is driving without a seat belt.


> Are seat belts mandatory in California for passengers?

Yes, they are. The driver is responsible for ensuring all passengers under 16 years old are buckled in.


You made me curious: what is the quickest way to get a ticket in NZ?


Drive real fast.


Running a red light, probably.


Both can be right. What matters is the ratio of deaths caused to deaths avoided.


Rationally that's what matters, but the response of the courts and regulators could make autopilot uneconomic.


Human fault aside, anybody care to speculate why Autopilot failed here? Was it a shortcoming of the radar, the vision system, or the fusion of the two? Is ultrasound used at all for AP or automatic emergency braking?

I recently watched a great talk by Andrej Karpathy about Tesla's transition away from radar towards vehicle speed estimation using only vision: https://youtu.be/gZ2SeiLjaEc

I wonder if it would have prevented the tragic outcome here.


The NYT had a Carnegie Mellon professor examine the data and video, and quotes his speculation:

> In most of the video, the Tesla maintained a speed of 69 miles per hour, but just before impact it briefly increased to 70 m.p.h. then slowed in the final second, according to data from the car.

> Mr. Rajkumar of Carnegie Mellon, who reviewed the video and data at the request of The Times, said Autopilot might have failed to brake for the Explorer because the Tesla’s cameras were facing the sun or were confused by the truck ahead of the Explorer. The Tesla was also equipped with a radar sensor, but it appears not to have helped.

> “A radar would have detected the pickup truck, and it would have prevented the collision,” Mr. Rajkumar said in an email. “So the radar outputs were likely not being used.”


My guess is that it's an issue with sensor fusion.

Radar throws a lot of false readings, so there are many situations where it is ignored.

For example, readings of stationary objects are mostly false.

My guess is that the speed difference was so big that, in this case too, Tesla decided the radar reading was a stationary object and therefore false.
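To illustrate the kind of filtering being speculated about (a generic sketch, not Tesla's actual logic): radar measures closing speed, so a system can estimate an object's ground speed and discard returns that look stationary, since those are usually bridges, signs, and other roadside clutter.

```python
def object_ground_speed(ego_speed: float, closing_speed: float) -> float:
    """Estimate an object's speed over the ground from the ego vehicle's
    speed and the radar-measured closing speed (all in mph here)."""
    return ego_speed - closing_speed

def keep_return(ego_speed: float, closing_speed: float,
                threshold: float = 5.0) -> bool:
    """Drop returns whose ground speed is below the clutter threshold."""
    return abs(object_ground_speed(ego_speed, closing_speed)) >= threshold

# A truck moving at 10 mph ahead of a 70 mph car survives the filter:
print(keep_return(70, 60))  # True
# An overpass, closing at exactly the ego speed, is discarded:
print(keep_return(70, 70))  # False
```

The hard part in practice is choosing that threshold: set it too high and slow-moving obstacles get thrown away with the clutter, which is one plausible reading of the speculation above.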


Hmm, I think if a vehicle is moving even 5 mph it would be easily picked up by radar, and it's only truly stationary things that are a radar "blind spot".

I suspect it's a vision or fusion problem -- as the article points out, the lighting wasn't great.

Why doesn't the car use ultrasound for AEB? Perhaps the risk of false positives, coupled with the fact that its range is so short it would only have time for limited braking anyway?


Stationary objects, also called obstacles. These systems can give people a false sense of security, and false expectations, exactly at the wrong times. That kind of unreliability may be worse than having to actually drive and stay attentive as a driver.


I'm not sure what the limitations of radar are, but doesn't seem like it was being used in this crash considering the Tesla car just drove right into the truck.


Radar is a critical component of Autopilot and every other automatic cruise control system to my knowledge. So it was almost certainly active. A Tesla with failed radar system would not allow the driver to engage AP.


I don't believe it's a critical component of Autopilot. From the last public video release in 2019 [1], it seems like they use the stereo cameras to derive depth information, and use radar as a kind of ground-truth in informing the model. I would think the source-of-truth for what the Autopilot system thinks is the position of objects is the stereo camera, i.e. the radar is not part of the online flow, but is used offline to correct the model.

See specifically at 2:20:22 [2]:

> We use that radar to annotate what the vision is seeing, the bounding boxes that come out of the neural networks, so instead of human annotators telling you that this car, in this bounding box, is roughly 25 meters away, you can annotate that data much better using sensors.

I take that to mean that the radar is used as part of supervised learning for updating the model offline, whereas just the cameras are used as input to the model.
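If that reading is right, the auto-labeling step might look something like the following sketch. All names and structures here are hypothetical illustrations of the idea, not Tesla's pipeline:

```python
from dataclasses import dataclass

@dataclass
class BoundingBox:
    """A detected object in image coordinates (values hypothetical)."""
    x: float
    y: float
    w: float
    h: float

def auto_label(boxes, radar_ranges):
    """Pair each camera bounding box with the radar range matched to it,
    yielding (features, depth_label) examples for offline training --
    replacing a human annotator's distance estimate with a measurement."""
    return list(zip(boxes, radar_ranges))

# One training example: the vision model learns to predict ~25 m depth
# for this box without a human having to guess the distance.
examples = auto_label([BoundingBox(0.4, 0.5, 0.1, 0.1)], [25.0])
print(examples[0][1])  # 25.0
```

The point of the design, as described in the talk, is that radar only has to be right during training; at inference time the cameras carry the whole load.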

[1] https://youtu.be/Ucp0TTmvqOE?t=8254

[2] https://youtu.be/Ucp0TTmvqOE?t=8422


I find it interesting that there seems to be a self-selection process here. Irresponsible people seem to be attracted to "hey, I can just put the car in autopilot and ignore the requirement that I have to stay attentive." Unfortunately these people are a menace to the rest of us driving on the road. And these same people are going to get a technology banned that could save lives and prevent injuries.


Regardless of who is at fault, I don't understand this love affair with the Tesla Autopilot. Relax, but remain vigilant with your hands on the wheel? This doesn't make sense to me.

You're either operating the vehicle or you're not. The hard part of daily driving is paying attention, not slightly turning the wheel to change lanes. What is the purpose of the current Autopilot?


Autopilot lulls drivers into a state of incompetence. This is an involuntary physiological response that humans cannot control. It’s like giving alcohol to the driver, where Tesla is the bartender.




I'm sure the subeditor deliberately chose a headline that set up some cognitive dissonance.

But I'd lay long odds that both things are true: the Tesla Autopilot does indeed make its cars safer, and it also kills people. A similar thing can be said about vaccination: it kills people, while on average making them safer.

We've seen what a mess that situation has created for vaccines, driving the rise of the anti-vaxxer movement. I wonder if we will see the rise of an anti-autopilot movement. Judging by some of the comments here, perhaps we already have.


Can anyone explain how US traffic rules operate?

Where I live if someone enters your lane and you hit them it is always their fault. You have right of way in your lane that is absolute - no one can cut you off.

So why is Tesla driver at fault?


In California's vehicle code, there's a rule requiring drivers to keep a reasonable speed based on conditions:

https://leginfo.legislature.ca.gov/faces/codes_displaySectio...

> No person shall drive a vehicle upon a highway at a speed greater than is reasonable or prudent having due regard for weather, visibility, the traffic on, and the surface and width of, the highway, and in no event at a speed which endangers the safety of persons or property.

Maintaining the speed limit as another car gives 4 seconds of signaling and spends at least a full second in your lane is arguably not "reasonable".

Where you live, if someone makes a reasonable signal to move into my lane, and I decide to not brake and rear-end them — you're saying that would be an automatic payout in my favor?


> Where you live, if someone makes a reasonable signal to move into my lane, and I decide to not brake and rear-end them — you're saying that would be an automatic payout in my favor?

Even if you have an obligation to brake and mitigate the problem, the fact that you have to brake sure sounds like they are the actual cause of the problem.


It varies from state to state in the US.



