
Both Uber and Tesla suffered major self-driving setbacks in recent weeks. Both companies were involved in fatal traffic accidents: Uber, when its vehicle struck a pedestrian, and Tesla, after a Model X driver, Walter Huang, died when his vehicle struck a concrete median in Autopilot mode. Huang's family has hired the law firm Minami Tamaki to investigate the situation and has claimed a preliminary study shows the Autopilot system is clearly deficient. Tesla, meanwhile, continues to put the blame entirely on the driver.

"(Our) preliminary review indicates that the navigation system of the Tesla may accept misread the lane lines on the roadway, failed to detect the physical median, failed to restriction the car, and drove the machine into the median," Minami said.

The family has claimed that Huang complained about problems with Autopilot in that specific area. Tesla argues that these very points undermine any argument that Autopilot was to blame for Huang's death. The company released a statement to ABC News laying out its position.


According to telemetry from the Model X, the driver's hands were not on the wheel for the six seconds preceding the crash, despite multiple warnings to re-engage with it.
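Driver-monitoring systems of this kind generally escalate alerts as hands-off time accumulates. The Python sketch below is purely illustrative; the thresholds and the fallback behavior are assumptions made for the example, not Tesla's published logic.

```python
# Illustrative hands-off alert escalation. All thresholds are assumed
# for the example; Tesla's actual values and behavior are not public.
VISUAL_WARN_AFTER_S = 2.0   # hypothetical: show a visual nag
AUDIBLE_WARN_AFTER_S = 4.0  # hypothetical: add an audible chime
SLOW_DOWN_AFTER_S = 6.0     # hypothetical: begin slowing the car

def alert_state(hands_off_seconds: float) -> str:
    """Map accumulated hands-off time to an escalating alert state."""
    if hands_off_seconds >= SLOW_DOWN_AFTER_S:
        return "slow_down"
    if hands_off_seconds >= AUDIBLE_WARN_AFTER_S:
        return "audible_warning"
    if hands_off_seconds >= VISUAL_WARN_AFTER_S:
        return "visual_warning"
    return "no_alert"
```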

The problem here goes deeper than the question of whether Tesla's Autopilot is linked to Huang's death. Proponents of self-driving cars have often pointed to the fact that tens of thousands of people die in automotive accidents every year. Human-controlled driving isn't pretty, and your average human isn't particularly good at it. Self-driving vehicles could very well improve on a bad situation.

But not much ink gets spilled on the inevitable transition periods, during which self-driving cars aren't going to be as good at driving as their human counterparts. It's easy to explain Level 5 self-driving to people, because that's when the car is going to be capable of doing everything. The lower levels, which give the vehicle partial control in certain circumstances, can only function properly if the driver is completely aware of their limitations and capabilities.
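Those levels come from the SAE's J3016 standard. Here's a minimal Python sketch of the distinction that matters (the identifier names are informal, not SAE's): at Level 2, where systems like Autopilot sit, the human is still required to supervise at all times.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving-automation levels (names here are informal)."""
    NO_AUTOMATION = 0
    DRIVER_ASSISTANCE = 1       # steering OR speed assist, one at a time
    PARTIAL_AUTOMATION = 2      # steering AND speed; driver must supervise
    CONDITIONAL_AUTOMATION = 3  # system drives; driver takes over on request
    HIGH_AUTOMATION = 4         # no driver needed within a limited domain
    FULL_AUTOMATION = 5         # no driver needed anywhere

def driver_must_supervise(level: SAELevel) -> bool:
    """Below Level 3, the human must watch the road the entire time."""
    return level <= SAELevel.PARTIAL_AUTOMATION
```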

A recent video shot by someone from the Chicago area attempted to replicate the California accident and very nearly succeeded. The vehicle heads directly toward a slab of concrete before the driver takes the wheel again. In the absence of any indication that Huang was attempting to commit suicide or suffered a heart attack, stroke, or equivalent event, we can at least conclude his death was unintentional and that he took his hands off the wheel because he believed the Tesla Autopilot would accurately guide the vehicle. And while the Uber crash from last month isn't the main focus of this story, we can also assume that the driver in that incident had no intention of killing a pedestrian.

The fundamental problem with self-driving vehicles that aren't capable of full, robust, Level 5 operation (and none of them currently are) is that at some point, the vehicle is going to decide it can't handle road conditions. The human driver may or may not be aware that that decision has been made. Even if the driver is aware of it, he or she might not be able to react quickly enough to prevent an accident. And the smarter these systems get, the greater the risk that the car might make one decision to evade a catastrophe while the driver attempts to take a different, conflicting action.
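A minimal sketch of that handoff problem, assuming a hypothetical takeover budget and a hands-on-wheel sensor: if the human doesn't respond in time, the only safe option left is a minimal-risk maneuver, such as slowing to a stop, rather than continuing blindly.

```python
import time

TAKEOVER_BUDGET_S = 4.0  # hypothetical time allowed for the human to respond

def request_takeover(hands_on_wheel) -> str:
    """The system has decided it can't handle conditions and asks the
    driver to take over. `hands_on_wheel` is a callable standing in for
    a torque or capacitive sensor on the steering wheel."""
    deadline = time.monotonic() + TAKEOVER_BUDGET_S
    while time.monotonic() < deadline:
        if hands_on_wheel():
            return "driver_in_control"
        time.sleep(0.05)  # poll the sensor
    # Driver never responded: fall back to slowing and stopping in lane.
    return "minimal_risk_maneuver"
```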

Self-driving cars really could revolutionize transportation long-term. They could change the dynamics of vehicle ownership, help older people retain self-sufficiency, and slash the rate of death associated with drunk driving, distracted driving, and exhausted driving. The fact that some of these gains could take several decades to fully arrive, given how long it takes vehicle fleets to turn over, is no reason not to pursue them. But uncertainty around self-driving vehicle intelligence and operational characteristics is still a problem today, and it's going to be a problem for the foreseeable future. The liability questions aren't going to go away any time soon.

The NTSB has revoked Tesla's status as a party to the investigation of the crash. In order to work alongside the agency, Tesla is required to respect the confidentiality of the investigation. In taking the position that Huang was solely responsible for the incident, Tesla broke that requirement. The NTSB has a less rosy view of Autopilot's current functionality than Tesla does.