Tesla (TSLA) chief Elon Musk has been saying for years that fully autonomous vehicles are right around the corner. In 2016, he said it would take only two or three years for a Tesla to drive better than a human. When that didn’t happen, he said in 2018 that it would be possible to remotely summon a Tesla from across the country. 

And in 2019, he claimed that robotaxis by 2020 were more than doable. 

“I know I’m the boy who cried FSD,” he said on Tesla’s second-quarter earnings call. “But man, I think we’ll be better than human by the end of this year.”

Related: Engineering whistleblower explains why safe Full Self-Driving can’t ever happen

Tesla’s Full Self-Driving, however, remains stuck with an important caveat: though the car can drive itself reasonably well in most situations, it requires the driver’s constant hands-on, eyes-on attention, which keeps the system at a Level Two designation. 

The thing currently preventing Tesla from making that much-anticipated jump to a Level Three system, according to Omer Keilaf, CEO and co-founder of Innoviz Technologies, is Musk’s strange aversion to lidar, a sensor that uses laser pulses to measure range and build a three-dimensional picture of a car’s surroundings. Innoviz has been a global manufacturer of the technology since 2016.

The different levels of autonomy

A Level One system, best exemplified by something like adaptive cruise control, is a driver-assistance feature where the driver remains in full control of the vehicle. A Level Two system, the designation Tesla’s FSD currently holds, is an advanced driver-assistance system (ADAS) in which the car can handle primary driving functions but the driver must be prepared to take over at any moment (hands-on, eyes-on). 

Self-driving robotaxis from Waymo have been expanding into new territories across the country. 


Teslas will issue a “nag” if the driver takes their eyes off the road or their hands off the wheel; repeated violations can result in the suspension of FSD access for that vehicle. 

A Level Three system, which Mercedes-Benz (DDAIF) just started to roll out in the U.S., allows for conditional automation (hands-off, eyes-off). The conditions at this stage are specific and extremely limited: the outside environment must be clear and bright, the route must have been previously mapped by Mercedes, the car must be going less than 40 mph and there must be a car in front of the Mercedes. 

A Level Four vehicle, such as a robotaxi, is highly automated and requires no human intervention within its defined operating area. It runs on specific routes or within a mapped zone and is programmed to simply stop itself if it runs into a problem. And a Level Five system would be that “better than human” model that Musk is always promising.
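
For reference, the ladder described above can be summarized compactly. The sketch below is purely illustrative of the levels as this article describes them; the names and descriptions paraphrase the paragraphs above, and the code itself is hypothetical, not any automaker’s implementation.

```python
# Illustrative summary of the driving-automation levels described in this article.
from enum import Enum


class AutomationLevel(Enum):
    LEVEL_1 = "driver assist (e.g. adaptive cruise control); driver fully in control"
    LEVEL_2 = "advanced driver assistance (Tesla FSD today); hands-on, eyes-on"
    LEVEL_3 = "conditional automation (Mercedes Drive Pilot); hands-off, eyes-off in narrow conditions"
    LEVEL_4 = "high automation (robotaxis); no human needed within a defined operating area"
    LEVEL_5 = "full automation; the 'better than human' system Musk keeps promising"


for level in AutomationLevel:
    print(level.name, "-", level.value)
```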

The problem with Tesla’s approach

For a system to reach Level Three autonomy, it needs to have a baked-in redundancy in the event that the main system fails. This redundancy allows for the “eyes-off” approach. 

In a Tesla, the main system is the car’s cameras and neural networks.

“Humans drive with eyes & biological neural nets, so makes sense that cameras & silicon neural nets are only way to achieve generalized solution to self-driving,” Musk said in 2021. 

If that system fails, the only redundancy is the human driver. 

“In the Level Two system, you have cameras that are allowing the car to make driving decisions, but if they fail — and they can fail due to low light conditions or direct sun or a drop of water that blurs the camera — you need to have something to back it,” Keilaf told TheStreet in an interview. In a Tesla, that backup sensor “is actually the driver,” which means eyes have to stay on the road at all times.

For an automated system to move beyond Level Two, there needs to be a sensory backup. 

“You need to have a sensor that can back the car, that can provide backup to the camera,” Keilaf said. “The only sensor that is capable of seeing sufficient resolution and range etc. is the lidar; in order to move away the driver you need to add a lidar.”
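
A minimal sketch can make the redundancy Keilaf describes concrete: a Level Three stack cross-checks camera perception against an independent ranging sensor and falls back to a safe stop only if both degrade. Everything here is hypothetical and simplified, assuming a single camera and a single lidar reading per cycle; it is not Tesla’s or Innoviz’s code.

```python
# Illustrative sketch of sensor redundancy: prefer the camera, fall back to lidar,
# and execute a minimal-risk maneuver only when no sensor can see.
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional


class SensorHealth(Enum):
    OK = auto()
    DEGRADED = auto()  # e.g. low light, direct sun, a water droplet on the lens
    FAILED = auto()


@dataclass
class Reading:
    source: str
    nearest_obstacle_m: Optional[float]  # None if the sensor cannot see


def plan_cycle(camera: Reading, camera_health: SensorHealth,
               lidar: Optional[Reading]) -> str:
    """Pick this cycle's plan, never depending on the camera alone."""
    if camera_health is SensorHealth.OK and camera.nearest_obstacle_m is not None:
        # Primary path: camera-based perception drives the car.
        return f"camera plan (obstacle at {camera.nearest_obstacle_m} m)"
    if lidar is not None and lidar.nearest_obstacle_m is not None:
        # Redundant path: the independent sensor is what makes eyes-off possible.
        return f"lidar plan (obstacle at {lidar.nearest_obstacle_m} m)"
    # No redundancy left. A Level Two car hands control back to the driver;
    # a Level Three car must bring itself to a safe stop.
    return "minimal-risk maneuver: slow and stop"


if __name__ == "__main__":
    blinded_camera = Reading("camera", None)
    lidar_view = Reading("lidar", 42.0)
    print(plan_cycle(blinded_camera, SensorHealth.DEGRADED, lidar_view))
```

In a Level Two car like a Tesla, the fallback branch is the human driver rather than a second sensor, which is exactly why the eyes-on requirement cannot be dropped.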


Though Musk remains convinced of the power of his cameras and neural nets, Keilaf said that, at the moment, the only way to build in that redundancy is with lidar.

“I think today it’s pretty clear that they are stuck in a glass ceiling,” Keilaf said of Tesla. “And I expect them to adopt the lidar at some point because they are going to be stuck in Level Two and all of the other car companies are going to leapfrog them just by using a lidar.”

Indeed, Mercedes’ Level Three Drive Pilot system takes advantage of lidar, radar, ultrasonic sensors and cameras, a combination that experts have said is the only way to operate a truly safe self-driving car. 

Though Keilaf isn’t sure when Level Four and Level Five systems will become the norm, building out Level Three systems is the first step toward reaching that point of general autonomy. 

“It’s a platform that will allow customers to continue to build confidence. It’s eventually a question of confidence, because from a capability point of view the vehicle is already capable of being Level Five; it’s a matter of whether it’s sharp enough,” he said. “The more kilometers you gather, the more you will actually be able to calculate your risk.”
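
That point about kilometers and risk can be illustrated with a simple, standard statistical bound (an illustration only, not Keilaf’s or Innoviz’s method): if a fleet logs n kilometers with no safety-critical failures, the “rule of three” puts an approximate 95% upper bound of 3/n on the per-kilometer failure rate, so every additional clean kilometer tightens the estimate.

```python
# Illustrative only: turning accumulated kilometers into a rough risk figure.
def observed_failure_rate(km_driven: float, failures: int) -> float:
    """Point estimate of safety-critical failures per kilometer."""
    return failures / km_driven


def zero_event_upper_bound(km_driven: float) -> float:
    """Approximate 95% upper bound on the failure rate when no failures
    have been observed over km_driven kilometers (the 'rule of three')."""
    return 3.0 / km_driven


if __name__ == "__main__":
    print(zero_event_upper_bound(1_000_000))    # 3e-06: at most ~3 failures per million km
    print(observed_failure_rate(1_000_000, 2))  # 2e-06: observed rate with 2 failures
```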

Tesla is currently facing a number of investigations into the safety of its FSD technology. A wrongful-death lawsuit over a fatal crash involving Autopilot went to trial Sept. 28. 
