Self-driving cars are sort of the holy grail of automobile manufacturing. Perhaps it's the futuristic allure of climbing into a vehicle that is able to transport you from point A to point B while you work, take a nap, or just enjoy the scenery. Or, maybe it's the idea that we'd all be more productive if we didn't have to waste time with tedious tasks like changing lanes, or navigating to an important meeting.

There's also, I suppose, an argument to be made that computers can (in theory) handle many of the functions of keeping a car safely on the road and moving in the right direction better than a human. Except, we're clearly not there yet.

Tesla's Autopilot capabilities (described by the company here) have come under intense scrutiny after a series of crashes, including at least one resulting in a fatality. That crash was the subject of an NTSB report released Tuesday, which found the car's Autopilot feature at least partially at fault.

The main findings of the report included the following:

The Tesla's Autopilot lane-keeping assist system steered the sport utility vehicle to the left into the neutral area of the gore, without providing an alert to the driver, due to limitations of the Tesla Autopilot vision system's processing software to accurately maintain the appropriate lane of travel.  

The report also mentions that "Tesla's collision avoidance systems were not designed to, and did not, detect the crash attenuator at the end of the gore... consequently, the forward-collision warning system did not provide an alert and the automatic emergency braking did not activate." 

As a result, the Model X collided with the attenuator and then crashed into two additional vehicles. The driver later died from blunt force trauma sustained in the accident.

To be fair, the problem isn't just the limitations of the Autopilot capabilities. Another real problem is that drivers treat those capabilities as a substitute for what they're supposed to be doing while they sit behind the wheel of a vehicle moving at more than 70 mph.

The NTSB was clear that its investigations "continue to show that the Tesla Autopilot system is being used by drivers outside the vehicle's operational design domain." At the same time, it criticized Tesla for not limiting the conditions in which Autopilot can be used.

Here's the problem: when you brand your driver-assistance features "Autopilot," people assume they're capable of more than they really are. The name gives the sense that the car is able to drive itself. That's simply not true.

Tesla does offer additional features it describes as "Full Self-Driving Capability," but even those are only designed to control the vehicle in certain situations, like driving on the highway. As a result, drivers make the mistake of believing their cars can handle situations the technology isn't ready for.

Or, even more importantly, we're just not yet ready for the technology.