Sometimes you really don't want to have to say, "I told you so." Some of Tesla's employees are feeling that way right now. A few weeks ago, the company revealed that a Model S in autopilot mode crashed into the side of a truck on May 7, after it failed to see the truck against a brightly lit sky. The human aboard was killed. 

Today, CNN reports that several Tesla employees (most of them anonymous) said they tried to warn CEO Elon Musk and other top execs that the technology was not reliable enough to trust with people's lives. One said he was pulled over on suspicion of drunk driving while using it. Though the company tells drivers to remain attentive to the road and put their hands on the wheel while in Autopilot, this employee believed some consumers would ignore those warnings. "I was scared someone was going to die," he told CNN.

A Tesla spokesperson declined to comment for this piece but asked that we quote this statement previously given to CNN: "Safety is a top priority at Tesla. We constantly build updates to our software and specifically Autopilot features to continue to improve the feature and driver experience."

However, several employees told CNN their safety concerns were dismissed as overly cautious by Musk and other top execs. Musk's approach was "don't let concerns slow progress," they said. Since the accident, Musk has pointed to statistical data that shows one fatality for every 89 million miles driven by humans, whereas the May accident is the first known fatality in more than 130 million miles of Autopilot driving. If autonomous cars are safer than human-driven ones, the reasoning goes, we'll save more lives by bringing the technology to market as quickly as we can, even if there are a few fatal crashes along the way.

The limits of data

This attitude makes perfect sense for the data-driven Musk, and it might even make sense to anyone looking at the issue from a purely statistical viewpoint. The problem is that basing decisions purely on data means failing to see the whole picture. Consider the very real likelihood that the Tesla crash will make people everywhere distrust autonomous cars and substantially slow their adoption in the market. Or that Congress may respond by introducing stringent limits on them. Either outcome could well put fewer autonomous cars on the road, not more.

To make matters worse, Musk has displayed disturbing arrogance throughout this episode. After the crash was announced, various observers--and the Securities and Exchange Commission--questioned why the company did not disclose the accident before a $2 billion public offering that took place about 10 days after the crash. In a series of condescending tweets, Musk told Fortune that the fatal crash "was not material" to Tesla's business outlook. Fortune pointed to the quarterly report Tesla had filed just three days after the crash, which warned that "...we face inherent risk of exposure to claims in the event our vehicles do not perform as expected resulting in personal injury or death," and specifically called out Autopilot as a technology that could result in such claims and materially affect financial performance.

Even more arrogantly, Tesla has ignored calls for the simplest of fixes: Change the name of "Autopilot" to something else, as Consumer Reports and auto industry experts have requested. Although the FAA defines autopilot systems much the way Tesla does--as something that can assist a human who must nevertheless remain alert and ready to take over--popular perception is that autopilot systems can fly planes on their own while the pilots nap.

But at least Musk gave in on one important item. The Tesla employees report that he originally wanted to let drivers play videos on the car's central console but abandoned the idea after several of them raised liability concerns. Good decision. The driver whose Tesla crashed into the truck was reportedly watching a Harry Potter video on a portable DVD player at the time.