This is the saddest tech news of the year so far.

Today, Tesla announced that a driver was killed last month in Florida when a Model S in self-driving mode struck a tractor-trailer on a divided highway. Tesla reported the incident to the NHTSA and posted about it on its blog today for the first time.

The blog post explained the details of the crash this way:

"Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S."

While there have been incidents with Google cars in California, including one accident involving a bus, this is the first reported case of a fatality related to autonomous driving.

The news comes as a shock to the automotive industry, Tesla employees, and even the robotics industry because Autopilot mode is the first commercially available self-driving tech that can steer the car and handle the acceleration and braking. It's not fully autonomous and only works on the highway. Other car companies, like Volvo, offer somewhat similar technology. On the Volvo XC90, for example, the car can also steer, brake, and speed up, but only for short periods.

The Tesla Model S is the most advanced car on the road, and this is a major setback for a company that is prepping the lower-cost Model 3. However, it's important to understand what it means for self-driving cars. I've driven most of the latest cars that offer similar technology, extensively tested the Model S, and ridden in the Cruise Automation test car, and there have been mishaps. I can tell you that we live in a dangerous world and there are always unknown variables. Part of what is holding up autonomous driving is that many variables remain untested--e.g., road rage incidents and sudden swerving.

However, Tesla has done a good job of warning drivers about Autopilot mode. When you drive, if you do not place your hands on the wheel after a while, or if the car drives erratically, you will see a prompt. If you go too fast on a side road, it will disable itself. While some drivers initially recorded videos of themselves with their hands and feet away from the controls, reading the paper, most of the drivers I've met and the experts I've talked to have said those were mostly meant as viral videos.

There is a danger, but this is not a game-changer for robotic driving. It's a sensitive issue, and a recent report suggested that the technology has a long way to go before it can make moral decisions, but in the age of automation, accidents will happen. I know friends and family who have been involved in serious accidents. The main point to make here is that robotic tech can save lives: the car can steer out of most situations and brake for you. I've had an Audi brake for me suddenly to avoid an accident, and after that you become well aware of how it can help.

Say a prayer for the family, and proceed with caution.