Car crashes cause more than 37,000 deaths a year in the United States. Self-driving cars sound like a great idea in theory, right? But recent crashes involving autonomous vehicles, along with persistent problems with the technology, may delay their widespread use for at least a decade.

Here are five reasons to think twice when we talk about self-driving cars.

They can be hacked.

Researchers have reported "hacking" self-driving cars simply by placing stickers on ordinary street signs. The stickers exploit vulnerabilities in the car's visual classification system: with a few easy-to-make stickers, researchers were able to make a stop sign register as a 45-mile-per-hour speed limit sign.
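To make the idea concrete, here is a minimal sketch of the kind of attack involved: an adversarial perturbation that nudges an image until a classifier changes its answer. The model, labels, and input below are hypothetical stand-ins, not the researchers' actual sticker method, but the mechanics are similar: a small, carefully chosen change to the pixels, guided by the classifier's own gradients, pushes a "stop sign" toward being read as a "speed limit" sign.

```python
# Minimal sketch of an adversarial perturbation against a toy image classifier.
# The model, classes, and input here are hypothetical stand-ins, not the
# researchers' actual sticker attack; the point is only to show the mechanics.
import torch
import torch.nn as nn

classes = ["stop_sign", "speed_limit_45"]  # hypothetical label set

# A tiny, untrained classifier, just enough to demonstrate the attack step.
model = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(8, len(classes)),
)
model.eval()

image = torch.rand(1, 3, 32, 32)   # stand-in for a photo of a stop sign
target = torch.tensor([1])         # the attacker's goal: "speed_limit_45"

# Compute the gradient of the loss with respect to the image, aimed at the
# attacker's target class...
image.requires_grad_(True)
loss = nn.functional.cross_entropy(model(image), target)
loss.backward()

# ...then nudge every pixel a small step in the direction that makes the
# classifier favor that target (a single FGSM-style step).
epsilon = 0.1  # perturbation budget: the change stays visually small
adversarial = (image - epsilon * image.grad.sign()).clamp(0, 1).detach()

print("before:", classes[model(image).argmax().item()])
print("after: ", classes[model(adversarial).argmax().item()])

# Against a real, trained classifier, an iterated version of this step (or a
# printed sticker optimized the same way) can reliably flip the prediction.
```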

Ethical issues abound.

Like anything created by human beings, self-driving cars come with ethical issues. When we drive, we make thousands of small decisions, many of which are ethical in nature. Researchers are trying to create "ethical algorithms" to help a car figure out what to do when it faces those decisions. For instance, should the car save its driver or the four kids crossing the street? A simulation created by a group at M.I.T., called Moral Machine, puts some of these "Trolley Problem" scenarios in front of the public. And the trolley problem isn't the whole story: autonomous cars also raise privacy and liability concerns.

Profit matters more than safety.

The companies building these cars are thinking about profit, not your safety. Sure, self-driving cars may reduce accidents and cut emissions, but the real reason companies are pursuing them is the bottom line. Meanwhile, a long list of technological, ethical, and regulatory issues has yet to be ironed out, and accidents remain a serious problem, particularly on bridges, in bad weather, in city traffic, and at high speeds.

Communication breaks down.

While much of driving is paying attention and avoiding obstacles, a large part of it is communicating with other drivers. You need to signal your intentions and read others' signals: waving someone ahead at a stop, or reading a car's "body language" the way you would another person's. Automated cars are still struggling to communicate with each other.

Automated is not always safer.

Driverless cars are not fully automated; much like planes and their pilots, they expect human beings to take over on difficult roads or in tricky situations. A human driver still needs to react when something goes wrong, and that is a tedious job. People riding in automated cars get lulled into thinking they no longer have to pay close attention, and if you aren't paying attention, you won't respond to an alert or a sudden hazard as quickly as you would if you were truly in the driver's seat.

Published on: May 16, 2018