2018 has not been a good year for progress when it comes to self-driving cars.
I'm an advocate for the technology and for making it widely available someday, and I feel the concept holds promise if the first step involves dedicated, barricaded lanes similar to what you see on highways from San Francisco to LA. On city streets, a dedicated self-driving car lane that's also cordoned off could work. If a car driving at 30 miles per hour can do nothing worse than ding a fender on a cement barrier, it's a little embarrassing and an insurance nightmare, but the overall consequences are not that severe.
The problem lately is that the technology has caused much worse accidents, some involving fatalities. Google has shown again and again that a self-driving car can log five million miles and make it seem like everything will work out. But one major mistake calls the entire concept of autonomous driving into question. Now, there's a new setback.
Recently, a driver in the UK switched from the driver's seat to the passenger's seat during a morning commute on the M1 motorway. (Another driver filmed the incident.) He was fined, banned from driving for 18 months, and handed other penalties, including unpaid work. The case is interesting for a few reasons.
One is that we're "not there yet" when it comes to autonomous cars. It's shocking to see an unpiloted car because, in general, few of us are ready to really accept the technology. On a closed test road, with no other drivers around to film a car driving itself, there's less at stake. On an open road like the M1 motorway in the UK? It seems dangerous, although it's also dangerous to film another car with your phone while you're driving.
The technology might be improving every day, but the perception about whether cars should drive on their own is stuck in reverse. Each new incident adds to the concern. How quickly could that driver have jumped back behind the wheel? Not fast enough. How comfortable are we with a car driving itself, no human behind the wheel? Not comfortable enough. Tesla has backed away from allowing drivers to enable self-driving mode for long periods, but drivers have figured out that they can fool the system by wedging an orange into the steering wheel. The ingenuity of automotive engineers is one thing, but the ability of drivers to find workarounds (and their propensity to cause problems) is hard to predict. There are too many strange occurrences on highways and city streets, and what we've been learning lately is that the AI in cars can't really keep up with the lunacy.
Here's just one example from my own experience. I recently tested the Cadillac CT6, which uses a new technology called Super Cruise. The vehicle would mostly stay in its own lane, adjust its speed, and generally pilot itself without issue. But in one test, another car decided to ride my bumper. Did the CT6 notice that? If it did, it didn't react the way a human would; it just kept driving as though nothing was wrong. I took the wheel back, sped up a little, then changed lanes. I knew, intuitively and from experience, that when someone rides your bumper like that, it's best to get away from them.
AI has come a long way, but we need to collect more data and find the right balance between robotic precision in driving and the kind of human emotional response that actually makes sense on the road.