Mar 10, 2018 2:00 AM PT

New rules for “self-driving cars” in California highlight a glaring misconception about how A.I. works.

An educated public understands that autonomous vehicles are amazing but, so far, unable to take full control of a passenger vehicle with no human behind the wheel. Whoa, slow your roll there, self-driving car narrative.

California just approved licenses for self-driving cars to operate with no human driver behind the wheel, or with no human in the vehicle at all (after dropping off a passenger, or for deliveries), with one caveat: the self-driving car companies must monitor the vehicles and be able to take over driving remotely. This rule matches those in other parts of the country where driverless autonomous vehicles have been allowed.

There’s no human in the car, or at least in the driver’s seat, but remote monitoring and remote control make that possible, safe, and legal. I imagine NASA-like control rooms filled with operators, screens, and traffic reports, where maybe a few dozen people monitor a few hundred vehicles, taking control when the cars malfunction, freeze up, or confront complex driving scenarios.

It’s also possible that remote monitoring could be done in call-center-like cubicle farms. Several autonomous vehicle companies are public about building, or have already built, such control rooms.

Come to think of it, it’s not just autonomous cars that need human help when the A.I. isn’t intelligent enough to handle unexpected situations. Read more from computerworld.com…
