
Two Young Drivers Take the Wheel. One, a Teen. The Other, a Tesla.

At the corner of Iris Avenue and Folsom Street in North Boulder, my Tesla Model 3 was driving itself when it came upon two human drivers in the lane just in front of us. All three of us were turning left — two flesh-and-blood drivers and my robot — when the two humans violated a basic traffic rule by swinging wide and turning into the right-hand lane.

The Tesla hugged the inside lane, as the driver's manual indicates is the proper rule of the road. I wished my teenage son had been watching.

Milo is 15, with a learner’s permit. It is my fantasy that when he gets his license, he will develop the memory, rote behaviors and mundane habits that have shown themselves possible in the short life span of self-driving cars. On the other hand, my son is less likely than the Tesla’s software to suddenly disengage and just stop steering altogether (requiring me to take over).

The machine and the adolescent each have a brain still under development: a human mind governed by millions of years of evolutionary biology, and algorithms shaped by decades of engineering. Seen through the lens of cognition and neuroscience, the contrast says a lot about the next generation of drivers.

For now, only the Tesla (not my son) has been involved in wrecks. In April, the federal government reported that Tesla's Autopilot technology was involved in 956 crashes between January 2018 and August 2023, including 29 fatalities. The National Highway Traffic Safety Administration report found that "Autopilot's system controls may be insufficient for a driver assistance system that requires constant supervision by a human driver."

Also recently, Elon Musk concluded talks in Beijing aimed at clearing the way with regulators to bring Autopilot to Chinese roads. Many other companies, including General Motors, BMW, Mercedes, Lincoln and Kia, are developing their own versions, most of which take some control in limited situations, like on the highway.
