Tesla tests drivers to trust them to supervise experimental Autopilot

Tesla Autopilot is currently under investigation by the National Highway Traffic Safety Administration for 12 crashes and one death. Now the company is expanding access to an experimental version of the software.

Over the weekend, Tesla expanded access to the latest version of the company's highly controversial $10,000 automated driving feature. As is the Tesla way, CEO Elon Musk took to Twitter to announce the news, saying that owners could begin requesting access to the beta on Saturday. However, Musk noted that "FSD 10.1 needs another 24 hours of testing, so out tomorrow night."

For now, access to the latest build of the software is by no means assured. Instead, drivers have to agree to let Tesla monitor their driving for seven days. If they're deemed safe drivers, they gain access to the experimental software. By contrast, autonomous vehicle companies like Argo AI put their test drivers through extensive training to ensure they can safely supervise experimental autonomous driving systems being tested on public roads, a task that is entirely different from safely driving a car manually.

Better not brake

Tesla says that five factors determine whether a driver is safe enough to then take on the task of supervising an unfinished automated driving system, one that is currently under investigation by the National Highway Traffic Safety Administration for a dozen crashes into parked emergency vehicles, including one fatality.

Specifically, Tesla is leveraging the connected nature of its cars and its insurance product. This involves tracking the number of forward collision warnings per 1,000 miles, hard braking events (above 0.3 G), aggressive turning (above 0.4 G), whether the driver follows other vehicles too closely (within 1 second), and any forced Autopilot disengagements. Tesla then uses this information and actuarial data to calculate a predicted collision frequency, and that number is in turn converted into a safety score from 0 to 100.
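To make the mechanism concrete, here is a minimal sketch of how a score like this could be derived from the five tracked factors. The factor names follow the article, but the weights, the predicted-collision-frequency model, and the scaling bounds are illustrative placeholders, not Tesla's actual formula.

```python
# Hypothetical sketch of a Tesla-style "safety score" pipeline:
# five driving factors -> predicted collision frequency (PCF) -> 0-100 score.
# All coefficients below are made-up placeholders for illustration.

def predicted_collision_frequency(fcw_per_1k_miles, hard_braking_pct,
                                  aggressive_turning_pct,
                                  unsafe_following_pct,
                                  forced_disengagement):
    """Estimate collisions per million miles from risky-driving signals."""
    pcf = 1.0  # baseline frequency for a driver with no flagged events
    pcf *= 1 + 0.05 * fcw_per_1k_miles       # forward collision warnings
    pcf *= 1 + 0.02 * hard_braking_pct       # braking above 0.3 G
    pcf *= 1 + 0.02 * aggressive_turning_pct # turning above 0.4 G
    pcf *= 1 + 0.04 * unsafe_following_pct   # following within 1 second
    if forced_disengagement:
        pcf *= 1.5                           # forced Autopilot disengagement
    return pcf

def safety_score(pcf, worst_pcf=10.0, best_pcf=1.0):
    """Map predicted collision frequency onto a 0-100 scale (lower PCF = higher score)."""
    clamped = max(best_pcf, min(worst_pcf, pcf))
    return round(100 * (worst_pcf - clamped) / (worst_pcf - best_pcf))

# A flawless monitoring week maps to the top of the scale.
print(safety_score(predicted_collision_frequency(0, 0, 0, 0, False)))  # 100
```

The multiplicative structure means each additional risky behavior compounds the predicted risk, which is consistent with the later anecdote that trading one behavior (hard braking) for others (running lights, rolling stops) can still raise the final score if those behaviors aren't among the tracked factors.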

Needless to say, the reactions to this plan have not been entirely positive. Last week, authorities in San Francisco, which has perhaps the highest concentration of Teslas on earth, raised their concerns about more cars testing FSD 10.1 on their streets. California regulators are also investigating whether Tesla's claim of "full self-driving" is deceptive.

Those concerns are well-founded, if early reports by Tesla owners are anything to go by. Podcaster Stephen Pallotta has been posting videos to Twitter showing his car's behavior with the latest build. One shows it crossing a double-yellow line into oncoming traffic; others document phantom braking events and even a failure to slow for pedestrians in a crosswalk.

Perhaps more worrying are the reports from an investor called Gary Black, whose tweets showed he was able to increase his safety score from 91 to 95 by "running yellow lights," "not braking for the cyclist who crossed against the red in an intersection," and "rolling through the stop signs." We would ask Tesla for a comment on these accounts, but the company dissolved its press office in late 2020, so there's no one to ask.

So, as Sergeant Esterhaus used to say in Hill Street Blues, "Let's be careful out there."