Analyses
12 March 2020

“Autopilot is the next step in cruise control”

In spite of a series of accidents involving Teslas with Autopilot engaged, the feature does improve road safety. Using it correctly is essential, though, and drivers could do with more information or training on Tesla Autopilot.

Importantly and in spite of what the name suggests, Tesla Autopilot is not a true autopilot system, as drivers need to remain in control of their vehicle at all times.

Advanced Driver Assistance Systems (ADAS) are designed to improve road safety, but they can also introduce new risks, reports the Dutch Safety Board (Fleet Europe 113). Tesla’s Autopilot has been particularly controversial, having been involved in a series of fatal incidents.

Across a global fleet of 800,000 Autopilot-equipped Teslas (according to Fortune magazine), the number of incidents may not seem particularly worrying. Misconceptions about what the system is capable of, however, may justify dedicated action targeting drivers.

So are fleet managers taking measures to ensure the Teslas in their fleet are safe?

Optional equipment

“We haven’t modified our policy,” said Arno Veenman (pictured, left), advisor HR expertise at the Dutch Volksbank and Dutch Mobility Manager of the Year 2019. Half his company’s car fleet is electric (140 cars), 49 of which are adorned with the Tesla logo. Of those 49, nine are equipped with Autopilot.

“We see Autopilot as a new piece of equipment, in the same way as air conditioning, cruise control or active cruise control. We have not chosen to add Autopilot to all Teslas as it is optional equipment that impacts the lease rate. There is no incentive for our drivers to do so, but they can pick the option.”

Pim De Weerd (pictured below, right), Global Commodity Manager, Philips, said his company has not introduced a specific Autopilot policy either. “I’m also convinced it shouldn’t be blown out of proportion. As a whole, I think Autopilot reduces the number of accidents even though there are reports of accidents when people weren’t paying attention or when they hadn’t turned Autopilot on.”

Innovation

The fleet Mr De Weerd manages worldwide includes 140 Teslas, some of which include Autopilot. He continued: “We have not had any incidents so far. I understand things can go wrong at some point but that’s inevitable when you try to innovate. At the end of the day, I’m convinced this technology makes driving safer.”

Mr Veenman has not experienced any Autopilot-related incidents either. “In any case, self-driving systems are still very limited and cannot be used to stop at traffic lights or to drive in urban areas. On motorways, however, I believe it is the next step in cruise control.”

Added value

Some say self-driving systems like Autopilot, in their current state, require additional training to prepare drivers and to teach them what the car can and cannot do. However, Mr Veenman doesn’t believe that should be done by the employer. “If drivers require additional information, they should be able to obtain that from the leasing company or the carmaker.”

Mr De Weerd believes Autopilot does indeed call for more information. “In the first place, that’s the carmaker’s role. It’s their product, it’s their car and it’s their name that’s at stake. And if you turn it around, it’s also up to them to demonstrate the added value they can offer. However, that’s not being done at all.”

Indeed, the experience Philips has had with Tesla reveals the company still has traits of a start-up. “Last year, the mass delivery of Teslas still had room for improvement. When a driver picks up a vehicle, they should be properly informed about the ADAS systems and they should be made aware that they remain in charge of the vehicle at all times.”

By and large, Tesla drivers at Philips and de Volksbank are happy to be driving their electric cars. Nevertheless, Tesla, now a company with a valuation higher than that of Ford and GM combined, should provide drivers with more information and perhaps driver training on Autopilot.

Assisted driving systems improve safety if used correctly

To date, five people have been killed in accidents involving Tesla Autopilot. A number of other accidents are still being investigated. However, it bears repeating that most, if not all, of these incidents can be attributed to driver error.

After a Tesla Model S crashed in Florida in 2016, for instance, investigators found the driver had ignored safety warnings given by his car, keeping his hands on the wheel for just 25 seconds of a 37-minute trip.

In the much-publicised case of the Uber car that struck a pedestrian, the safety driver had been using video streaming services on her smartphone. In this case, the vehicle was a Volvo retrofitted with Uber-specific systems.

According to the World Health Organisation, road traffic injuries caused an estimated 1.35 million deaths worldwide in 2016. The share of those deaths that involved Autopilot or similar systems is infinitesimal.

“There are risks associated with these systems because drivers put too much trust in them,” said Richard Schram, Technical Director at Euro NCAP, in Fleet Europe 113. Mr Schram agreed drivers need to be better prepared on how to safely operate a vehicle with self-driving features.

To this end, Euro NCAP is preparing new ratings for assisted driving in which it will focus on three elements:

  1. Engagement: how does the vehicle ensure the driver remains engaged? Teslas detect whether the driver has their hands on the steering wheel, for instance.
  2. Assistance level: how good is the system at self-driving?
  3. Safety Back-up: what happens if the driver doesn’t perform well or if the system fails? Ideally, said Mr Schram, the safety back-up system should be “uncomfortable”, so drivers learn not to rely on it.

Image copyright: Shutterstock

Authored by: Benjamin Uyttebroeck