Who is at fault in an autonomous vehicle accident? Are autonomous driving features safe? Is your self-driving car insured? In the wake of a high-profile accident and autonomous shuttle cancellations in Whitby and Toronto, we asked industry experts for their takeaways and to reflect on our future AV landscape
As the future of autonomous vehicles draws nearer, industry experts offer their insights into some of the most pressing questions facing the evolving industry.
Self-driving technology is revolutionizing the automobile industry, from four-wheeled passenger cars to 18-wheelers. Manufacturers like Tesla, Ford, and Toyota are deploying semi-autonomous driving technology in their light-, medium- and heavy-duty vehicle lineups, while cities in Quebec, Ontario, Alberta, and British Columbia are testing driverless pilot programs at various levels of autonomy, both on and off public roads.
But while autonomous vehicles have game-changing potential, there is a strong divide among consumers and experts about the safety of this new technology.
“The perception of [autonomous vehicles] being unsafe probably has to do with the fact that people haven’t experienced them themselves,” explains Dan Finley, vice-president, corporate services and motorcoach at Pacific Western Transportation (PWT).
Based in Calgary, PWT piloted the first Canadian autonomous vehicle shuttle program, known as ELA, in 2018. Since then, the emissions-free, driverless electric shuttle has taken part in 10 pilot projects spanning Alberta and B.C. without any incidents.
Finley says passengers were skeptical about the safety of the shuttles until they got on board, experienced a ride and were educated on the vehicles' features.
The Society of Automotive Engineers (SAE) developed a classification system that defines a vehicle's degree of driving automation, ranging from Level 0 (fully manual) to Level 5 (fully autonomous). Ontario's Ministry of Transportation allows pilot programs to operate at up to SAE Level 3.
“There are many autonomous vehicles out in the market today that are very safe. And with autonomous vehicles that we used — in the right situation, with the right parameters and the right safety measures in place — we had no issues with the operations of autonomous vehicles whatsoever,” says Finley.
But despite the successful results from the ELA projects and many others across Canada, self-driving vehicles will always make headlines if there is an incident — even when the technology is not engaged.
The latest Canadian example happened in Ontario late last year. In November, after months of vehicle testing and delays caused by the pandemic, the town of Whitby, in Durham Region, launched an autonomous shuttle service using an eight-seater vehicle named Olli. Just weeks later, however, the shuttle crashed into a tree, leaving the 23-year-old on-board attendant injured and the program in limbo.
Predictably, the story got immediate, widespread coverage in Canadian media. A big reason such incidents make the news, says Finley, is a combination of sensational surface elements that make for a snappy headline and little deep understanding of the nuances of the technology or the circumstances behind the incident.
“The headline is probably the challenge: ‘Autonomous vehicle gets in a collision’ [and] the vehicle is penalized without it actually being factual,” says Finley. “In many cases, autonomous vehicles can be safer than a human-driven vehicle. We’ve seen that because a lot of the issues or incidents with autonomous vehicles have been due to driver error or safety attendant error — not because of the autonomous functionality of the vehicle.”
Such was the case in Whitby.
In an email statement to Electric Autonomy Canada, a Durham Region spokesperson confirmed that, following a police investigation, the vehicle was found to have been out of service with the safety attendant alone in the shuttle and "operating in manual mode at the time of the incident," and that the accident "was not a failure of the collision avoidance technologies on the vehicle."
The shuttle attendant has since been released from the hospital and is in recovery.
Even so, on Feb. 2 the municipality announced that the Whitby Autonomous Vehicle Electric (WAVE) pilot has been terminated. The Olli's manufacturer, U.S.-based Local Motors, went out of business in January and can no longer provide maintenance or support for the vehicle.
That’s not all. In neighbouring Toronto, the West Rouge autonomous vehicle pilot project — which was doing final testing on its own version of Olli 2.0 when the Whitby shuttle crashed — has also been cancelled after an initial suspension following the accident.
AutoGuardian, an Ottawa-based intelligent mobility solutions company, was the project lead for the WAVE pilot. Its responsibilities included training the onboard attendants for Olli.
“We actually go through a really rigorous training process that would go above and beyond any of the standard everyday driver training that you see for people just getting their license or fleet operations,” says Tenille Houston, co-founder and CEO of AutoGuardian.
To pre-qualify for the training program, AutoGuardian asks for a minimum of five years of driving experience and a clean record with no moving violations in the last three years.
The company looks for individuals who have experience operating heavy equipment such as forklifts, snowplows, dump trucks and full-size buses, which require additional hand-eye coordination. Drivers with professional experience behind the wheel, such as Uber or commercial delivery drivers, can be potential candidates as well.
Once a candidate is accepted into the program, Houston says they go through substantial training on equipment and operating procedures “before we actually let somebody manually drive the vehicle on the road.” Post-certification training and ongoing check-ins are also conducted.
Houston calls the Whitby shuttle accident an "unfortunate" situation caused by "operator error not following the training precisely as intended."
But, Houston says, the crash is actually a further reason why more advanced autonomous technology is needed: so that sensors and monitors, rather than humans, can detect and react to the vehicle's surroundings.
With autonomous driving features steadily entering the market, the Insurance Bureau of Canada (IBC) is already anticipating a world where many or all vehicles on Canadian roadways will be autonomous, and considering what implications that will have for vehicle insurance requirements and liability.
“Today you’re the one driving your car, so if there is an accident it’s either caused by you or another driver and so the liability and the fault in those instances are placed with the driver,” explains Aaron Sutherland, vice-president of Pacific & Western at the Insurance Bureau of Canada.
“Autonomous vehicles flips that on its head because now it’s the vehicle that was driving you and so if there’s an accident, theoretically, that’s the vehicle that had caused it, so it’s the vehicle’s fault and it’s the manufacturer’s fault…. Then it becomes a question of, if you’re not driving the vehicle, do you need insurance or does the vehicle manufacturer need insurance?”
In a report published in 2018, IBC provided recommendations for how the insurance landscape will have to change to reflect this different reality. Expectations include fewer accidents but costlier repairs, new risks from software and network failures, and a switch from personal liability to product liability when there is a claim.
In the report, IBC proposed the adoption of a single insurance policy for automated vehicles covering both driver negligence and product liability exposure, so that regardless of who caused the accident — whether the driver or the car itself — coverage is available.
This is vital, says Sutherland, because product liability claims are not part of the auto insurance landscape of today. Auto insurance currently uses personal liability, which is well-regulated by the government in terms of the types of benefits people receive depending on the nature of the accidents.
“The system that exists today for [product liability is] quite complex and time-consuming and it doesn’t quite fit for auto insurance. Because if you get injured by an autonomous vehicle, it could theoretically be years before you get what you need to recover,” says Sutherland.
“Insurance is here to help people recover from accidents and we need to make sure that we create as smooth and efficient a process as possible.”