The recent tragic accident in Arizona involving an autonomous vehicle and a pedestrian raises some important questions about liability. While not yet available to consumers, self-driving cars are being tested on streets throughout the United States and Canada in order to fine-tune and develop the emerging technology with the ultimate goal of reducing collisions involving motor vehicles.

The question this unfortunate accident raises is: who is at fault? Is it the test driver behind the wheel? The manufacturer of the sensors or radar? The developer of the software? The owner of the vehicle? Or was the collision caused purely by the contributory negligence of the pedestrian?

In Ontario, the Highway Traffic Act (HTA) creates a reverse onus on the drivers of motor vehicles involved in collisions with pedestrians. Section 193(1) reads:

When loss or damage is sustained by any person by reason of a motor vehicle on a highway, the onus of proof that the loss or damage did not arise through the negligence or improper conduct of the owner, driver, lessee or operator of the motor vehicle is upon the owner, driver, lessee or operator of the motor vehicle.

Unlike in a collision with another vehicle, when a pedestrian is struck by a car the driver is presumed negligent unless proven otherwise. Therefore, had the collision between the autonomous vehicle and the pedestrian occurred on the streets of Toronto, for example, Uber would be presumed negligent. The presumption is rebuttable, however, where the driver can adduce evidence that they are only partially responsible, or not responsible at all, for the injury. Case law has permitted an apportionment of liability in these circumstances, which is largely driven by the facts of each case.

The recent accident in Arizona involved a pedestrian who was walking her bike across the street when she was struck. Some of the important facts to determine when considering the pedestrian's contributory negligence are:

  • Was she crossing the street in a designated crossing zone?
  • Was she keeping a proper lookout before crossing the street?
  • Did she make herself visible to the “driver”?
  • Did she move onto the road suddenly? and
  • Was the vehicle speeding?

If the pedestrian bolted across the street immediately before being struck, the driver is generally not considered liable, given that they had no reasonable opportunity to avoid the collision. This is supported by section 140(4) of the HTA, which provides that a pedestrian must not leave a place of safety and move into the path of a vehicle that is so close that it is impracticable for the driver to yield. (Initial reports of a video taken from the car do indeed suggest the pedestrian suddenly moved in front of the car.1)

Pedestrians maintain a right of way when crossing at non-designated crosswalks, but they must take heightened care when doing so. Pedestrians have greater rights when crossing at a designated crosswalk and, therefore, a greater expectation is placed on drivers to avoid such collisions.2 In determining the liability of the autonomous vehicle, the car's video and black box data must be analyzed to establish whether the vehicle should have avoided the collision, whether through its own sensors or through the test driver overriding the car's controls. Despite its heightened technological abilities, an autonomous vehicle is not held to a higher standard of care than a conventional driver when assessing liability.

An interesting point to consider in this circumstance is whether Uber's autonomous car had difficulty identifying the pedestrian because she was walking a bicycle. At any one time, over a dozen sensors, including radar, lidar and various cameras, scan the area immediately around an autonomous car. Perhaps the combination of bicycle and pedestrian confused the autonomous car in the same way autonomous vehicles have been confused by kangaroos in Australia: Volvo reported in the summer of 2017 that its autonomous vehicles were unable to identify kangaroos because of their odd movements.3 Autonomous vehicles operate on learning-based software, known as artificial intelligence (AI), rather than rules-based software, which means they must experience situations and then incorporate what they learn into their programming. This is precisely why companies like Uber are conducting real-world testing of their vehicles on the road. The incident in Arizona will likely encourage automakers to conduct more tests involving pedestrians and bicycles, so that their software learns to anticipate as many scenarios as possible.

As with conventional pedestrian-motor vehicle collisions, early investigation remains essential in order to properly assess liability. While autonomous technology adds another layer of complexity to that assessment, it also produces large amounts of useful data that can bring greater certainty to the determination of liability.

Footnotes

1 Uber Victim Stepped Suddenly in Front of Self-Driving Car, March 20, 2018.

2 Lalonde v. Kahkonen, 1971 CarswellOnt 693.

3 Volvo's driverless cars confused by kangaroos, June 27, 2017.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.