Driverless cars might be safer but they’ll still keep the courts busy

If driverless cars live up to the safety hype, they could significantly reduce the number of court cases dealing with human-related traffic offences.

But before we can clear the courts, we will need to have a period where human drivers share the responsibility (both actual and legal) for control of their vehicles.

Arriving at the point where fully automated vehicles are on our roads requires us to establish who is legally in control of them. If we don’t, we could find that we have simply replaced one type of legal dispute with another.

Traffic offences

More than 200,000 people in Australia were found guilty of driving and traffic offences in the 12 months to June last year, according to Australian Bureau of Statistics figures.

The most common crimes were driving while intoxicated (71,501 cases) and speeding (16,461 cases). Other offences included failing to obey traffic signs and signals, and careless driving.

In some parts of Australia these offences are increasing in frequency, placing a growing burden on our courts. All of these cases stem from the wrongdoing or careless conduct of a human in charge of a motor vehicle.

By contrast, fully automated vehicles have so far established an impressive compliance and safety record.

There have been very few reported crashes, although they obviously cause concern when they occur, such as the crash last year that led to the death of the driver of a Tesla car in autopilot mode.

Most crashes involving autonomous vehicles are less serious than that, and are due to the errors of other (human) drivers, such as last month’s Uber collision in Arizona and the 16 crashes reported by Google between 2009 and 2015. Google’s cars have been involved in further accidents, including one last February when its autonomous Lexus pulled out into a bus.

The safety record of autonomous vehicles is frequently contrasted with research finding that more than 90% of conventional vehicle collisions are caused by human error.

As the number of fully automated cars on our roads increases, we should expect to see a reduction in the number of traffic offences coming before our courts. In Victoria alone these offences currently account for nearly 21,000 cases heard each year in the Magistrates’ Court.

Drivers, passengers and ‘chaperones’

The removal of the human driver has the potential to make some criminal charges unnecessary. This could include charges of driving while intoxicated, unlicensed driving, and driving with a disqualified, suspended or cancelled licence.

Making these offences redundant in Victoria, for example, would remove about 29,000 cases annually from the Magistrates’ Court.

But reform may be necessary to clarify the legal situation. In Victoria the definition of “drives” in the Road Safety Act 1986 includes “to be in control of a vehicle”. Drink and drug driving offences refer to a person who “drives … or is in charge of a motor vehicle”.

Would a person who enters a fully automated car that unlocks as they approach it, who gives verbal directions as to their destination (and leaves it to the car to determine the route), be said to be in control or in charge of the car?

If that person is not in control of the car, then who is? And what if the vehicle in question is an Uber vehicle, taxi or bus? Who is responsible for the actions of that vehicle – the hirer, the owner or the manufacturer?

Notably, the staff member present in Curtin University’s recently introduced fully automated bus is referred to as a “chaperone”.

Curtin University is trialling an electric driverless bus that seats 11 passengers.

In essence, a key legal issue will be to determine whether a person present in a fully automated car is more like a driver or a passenger.

Tesla’s chief executive Elon Musk said his company will accept liability for fully automated vehicles where a problem stems from a design fault. But he also suggested that in other circumstances, individuals and their insurers may be responsible.

Some clarification of the legal position may be required. A report, released by the UK government’s Department for Transport in February 2015, warned that with fully automated vehicles:

[…] it does not seem reasonable to suggest that the human driver is still responsible for the manner in which the vehicle drives since they may not even be aware of the road environment or the presence of other road users.

The transition phase: semi-automated vehicles

While fully automated vehicles are not yet freely driving on Australian roads (the Curtin bus is restricted to a specified route), many Australian drivers already use cruise control, electronic stability control, parking assist and advanced braking systems to help them drive their cars.

Human drivers can override these automatic settings at any time, and may be required to do so to avoid colliding with cyclists, pedestrians, animals and stationary vehicles – all of which may not be detected by the automated programs.

Consequently, manufacturers warn drivers of semi-automated vehicles that they must remain alert and ready to resume control in the case of adverse environmental conditions or unexpected events.

Because of this shared responsibility for control of the vehicle, we might expect legal problems in attributing responsibility when something goes wrong. Fault could rest with a human driver who fails to take control when required to do so, or with a manufacturer whose product is faulty.

While some car manufacturers – such as Volvo with its Pilot Assist – say that drivers bear responsibility for controlling the car even when the semi-automated program is engaged, the situation might be more complex.

It may be that the semi-automated program itself causes the problem. News reports from the US say that federal authorities have already expressed concern that a safety device on a semi-automated car may itself cause accidents. In these situations it may not be so easy for manufacturers to escape liability.

Marilyn McMahon does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.