Debate needed now on ethics and liability issues to get driverless cars back on Hong Kong roads

Franklin Koo says the tussle over whether Tesla’s new autopilot software should be allowed in Hong Kong highlights the concerns that must be addressed if the technology is to be used

A Tesla Model S car equipped with the autopilot feature is taken for a test drive in Palo Alto, California. Photo: Bloomberg
In October, Tesla introduced its autopilot software, updating its cars in Hong Kong and around the world. The software enabled an automated driving option similar to the autopilot that aircraft pilots use. The aim is to improve safety, minimise congestion, allow the efficient use of energy and reduce stress for road users. Even so, the feature was short-lived here: last month, Tesla disabled it on its cars in Hong Kong following a government request. With the technology in limbo, there are crucial questions of ethics and liability that need to be addressed.

WATCH: Can Tesla’s new ‘autopilot’ system work in crowded Hong Kong? The South China Morning Post takes it for a test drive

In the event of an unavoidable collision, the decision-making of an autonomous car ultimately rests with the programmers who write its algorithms. Consider a revised version of the “trolley problem” used in ethics discussions: a child suddenly dashes onto the road, forcing the self-driving car to choose between hitting the child or swerving into an oncoming truck in the other lane. With an autonomous car, the manufacturer has the responsibility of planning for such a scenario in advance, rather than a person making a split-second decision. The algorithms, which determine how the car will behave, should be regulated to ensure the vehicle reacts responsibly. Jean-Francois Bonnefon at the Toulouse School of Economics in France suggests that the algorithms will need to meet three objectives: be consistent, not cause public outrage, and not discourage buyers.

The aim of the autopilot feature is to improve safety, minimise congestion, allow the efficient use of energy and reduce stress for road users. Photo: Bloomberg
Uniform algorithm standards could ensure consistency and predictability. However, this is difficult to implement since vehicles differ in size, acceleration and braking. Applying the above scenario, an SUV may choose to collide with the truck since it would offer better protection to occupants, while a compact car would probably choose not to.
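The inconsistency can be made concrete with a purely hypothetical sketch (no manufacturer's actual logic): if every vehicle applies the same rule of swerving only when its occupants are well enough protected, the numbers it plugs in differ by vehicle type, so the fleet's behaviour diverges.

```python
# Purely illustrative: the same collision-avoidance rule produces
# different manoeuvres depending on vehicle-specific parameters.
# The scores and threshold are invented for this example.

def choose_manoeuvre(occupant_protection, truck_impact_risk=0.8):
    """Swerve into the truck only if this vehicle protects its
    occupants well enough to make that the lower-harm option."""
    return "swerve" if occupant_protection >= truck_impact_risk else "stay"

# An SUV (high protection score) and a compact car (low) diverge:
suv_choice = choose_manoeuvre(0.9)      # "swerve"
compact_choice = choose_manoeuvre(0.4)  # "stay"
```

Even with a uniform standard, the inputs (size, braking, crashworthiness) vary per model, so the outcomes do too, which is exactly the predictability problem the article describes.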

Keeping it simple could avoid the ethical dilemma altogether. A project called CityMobil2 tests automated transit vehicles in various Italian cities. The vehicles simply follow a route and brake if something gets in the way.
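The CityMobil2 approach described above — follow a fixed route, brake when anything is in the way — can be sketched in a few lines. This is an illustrative simulation, not the project's actual control software; the route, sensor callback and event log are all invented for the example.

```python
# Illustrative sketch of a route-following shuttle that simply brakes
# whenever an obstacle is detected, avoiding any swerve-or-not dilemma.

def drive_route(route, obstacle_ahead):
    """Visit each stop in order; brake and wait while obstacle_ahead()
    reports something in the vehicle's path. Returns an event log."""
    log = []
    for stop in route:
        while obstacle_ahead():   # no evasive choice: just stop
            log.append("braking")
        log.append(f"arrived at {stop}")
    return log

# Simulate a sensor that reports one obstacle, then a clear path:
events = iter([True, False, False])
print(drive_route(["stop A", "stop B"], lambda: next(events)))
```

Because the only possible reaction is braking, the programmer never has to rank whose safety matters more, which is the simplicity the project relies on.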

READ MORE: Insurance a legal grey area in Hong Kong for all-electric Tesla car’s new Autopilot system

Algorithms are so integral to an autonomous car that liability may shift from the traditional role of the driver to the manufacturer. Nonetheless, increasing liability may be premature since the use of autonomous cars could save lives.

According to the World Health Organisation, about 1.25 million people die on roads worldwide every year. It is hoped that autonomous cars could significantly lower the number of fatalities, since, unlike a person, a self-driving car never gets tired, drunk or distracted, and can anticipate traffic conditions using radar. Burdening automakers with additional legal liability at the outset would only slow the development of the technology.

Elon Musk, Tesla’s chief executive, speaks during an event in October, when the company began rolling out the first version of its highly anticipated autopilot feature. Car manufacturers must respect local laws, ensure timely maintenance and prevent vulnerabilities such as hacking. Photo: Bloomberg
By contrast, a vehicle in the hands of an irresponsible driver can be a lethal weapon. The same, of course, can be said for a malfunctioning or neglected autonomous car. The automaker should be held liable if an accident occurs due to inaccurate data updates affecting a car’s navigational system, for instance. Furthermore, car manufacturers must respect local laws, ensure timely maintenance and prevent vulnerabilities such as hacking.

The legal position of autonomous cars is currently unclear. Hong Kong's Road Traffic Ordinance does not yet distinguish between an autonomous car and one driven by a person. However, the commissioner for transport does have the power to ask Tesla to disable its autopilot software. Nor is there a legal framework even at the international level.