Are autonomous vehicles safer without a steering wheel?

According to some researchers, taking manual control of an autonomous vehicle in critical situations is dangerous. Recent Tesla crashes while in autopilot mode are challenging the views of American consumers, just as they have started to become more comfortable with the idea of letting technology take over the wheel. The suggestion is that these incidents are not in fact the fault of the autonomous car, but are down to the driver unwisely intervening and preventing the computerised system from dealing with the problem.

This raises an important discussion about self-driving cars: should the driver still be able to take over the controls or is that too dangerous?

“Imagine that you run into fog and poor traction. The idea that you suddenly take control of the vehicle when it’s about to spin - it's absurd”, says researcher Martin Steinert.

Steinert is a professor at the Norwegian University of Science and Technology (NTNU) and directs the multi-disciplinary research group TrollLabs. Among other projects, the group is researching technology in autonomous vehicles and ships.

“Just think about it! How hard should you brake? You’d need a few seconds to decide, whereas the computer system will make the decision in an instant”, says Steinert.

Driver or passenger?

Two seconds in a critical situation at high speed is too long. One tragic example is the fatal accident in Florida where a Tesla on autopilot collided with a trailer.

Should drivers continue as drivers or become passengers instead?

Tesla has chosen the first option. If drivers do not keep their hands on the wheel, the car turns on the hazard lights, slows down and stops. The control system is a tool - a kind of extended cruise control. The driver is still needed.

Waymo (the newly created subsidiary of the Alphabet group, formerly run by Google X-Labs) has chosen the opposite route, building its cars with no steering wheel. Steinert supports this option. “Are you able to perform any better than your computer? Probably not,” he says.

Steinert gives examples from his time at Stanford. Chris Gerdes, among other researchers, worked with autonomous cars on the rally track there for many years.

Back in 2008, the student car P1 had already mastered drifting in tight circles on the loose, gravelly surface.

“P1 is able to keep the skid in check. It’s unbelievable. A human driver couldn’t do that,” says Steinert.

Later, Shelley - a converted Audi TT - showed that a computer is as good on the racetrack as the best rally driver.

Trusting Tesla owners

However, public confidence is won through experience with chaotic everyday traffic, not on the racetrack.

Internet videos show drivers reading, eating or sleeping at the wheel of their Tesla on autopilot - some even moving over to the passenger seat. With Tesla's latest updates, this is no longer possible. Now drivers need to keep their hands on the wheel.

Will the accident in Florida make people more cautious? Maybe, but for every foolhardy Tesla owner there are many fearful passengers. What will get them into a car without a driver – or a steering wheel?

Steinert and his colleagues are working on the answer. In 2014 and 2015 they built a prototype of an autonomous car that shows what it sees – and what it plans to do.

TrollLabs, Stanford’s Center for Design Research and Renault collaborated on the project.

LIDAR and lighting strip

The Renault vehicle was equipped with a wide lighting strip under and along the sides of the dashboard. When an obstacle appeared in front of or to the side of the car, the light strip glowed white on that side.

“The light strip wasn’t providing exact information. It was only making you aware that the car detected something,” says Steinert.

The obstacles were detected with a LIDAR system, which works like radar but uses light from a laser instead of short radio waves.

The Renault in the experiment had three LIDAR instruments that provided 3D information on the lane ahead and gave signals to the light strip. A pedestrian in the road, for example, became a diffuse lit spot on the light strip in the direction of the pedestrian.
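The mapping the article describes - an obstacle's direction becoming a lit spot on the corresponding part of the strip - can be illustrated with a toy sketch. Nothing here comes from the actual Renault prototype; the segment count, coordinate convention, and function names are all hypothetical.

```python
import math

STRIP_SEGMENTS = 11  # hypothetical number of LED segments across the dashboard strip

def strip_segment(obstacle_x, obstacle_y):
    """Map an obstacle position (metres, car at origin, y pointing forward)
    to the index of the light-strip segment that should glow.
    A bearing of 0 rad is straight ahead; -pi/2 / +pi/2 is fully left / right."""
    bearing = math.atan2(obstacle_x, obstacle_y)  # angle off the car's heading
    # Normalise the bearing from [-pi/2, pi/2] onto [0, 1] across the strip
    t = (bearing + math.pi / 2) / math.pi
    return min(STRIP_SEGMENTS - 1, max(0, int(t * STRIP_SEGMENTS)))

# A pedestrian dead ahead lights the middle segment of the strip
print(strip_segment(0.0, 20.0))   # -> 5
# An obstacle far to the left lights the left end of the strip
print(strip_segment(-20.0, 1.0))  # -> 0
```

The real system would drive neighbouring segments at lower brightness to produce the "diffuse lit spot" the researchers describe, rather than a single hard-edged LED.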

Google's autonomous car has a LIDAR array with as many as 64 detectors. It gives a more detailed picture around the vehicle, but also costs more. Tesla uses ordinary cameras for tracking rather than LIDAR technology.

Affective engineering

The experimental car also had a footplate at pedal height. When detecting an obstacle, the autopilot raised the footplate ever so slightly.

“A human driver would lift his foot and hold it over the brake pedal to be ready. The footplate gave a friendly little jerk, as if to say, ‘Hey, I'm ready,’” explains Steinert.

Steinert and his colleagues call this an example of affective engineering. Machines need to interpret people’s feelings and take them into account.

However, that’s easier said than done. Emotions are chaotic. How can a car - or other equipment for that matter - know whether you're feeling safe or scared?

Steinert has written an article with colleague Stephanie Balters about this in the Journal of Intelligent Manufacturing.

Distinguishing emotions with infrared

The article explains how to measure emotional reactions in the human body. Strong emotions increase the heart rate and dilate our pupils.

Sweating can also indicate strong emotions. Sweat conducts current, so it can be measured as a drop in the skin's electrical resistance. The problem is that strong feelings can be positive or negative, so these measurements don't distinguish between exultation and horror, or tell whether the emotions are caused by traffic or by something you hear on the news.
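The resistance-to-conductance relationship behind such measurements is simple arithmetic: conductance is the reciprocal of resistance, usually reported in microsiemens. The baseline and threshold values below are invented for illustration, and - as the article notes - this kind of signal measures intensity only, not whether the emotion is positive or negative.

```python
def skin_conductance_uS(resistance_ohms):
    """Convert a measured skin resistance (ohms) to conductance in microsiemens."""
    return 1e6 / resistance_ohms

def arousal_flag(resistance_ohms, baseline_uS=2.0, delta_uS=0.5):
    """Flag heightened arousal when conductance rises a (hypothetical)
    delta above a subject's resting baseline. This detects intensity only,
    not whether the feeling is exultation or horror."""
    return skin_conductance_uS(resistance_ohms) > baseline_uS + delta_uS

print(skin_conductance_uS(500_000))  # 500 kOhm -> 2.0 uS (resting skin)
print(arousal_flag(250_000))         # 250 kOhm -> 4.0 uS -> True
```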

To determine this difference, scientists until recently had to interview subjects afterwards. Now they can also make use of near-infrared spectroscopy, or NIRS, a non-invasive brain imaging method.

“The skull is transparent to the infrared rays. We can see the blood flow in the brain,” says Balters.

NIRS can be combined with traditional EEG measurements, which feed computer software that builds models for detecting emotions.

Another computer program, called iMotions, analyses eye movements, facial expressions and other body sensors to interpret emotions better.

The driving simulator lets researchers test how drivers handle both current and future cars.

Still in driving school

Learning to trust autonomous cars is one thing, but autonomous cars are in training, too, and still learning to recognise dangerous situations.

As Balters explains, the collision-avoidance system is programmed to monitor the area low in front of the car. In the Florida accident, because of the trailer's high ride height, the radar didn't register it as an obstacle, so the driver was never alerted to take over.

Once autonomous cars actually manage to pass the driving test in all situations at some point in the future, how will they tell passengers that they can relax - even if the steering wheel is gone?

Maybe it will be with light signals and tilting of the footplate, as in the Renault experiment, or perhaps the car will talk to the passenger.

According to a paper by Steinert and colleagues in the Journal of Interactive Design and Manufacturing, people preferred it when the car explained why it did something, not just what it was doing.

“The car has to learn how each individual reacts. Each person responds differently. It’s not the sensors that are hard to understand, but interpreting them,” says Balters.

Human beings also have to be involved in building this trust.

“You need to learn the car’s behaviour and get used to it. People have to learn the car’s psychology and the car needs to learn the psychology and physiology — i.e. the body’s reactions — of human beings,” says Balters.

Title Image Credit: © Tesla Motors (Image Cropped)
