Taking back control of an autonomous car affects human steering behavior, Stanford research shows
When human drivers retake control of an autonomous car, the transition could be problematic, depending on how conditions have changed since they were last at the wheel.
There you are, cruising down the freeway, listening to some tunes and enjoying the view as your autonomous car zips and swerves through traffic. Then the fun ends and it's time to take over the wheel. How smooth is that transition going to be?
Twenty-two drivers put that question to the test – on a track, not a freeway – to find out. The results, published in the first issue of Science Robotics on Dec. 6, could help in the design of future autonomous cars.
The researchers, whose combined expertise spans autonomous car design, human-robot interaction research and neuroscience, found that the transition could be rough. Drivers who experienced certain changes in driving conditions since their last time at the wheel, such as changes in speed, went through a period of adjustment in their steering.
“Many people have been doing research on paying attention and situation awareness. That’s very important,” said Holly Russell, lead author of the research and former graduate student in the Dynamic Design Lab at Stanford University. “But, in addition, there is this physical change and we need to acknowledge that people’s performance might not be at its peak if they haven’t actively been participating in the driving.”
The trouble the drivers had getting used to the different driving conditions wasn't enough to cause them to miss their turns, but it was noticeable in the researchers' measurements and visible in the way they wobbled the wheel to correct for over- and understeering. These challenges raise the possibility that, depending on the particulars of the driver, the driving conditions and the autonomous system being used, the transition back to human-controlled driving could be an especially risky window of time.
Mimicking a transition from self-driving
Study participants drove a 15-second course consisting of a straightaway and a lane change. Then they took their hands off the wheel and the car took over, bringing them back to the start. After going through this process four times, they drove the course 10 additional times with steering conditions that were modified to represent changes in speed or steering that may occur while the car drives itself.
Changing the steering ratio from the standard 15:1 to 2:1 made the car turn more sharply for a given steering wheel movement, simulating the more sensitive steering feel drivers experience at higher speeds, where less steering wheel movement is needed to make a lane change.
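To make the numbers concrete, here is a minimal sketch, not drawn from the study's materials, of the steering-ratio arithmetic, assuming the common definition that the ratio is steering-wheel angle divided by road-wheel angle; the 5-degree road-wheel angle is an illustrative value, not a figure from the experiment.

```python
# Sketch of the steering-ratio arithmetic described above.
# Assumes ratio = steering-wheel angle / road-wheel angle.
# The 5-degree road-wheel angle is illustrative, not from the study.

def steering_wheel_angle(road_wheel_angle_deg: float, ratio: float) -> float:
    """Degrees of steering-wheel rotation needed for a given road-wheel angle."""
    return road_wheel_angle_deg * ratio

for ratio in (15.0, 2.0):
    needed = steering_wheel_angle(5.0, ratio)
    print(f"At a {ratio:.0f}:1 ratio, a 5-degree road-wheel angle "
          f"requires about {needed:.0f} degrees at the steering wheel.")
```

Under these assumptions, the same lane-change maneuver that takes roughly 75 degrees of steering wheel rotation at 15:1 takes only about 10 degrees at 2:1, which is why the modified car felt so much more sensitive.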
All drivers were given advance warning of the changes and had some opportunity to probe the difference during the straightaway. Regardless, during the altered steering ratio trials, the drivers' steering maneuvers differed significantly from their steering before the experimental modification.
“Even knowing about the change, being able to make a plan and do some explicit motor planning for how to compensate, you still saw a very different steering behavior and compromised performance,” said Lene Harbott, co-author of the research and research associate in the Revs Program at Stanford.
The participants also drove the course another six times, after being taken back to the start by the car, with the original conditions restored. Again, drivers who had experienced the steering ratio change displayed a clear period of adjustment, undershooting the steering wheel turn required to complete their lane change.
In neuroscience this is explained as a difference between explicit and implicit learning, said Ilana Nisky, co-author of the study and senior lecturer at Ben-Gurion University in Israel. Even when a person is aware of a change, their implicit motor control doesn't know what that change means and can only figure out how to react through experience.
A classic neuroscience test
This driving test is close to a real-life version of a classic neuroscience experiment that assesses motor adaptation. In one example of these experiments, participants use a hand control to move a cursor on a screen to specific points. The way the cursor moves in response to their control is adjusted during the experiment and they, in turn, change their movements to make the cursor go where they want it to go.
Just as in the driving test, people who take part in this experiment have to adjust to changes in how the controller moves the cursor. They also must adjust a second time if the original response relationship is restored.
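For readers unfamiliar with these experiments, here is a minimal sketch of one common way such a perturbation is implemented in the lab, as a visuomotor rotation of the cursor; the 30-degree rotation and all names below are illustrative assumptions, not details from this study.

```python
import math

# Sketch of a visuomotor-rotation style perturbation, one common way cursor
# adaptation experiments are set up. The 30-degree rotation and all names
# here are illustrative assumptions, not details from this study.

def cursor_position(hand_x: float, hand_y: float, rotation_deg: float = 0.0):
    """Map hand position to cursor position, optionally rotated about the origin."""
    theta = math.radians(rotation_deg)
    cx = hand_x * math.cos(theta) - hand_y * math.sin(theta)
    cy = hand_x * math.sin(theta) + hand_y * math.cos(theta)
    return cx, cy

# Baseline trials: the cursor follows the hand directly.
print(cursor_position(10.0, 0.0))

# Perturbed trials: the same hand movement lands the cursor 30 degrees away,
# so participants gradually adapt their reaches to compensate, and overshoot
# in the opposite direction (the after-effect) once the rotation is removed.
print(cursor_position(10.0, 0.0, rotation_deg=30.0))
```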
“Even though there are really substantial differences between these classic experiments and the car trials, you can see this basic phenomenon of adaptation and then the after-effect of adaptation,” said Nisky. “What we learn in the laboratory studies of adaptation in neuroscience actually extends to real life.”
The study also showed that the effect and after-effect of motor adaptation apply to skilled tasks that people have learned over a long period of time.
What this means for autonomous cars
Although these drivers were not so thrown off by the changes in steering that they drove off-course, the fact that there is a period of altered steering behavior is still significant. There are so many different variables involved in driving that anything that compromises driving performance could lead to an accident.
In this research, the test vehicle was developed at Stanford and doesn’t represent any system currently available. The study addressed one specific example of handover, but there is still a lot to learn about how people respond in other circumstances, depending on the type of car, the driver and how the driving conditions have changed.
“If someone is designing a method for automated vehicle handover, there will need to be detailed research on that specific method,” said Harbott. “This study is the tip of the iceberg.”
Additional co-authors on this paper include Allison Okamura, professor of mechanical engineering, member of Stanford Bio-X and member of the Stanford Neurosciences Institute; Chris Gerdes, professor of mechanical engineering at Stanford; and Selina Pan, former postdoctoral fellow in Stanford’s Dynamic Design Lab.
This work is funded by the Revs Program at Stanford University, the Toyota Class Action Settlement Safety Research and Education Program, the Israel Science Foundation, the Helmsley Charitable Trust through the Agricultural, Biological and Cognitive Robotics Initiative of Ben-Gurion University of the Negev and the U.S. National Science Foundation.