April 17, 2023
By Patricia Waldron
The town of Linköping, Sweden, has a small fleet of autonomous electric buses that carry riders along a predetermined route. The bright vehicles, emblazoned with the tagline “Ride the Future,” have one main problem: Pedestrians and cyclists regularly get too close, causing the buses to brake suddenly and making riders late for work.
Researchers saw this problem as an opportunity to design new ways of using sound to help autonomous vehicles navigate complex social situations in traffic. Sound remains underexplored as a tool for enabling autonomous vehicles and robots to interact with humans and with each other.
The research team found that jingles and beeps effectively move people out of the way. But more importantly, they discovered it’s the timing of the sound – not the sound itself – that allows the bus to meaningfully communicate with people in traffic.
“If we want to create sounds for social engagement, it's really about shifting the focus from ‘what’ sound to ‘when’ sound,” said study co-author Malte Jung, associate professor of information science in the Cornell Ann S. Bowers College of Computing and Information Science (Cornell Bowers CIS).
Lead author Hannah Pelikan, a recent visiting scholar in the Department of Information Science at Cornell Bowers CIS and doctoral student at Linköping University, presented their study, “Designing Robot Sound-In-Interaction: The Case of Autonomous Public Transport Shuttle Buses,” on March 15 at the 2023 ACM/IEEE International Conference on Human-Robot Interaction. The work received a nomination for the best design paper award.
The researchers designed potential bus sounds through an iterative process: They played sounds through a waterproof Bluetooth speaker on the outside of the bus, analyzed video recordings of the resulting interactions, and used that information to select new sounds to test. Either the researchers or a safety driver, who rides along in case the bus gets stuck, triggered the sounds to warn pedestrians and cyclists.
Initially, the researchers tried humming sounds that became louder as people got closer, but low-pitched humming blended into the road noise and a high-pitched version irritated the safety drivers. The repeated sound of a person saying “ahem” was also ineffective.
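As a rough illustration of the distance-scaled warning described above, the sketch below maps an approaching person's distance to a speaker volume that rises as they get closer. The thresholds and the function name are hypothetical assumptions for illustration, not details from the study.

```python
def humming_volume(distance_m: float,
                   max_range_m: float = 15.0,
                   min_range_m: float = 2.0) -> float:
    """Map pedestrian distance to a 0.0-1.0 speaker volume.

    Beyond max_range_m the hum is silent; inside min_range_m it is at
    full volume; in between, volume rises linearly as the person gets
    closer. All thresholds here are illustrative assumptions.
    """
    if distance_m >= max_range_m:
        return 0.0
    if distance_m <= min_range_m:
        return 1.0
    return (max_range_m - distance_m) / (max_range_m - min_range_m)


# Example: a cyclist at 6 m would trigger a fairly loud hum.
print(round(humming_volume(6.0), 2))  # 0.69
```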
They found that “The Wheels on the Bus” and a similar jingle successfully signaled cyclists to clear out before the brakes engaged. The song also elicited smiles and waves from pedestrians, possibly because it reminded them of an ice cream truck, and may be useful for attracting new riders, they concluded.
Standard vehicle noises – beeps and dings – also worked to grab people’s attention; repeating or speeding up the sounds communicated that pedestrians needed to move farther away.
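One way to realize the "repeat or speed up to signal urgency" pattern is to shorten the gap between beeps as the situation becomes more pressing. The sketch below is a minimal, hypothetical illustration; the timing constants and the `play_beep` placeholder are assumptions, not the system used on the Linköping buses.

```python
import time


def play_beep() -> None:
    # Placeholder for sending a beep to the bus's external speaker.
    print("beep")


def urgent_beeps(urgency: float, n_beeps: int = 4) -> None:
    """Play a fixed number of beeps, spaced more tightly as urgency rises.

    urgency is assumed to lie in [0, 1]; at 0 the beeps are 1.0 s apart,
    at 1 they are 0.2 s apart. These values are illustrative only.
    """
    gap = 1.0 - 0.8 * max(0.0, min(1.0, urgency))
    for _ in range(n_beeps):
        play_beep()
        time.sleep(gap)


# A pedestrian lingering close to the route gets rapid-fire beeps.
urgent_beeps(urgency=0.9)
```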
In analyzing the videos, Pelikan and Jung saw that regardless of which sound they played, the timing and duration were most important for signaling the bus’ intentions – just as the honk of a car horn can be a warning or a greeting. A sound that comes too late becomes incomprehensible and is ignored as a result.
These insights came from applying conversation analysis, an interdisciplinary approach influenced by sociology, anthropology, and interactional linguistics, which has not been used previously for robot sound design. By transcribing the pedestrians’ reactions in the video recordings in great detail, the researchers were able to see the moment-by-moment impact of the sounds during a traffic interaction.
“We looked very much at the interaction component,” Pelikan said. “How can sound help to make a robot, bus, or other machine explainable in some way, so you immediately understand?”
The study’s approach represents a new way of designing sound that is applicable to any autonomous system or robot, the researchers said. While most sound designers work in quiet labs and create sounds to convey specific meanings, this approach uses the bus as a laboratory to test how people will respond to the sounds in the wild.
“We’ve approached sound design all wrong in human-robot interaction for the past decades,” Jung said. “We wanted to really rethink this and bring in a new perspective.”
Pelikan and Jung said their findings also underline another important factor for autonomous vehicle design: Traffic is a social phenomenon. While societies may have established rules of the road, people are constantly communicating through their horns, headlights, turn signals and movements. Pelikan and Jung want to give autonomous vehicles a better way to participate in the conversation.
The research received funding from the Swedish Research Council and the National Science Foundation.
Patricia Waldron is a writer for the Cornell Ann S. Bowers College of Computing and Information Science.
This story was originally published in the Cornell Chronicle.