Talking with touch
Nadine Sarter is pioneering the use of tactile interfaces to build better conversations between machines and humans
Nadine Sarter sees touch as an effective but underutilized method of communication, and the newly elected member of the National Academy of Engineering is working to bring touch-based interfaces to fields like aviation, the military and health care. She is a certified pilot, director of the U-M Center for Ergonomics, and the Richard W. Pew Collegiate Professor of Industrial and Operations Engineering.
Every time Nadine Sarter looks up from her desk, she sees a flight deck—the cockpit of a modern commercial airliner to be exact. The photo is posted directly across from her chair.
“I like fast things,” she explains. “And I am interested in the complexity of the systems on a flight deck. I started out working in maritime in Germany, but when I moved to the US, I joined a lab that had funding to work in aviation. I thought to myself, ‘Finally. Something fast.’”
On modern flight decks, and in other complex environments that require rapid, high-stakes decision making, humans often rely on guidance from sensor- and computer-driven systems—the area in which Sarter has made her mark. Recently, she was elected to the National Academy of Engineering for her contributions to opening a new line of human-machine communication: touch.
The Richard W. Pew Collegiate Professor of Industrial and Operations Engineering, director of the U-M Center for Ergonomics and certified pilot has dedicated much of her career to some of the most complex, high-risk technologies built by humans. Sarter helped NASA improve the human-machine interface for the robotic arm used first in the Space Shuttle program and later on the International Space Station. She worked with the FAA to develop certification guidelines for touch-based technologies on commercial flight decks. And she helped develop a system that uses tactile cues to help soldiers navigate and understand the fast-moving combat environment.
All those projects might sound wildly different, but Sarter’s research in these and other areas has a common thread: driving richer and safer interactions between people and machines.
Sarter brings a unique perspective to the field; she studied psychology at Germany’s University of Hamburg, earning a master’s degree before moving to the United States to earn a PhD in industrial and systems engineering from Ohio State University. With her interdisciplinary background comes the insight that better systems don’t always require more automation; they require smarter, more collaborative automation.
“People need to step back and think carefully about the push for automation and autonomy,” she said. “It’s easy for engineers to think of humans as the problem. But when the unexpected happens, you still need a human in the loop, and it’s very important to think about the best way to support that.”
To that end, much of her recent research has focused on refining the use of touch as a new channel to convey information from machines to humans. Next-generation flight decks use tactile interfaces to help pilots understand what the plane is doing. On the battlefield, soldiers wear arrays of coin-sized discs called tactors, which vibrate to indicate an approaching enemy.
“When I first started working with tactile interfaces, people thought I was crazy,” she said. “But in many environments, such as aviation, there were more and more visual interfaces competing for human attention. Eventually overload set in, and you needed a new channel for conveying information. We realized that touch could be that channel.”
It was Sarter’s work with tactile interfaces that ultimately led to her NAE induction. And as automation works its way into more and more complex work domains, she anticipates that this communication channel will become increasingly essential. Recently, she and her students have developed tactile interfaces for medicine, where they hold the potential to help physicians and nurses monitor patients more efficiently and thus increase patient safety.
“In a sense, designing tactile interfaces is like creating a new language,” she said. “We’re all highly trained on visual interfaces; we all know what red and yellow and green lights mean. But there are no such conventions for vibrations to the skin. It’s exciting to have the opportunity to help create them.”