Some industrial robots are hulking, highly specialized pieces of machinery that are cordoned off by cages from human factory workers.
But manufacturers have also begun experimenting with a new generation of “cobots” designed to work side by side with humans, and University of Wisconsin-Madison researchers are playing an important role in making these human-robot collaborations more natural and efficient.
Bilge Mutlu, an assistant professor of computer sciences, is working with counterparts at the Massachusetts Institute of Technology (MIT) to determine best practices for effectively integrating human-robot teams within manufacturing environments. Their research is funded by a three-year grant from the National Science Foundation (NSF) as part of its National Robotics Initiative program.
Furniture maker Steelcase, a global company headquartered in Grand Rapids, Michigan, is also a partner. “Working with world-class research universities like UW is critical to our strategy to evolve our industrial systems and develop industry-leading capabilities,” says Steelcase’s Edward Vander Bilt. “Our hope with this research is that we will learn how to extend human-robot collaboration more broadly across our operations.”
In recent years, the robotics industry has introduced new platforms that are less expensive and intended to be easier to reprogram and integrate into manufacturing. Steelcase owns four next-generation robots based on a platform called Baxter, made by Rethink Robotics. Each Baxter robot has two arms and a tablet-like panel displaying “eyes” that provide cues to help human workers anticipate what the robot will do next.
“This new family of robotic technology will change how manufacturing is done,” says Mutlu. “New research can ease the transition of these robots into manufacturing by making human-robot collaboration better and more natural as they work together.”
Mutlu directs UW-Madison’s Human-Computer Interaction Laboratory and serves as the principal investigator on the UW side of the collaboration. He works closely with Julie A. Shah, an assistant professor of aeronautics and astronautics at MIT.
Mutlu’s team is building on previous work related to topics such as gaze aversion in humanoid robots, robot gestures and the issue of “speech and repair.” For example, if a human misunderstands a robot’s instructions or carries them out incorrectly, how should the robot correct the human?
At MIT, Shah breaks down the components of human-robot teamwork and tries to determine who should perform various tasks. Mutlu’s work complements Shah’s by focusing on how humans and robots actually interact.
“People can sometimes have difficulty figuring out how best to work with or use a robot, especially if its capabilities are very different from people’s,” says Shah. “Automated planning techniques can help bridge the gap in our capabilities and allow us to work more effectively as a team.”
Over the summer, UW-Madison computer sciences graduate student Allison Sauppé traveled to Steelcase headquarters to learn more about its efforts to incorporate Baxter into the production line. She found that perceptions of Baxter varied according to employees’ roles.
While managers tended to see Baxter as part of the overall system of automation, front-line workers had more complex feelings. “Some workers saw Baxter as a social being or almost a co-worker, and they talked about Baxter as if it were another person,” she says. “They unconsciously attributed human-like characteristics to it.”
Source: University of Wisconsin-Madison