
Augmented reality enhances robot collaboration

Posted March 24, 2018

Thousands of exciting and novel applications of augmented reality and robotic technologies have emerged in recent years, but the potential for networking these two technologies and using them in conjunction with each other has gone relatively unexplored. However, two papers presented by the ATLAS IRON Lab last week at the ACM/IEEE International Conference on Human Robot Interaction in Chicago open the door to this promising area of research, paving the way for more seamless integration of robots into modern life.

Researcher points at airborne drone

Recognizing the value of their innovative work, conference organizers awarded the IRON Lab teams best paper and runner-up best paper in the Design category. Assistant Professor Dan Szafir, who directs the IRON Lab, explains that both papers examine the potential for transmitting real-time visual information from drones to people with AR. In the first study, research participants completing an assembly task while sharing a workspace with a drone were more efficient when informed of the drone’s flightpath through AR than when tracking its path without assistance. In the second study, drone photography proved safer and more accurate when a drone camera’s field of view was streamed to operators’ AR displays instead of tablet screens, as is the norm today.

Research subject looking at drone through AR headset

To conduct the first study, researchers set up an environment resembling a small warehouse, where participants were assigned the task of stringing beads in a specific color order. The task required them to move among six assembly stations while remaining at a safe distance from the drone at all times. Their goal was to assemble as many beaded strings as possible in eight minutes; when the drone approached, they had to stop work and move to a different workstation.

The results showed that participants were more efficient when the drone’s imminent flightpath was communicated with AR. Furthermore, the study evaluated tradeoffs among several graphical approaches to communicating the drone’s flightpath, which may help guide the design of future AR interfaces.

The second study found that AR technology helped drone operators take photos more safely and accurately. Using a drone-mounted camera, research subjects were asked to photograph framed targets on a wall as quickly and precisely as possible. The drone camera’s field of view was streamed to operators either on a handheld tablet or through AR, in several graphical configurations.

Results were judged by how quickly subjects completed the task, the accuracy of their photos and the number of times they crashed. Once again, the study found AR significantly improved performance, increasing accuracy and reducing the number of crashes, with some AR graphical approaches proving more effective than others.

Research subject wears AR headset with airborne drone nearby

“As the world moves towards integrating humans and robots in the workplace, effective collaboration depends on the ability of team members to rapidly understand and predict a robot’s behavior, something that human workers do through facial expressions, gestures and speech,” says Szafir. “Human workers want to know explicitly when and where their robot coworker intends to move next, and they perform best when they can anticipate those movements. We are excited to be exploring how to leverage augmented reality to communicate this information in new and more effective ways.”

The two studies, supervised by Szafir over a 12-month period, were conducted by PhD students Michael Walker and Hooman Hedayati, along with master’s student Jennifer Lee. The IRON Lab launched in January 2016, and Szafir made the Forbes “30 Under 30: Science” list in January 2017; these latest commendations from the world’s preeminent HRI conference set expectations high for this ambitious and growing group of researchers.

Communicating Robot Motion Intent with Augmented Reality by Michael Walker, Hooman Hedayati, Jennifer Lee, and Daniel Szafir (Best Paper—Design, ACM/IEEE International Conference on Human Robot Interaction, 2018)

Improving Collocated Robot Teleoperation with Augmented Reality by Hooman Hedayati, Michael Walker and Daniel Szafir (Runner-Up Best Paper—Design, ACM/IEEE International Conference on Human Robot Interaction, 2018)


Source: University of Colorado Boulder
