Augmented Reality for Maintenance and Repair (ARMAR)

Steve Henderson and Steven Feiner
Computer Graphics and User Interfaces Laboratory
Columbia University

Introduction

Augmented Reality for Maintenance and Repair (ARMAR) explores the use of augmented reality to aid in the execution of procedural tasks in the maintenance and repair domain. The principal research objective of this project is to determine how real-time computer graphics, overlaid on and registered with the equipment being repaired, can improve the productivity, accuracy, and safety of maintenance personnel. Head-worn, motion-tracked displays augment the user's physical view of the system with information such as sub-component labeling, guided maintenance steps, real-time diagnostic data, and safety warnings. Virtualizing the user and the maintenance environment allows off-site collaborators to monitor and assist with repairs. Additionally, integrating real-world knowledge bases with detailed 3D models provides opportunities to use the system as a maintenance simulator and training tool. This project features the design and implementation of prototypes integrating the latest motion tracking, mobile computing, wireless networking, 3D modeling, and human-machine interface technologies.
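A concrete, if simplified, way to picture the registration step: the sketch below (not ARMAR's implementation) takes a component's position in the tracked world frame, transforms it by the head pose reported by the tracker, and projects it through a pinhole model to decide where a label should appear in the head-worn display. The pose convention (camera looking down +z), the intrinsics, and all coordinates are illustrative assumptions.

import numpy as np

def world_to_screen(p_world, R_head, t_head, fx, fy, cx, cy):
    """Project a world-space point into display pixel coordinates.

    R_head, t_head: tracked head pose (world-to-eye rotation and translation).
    fx, fy, cx, cy: pinhole intrinsics of the calibrated display/camera.
    Returns (u, v) in pixels, or None if the point is behind the viewer.
    """
    p_eye = R_head @ np.asarray(p_world, dtype=float) + np.asarray(t_head, dtype=float)
    if p_eye[2] <= 0:                      # behind the eye: hide the label
        return None
    u = fx * p_eye[0] / p_eye[2] + cx
    v = fy * p_eye[1] / p_eye[2] + cy
    return u, v

# Example: anchor a label to a fastener whose position is known in the
# engine model and already registered to the tracker's world frame.
fastener_pos = [0.42, -0.10, 0.85]                  # meters, illustrative
R = np.eye(3)                                       # head pose: no rotation...
t = np.array([0.0, 0.0, 1.5])                       # ...eye 1.5 m behind the part
print(world_to_screen(fastener_pos, R, t, 800, 800, 640, 360))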

(Left) A mechanic wearing a tracked head-worn display performs a maintenance task on a Rolls Royce DART 510 Engine. (Right) A view through the head-worn display depicts information provided using augmented reality to assist the mechanic.



Research Directions



(Left) A mechanic wearing a tracked head-worn display performs a maintenance task inside an LAV-25A1 armored personnel carrier. (Right) The AR condition in the study: a view through the head-worn display, captured in a similar domain, depicts information provided using augmented reality to assist the mechanic. (The view through the head-worn display for the LAV-25A1 domain was not cleared for publication due to security restrictions, necessitating the substitution of images from an alternative domain.)

Benefits of using AR for Task Localization

As part of our exploration into the potential benefits of using AR for maintenance and repair, we designed, implemented, and user tested a prototype augmented reality application to support military mechanics conducting routine maintenance tasks inside an armored vehicle turret [Henderson & Feiner, ISMAR 2009; Henderson & Feiner, TVCG 2011]. Our prototype uses a tracked head-worn display to augment a mechanic's natural view with text, labels, arrows, and animated sequences designed to facilitate task comprehension, localization, and execution. A within-subject controlled user study examined professional military mechanics using our system to complete 18 common tasks under field conditions. These tasks included installing and removing fasteners and indicator lights, and connecting cables, all within the cramped interior of an armored personnel carrier turret. The augmented reality condition was tested against two baseline conditions: an untracked head-worn display presenting text and graphics, and a fixed flat-panel display representing an improved version of the laptop-based documentation currently employed in practice. The augmented reality condition allowed mechanics to locate tasks more quickly than either baseline and, in some instances, resulted in less overall head movement. A qualitative survey showed that mechanics found the augmented reality condition intuitive and satisfying for the tested sequence of tasks.
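One simple way such an attention-directing cue could be driven is sketched below: if the projected target falls within the display, an on-target label is shown; otherwise an arrow is clamped to the display edge, pointing toward the off-screen target. This is a hedged illustration, not the cue design evaluated in the study; the function name, margin, and coordinates are hypothetical.

import math

def localization_cue(target_px, screen_w, screen_h, margin=40):
    """Return ('label', (u, v)) when the target is visible on screen, else
    ('arrow', (u, v), angle): an edge position and a pointing angle in radians."""
    u, v = target_px
    cx, cy = screen_w / 2.0, screen_h / 2.0
    on_screen = (margin <= u <= screen_w - margin) and (margin <= v <= screen_h - margin)
    if on_screen:
        return ('label', (u, v))
    # Direction from the display center toward the off-screen target.
    dx, dy = u - cx, v - cy
    angle = math.atan2(dy, dx)
    # Clamp the arrow position to the display border, inset by the margin.
    scale = min((cx - margin) / abs(dx) if dx else float('inf'),
                (cy - margin) / abs(dy) if dy else float('inf'))
    return ('arrow', (cx + dx * scale, cy + dy * scale), angle)

print(localization_cue((1500, 200), 1280, 720))   # target to the right of the view
print(localization_cue((640, 360), 1280, 720))    # target centered in the view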

Benefits of using AR for Task Performance

The next step in our work [Henderson & Feiner, ISMAR 2011] addressed how augmented reality could assist users in the relatively under-explored psychomotor phase of procedural tasks, the phase in which the user begins physical manipulations and thus alters aspects of the underlying task environment. Building on our earlier work that showed benefits for task localization, we extended our prototype to track the user and multiple components in a typical maintenance assembly task, and to provide dynamic, prescriptive, overlaid instructions on a see-through head-worn display in response to the user's ongoing activity. A user study showed that participants completed psychomotor aspects of the assembly task significantly faster and with significantly greater accuracy with AR than with 3D-graphics-based assistance presented on a stationary LCD. Qualitative questionnaire results indicated that participants overwhelmingly preferred the AR condition and ranked it as more intuitive than the LCD condition.
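As a rough illustration of how overlaid guidance can respond to tracked activity, the sketch below compares a tracked component's pose against the current step's target pose and reports either completion or the remaining correction for the overlay to render. It is a minimal example under assumed conventions, not the prototype's logic; the step structure and tolerances are invented for illustration.

import numpy as np

def pose_error(R_cur, t_cur, R_goal, t_goal):
    """Return (translation error in meters, rotation error in radians)."""
    t_err = float(np.linalg.norm(np.asarray(t_cur, dtype=float) - np.asarray(t_goal, dtype=float)))
    R_delta = np.asarray(R_goal).T @ np.asarray(R_cur)    # relative rotation
    cos_a = (np.trace(R_delta) - 1.0) / 2.0                # angle of that rotation
    r_err = float(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    return t_err, r_err

def update_guidance(step, R_cur, t_cur, t_tol=0.005, r_tol=np.radians(5)):
    """Decide whether the current assembly step is complete; otherwise report
    the remaining translation/rotation so corrective cues can be rendered."""
    t_err, r_err = pose_error(R_cur, t_cur, step['R_goal'], step['t_goal'])
    if t_err < t_tol and r_err < r_tol:
        return {'state': 'complete'}
    return {'state': 'in_progress', 'translate_by': t_err, 'rotate_by': r_err}

# Example: a step asking for a component to be seated at a target pose.
step = {'R_goal': np.eye(3), 't_goal': np.array([0.20, 0.00, 0.50])}
print(update_guidance(step, np.eye(3), [0.20, 0.00, 0.50]))   # -> complete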


(Left) A user manipulates 3D virtual buttons while receiving haptic feedback from the underlying grooves of an engine compression section. (Right) User sketches of possible natural interfaces constructed with opportunistic surfaces.

Interaction Techniques

One of ARMAR's research directions has examined which types of interaction techniques are well suited to conducting AR-assisted procedural tasks. This research led to the creation of Opportunistic Controls [Henderson & Feiner, VRST 2008; Henderson & Feiner, TVCG 2010], a class of user interaction techniques for augmented reality applications that support gesturing on, and receiving feedback from, otherwise unused affordances already present in the domain environment. Opportunistic Controls leverage characteristics of these affordances to provide passive haptics that ease gesture input, simplify gesture recognition, and provide tangible feedback to the user. 3D widgets are tightly coupled with the affordances to provide visual feedback and hints about the functionality of each control. While not suitable for all user interface scenarios, this technique may be a good choice for procedural tasks that demand the user's hand and eye focus and that limit the use of other interaction techniques.
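A hedged sketch of the basic mechanism follows: a virtual button is registered to an otherwise unused physical surface, and a "press" is detected by testing the tracked fingertip against the button's region and requiring a short dwell, so the surface itself supplies the passive haptic feedback. The class, geometry, and thresholds are illustrative assumptions, not the published implementation.

import time
import numpy as np

class OpportunisticButton:
    """A virtual button registered to an unused physical affordance."""

    def __init__(self, name, center, radius=0.015, dwell_s=0.3):
        self.name = name
        self.center = np.asarray(center, dtype=float)   # world position (m)
        self.radius = radius                            # touch region radius (m)
        self.dwell_s = dwell_s                          # dwell time to confirm (s)
        self._touch_start = None

    def update(self, fingertip, now=None):
        """Return True once the tracked fingertip has dwelt on the button."""
        now = time.monotonic() if now is None else now
        touching = np.linalg.norm(np.asarray(fingertip, dtype=float) - self.center) <= self.radius
        if not touching:
            self._touch_start = None
            return False
        if self._touch_start is None:
            self._touch_start = now
        return (now - self._touch_start) >= self.dwell_s

# Example: a "next step" button mapped onto a groove of the compression section.
next_btn = OpportunisticButton('next_step', center=[0.30, 0.12, 0.55])
print(next_btn.update([0.30, 0.12, 0.55], now=0.0))   # touch begins -> False
print(next_btn.update([0.30, 0.12, 0.55], now=0.4))   # dwell elapsed -> True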


Publications

Steven Henderson and Steven Feiner, "Augmented Reality in the Psychomotor Phase of a Procedural Task", Proceedings of IEEE International Symposium on Mixed and Augmented Reality (ISMAR '11), October 2011, Basel, Switzerland, pp. 191-200. (recipient Best Science and Technology Student Paper Award) (pdf)

Steven Henderson and Steven Feiner, "Exploring the Benefits of Augmented Reality Documentation for Maintenance and Repair", IEEE Transactions on Visualization and Computer Graphics (TVCG), October 2011 (vol. 17, no. 10), pp. 1355-1368. (pdf)

Steven Henderson and Steven Feiner, "Opportunistic Tangible User Interfaces for Augmented Reality", IEEE Transactions on Visualization and Computer Graphics (TVCG), January/February 2010 (vol. 16, no. 1), pp. 4-16. (pdf)

Steven Henderson and Steven Feiner, "Evaluating the Benefits of Augmented Reality for Task Localization in Maintenance of an Armored Personnel Carrier Turret", Proceedings of IEEE International Symposium on Mixed and Augmented Reality (ISMAR '09), October 2009, pp. 135-144. (recipient Best Paper Award)   (pdf)

Steven Henderson and Steven Feiner, "Opportunistic Controls: Leveraging Natural Affordances as Tangible User Interfaces for Augmented Reality", Proceedings ACM Virtual Reality Software and Technology (VRST '08), Oct 2008, pp. 211-218. (recipient Best Paper Award)  (pdf)

Steven Henderson and Steven Feiner, "Augmented and Mixed Reality for Training," The PSI Handbook of Virtual Environments for Training and Education, Ed. Dylan Schmorrow, Joseph Cohn, and Denise Nicholson", Praeger Security International, Westport, CT, May 2008. (link)

Steven Henderson and Steven Feiner, "Augmented Reality for Maintenance and Repair (ARMAR)", Technical Report AFRL-RH-WP-TR-2007-0112, United States Air Force Research Lab, Jul 2007. (pdf)



Images


ARMAR Collection on Flickr

Acknowledgments

This research is funded in part by AFRL Grant FA8650-05-2-6647, ONR Grant N00014-04-1-0005, and generous gifts from NVIDIA and Google. We thank Bengt-Olaf Schneider, who provided the StereoBLT SDK used to support the display of stereo camera imagery, and Kyle Johnsen, who advised on the use of the OptiTrack system. We are grateful for the assistance of cadre and students at Aberdeen Proving Ground, as well as the support of engineers at the Marine Corps Logistics Base, including Mike Shellem, Curtis Williams, Andrew Mitchell, and Alan Butterworth. We also thank David Madigan and Magnus Axholt for insights shared during the design and analysis of our experiments.



Please send comments to Steve Henderson