We propose to explore ways of combining headworn and handheld augmented reality for mobile computing. The goal is to create
interaction techniques suitable for future mobile user interfaces in which users may have lightweight augmented reality eyewear in addition to handheld displays, all of which are tracked in 6DOF relative to each other. One interaction paradigm that we would like to investigate uses see-through eyewear to augment the user’s view of their interactions with a handheld phone, directly annotating the phone’s keypad and touchscreen or producing a larger contextual surround within which the physically small phone is situated. Another approach takes advantage of the internal cameras in one or more tracked phones to produce an augmented reality built from both the egocentric view of the headworn display and the exocentric views of the phones (e.g., capturing the rear view of an object in front of the user, or a view around a corner, as seen from the handheld phone).
To concentrate on the design of interaction techniques, we will exploit tracking technology already available in our lab, including external optical trackers and the orientation tracking built into the headworn displays and phones. Some prototypes may drive the headworn display with an external laptop, while others may drive it with a second phone. We may also explore the use of a wristworn phone instead of, or in addition to, a handheld one.
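For intuition about the built-in orientation tracking mentioned above, the following is a minimal sketch of a complementary filter of the kind commonly used to fuse inertial sensors; the gains and sensor values are hypothetical and not drawn from any of the actual devices.

```python
# Hedged sketch: a one-axis complementary filter blending integrated
# gyroscope rate (smooth but drifting) with an accelerometer-derived
# angle (noisy but drift-free). All values are illustrative.

def complementary_filter(pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """Return an updated pitch estimate: mostly trust the gyro over one
    step, but let the accelerometer slowly correct accumulated drift."""
    return alpha * (pitch + gyro_rate * dt) + (1.0 - alpha) * accel_pitch

# Example: a stationary device whose estimate starts off by 0.1 rad and
# whose gyro carries a small bias; the accelerometer reads level (0 rad).
pitch = 0.1
for _ in range(200):
    pitch = complementary_filter(pitch, gyro_rate=0.002,
                                 accel_pitch=0.0, dt=0.01)
# The accelerometer term pulls the estimate back toward level despite
# the gyro bias, illustrating why the fused estimate resists drift.
```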