Authoring Environments for Mobile Augmented and Virtual Reality

Sinem Güven and Steven Feiner

Columbia University
Computer Graphics & User Interfaces Lab

Introduction

Most existing authoring systems for wearable augmented and virtual reality experiences concentrate on creating separate media objects and embedding them within the user's surroundings; designing narrative multimedia experiences for such environments remains a largely tedious manual task. We present a MARS (Mobile Augmented Reality Systems) Authoring Tool for creating and editing 3D hypermedia narratives that are interwoven with a wearable computer user's surrounding environment. Our system is designed for authors who are not programmers, and allows them to preview their results on a desktop workstation, as well as with an augmented or virtual reality system.

Using the MARS Authoring Tool, we have created several Situated Documentaries that tell the stories of events that took place on our campus. Users can experience these Situated Documentaries using our experimental wearable MARS backpack. This mobile prototype uses a tracked see-through head-worn display to overlay 3D graphics, imagery, and sound on top of the real world.
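Conceptually, a documentary produced with the tool is a graph of story nodes: 3D icons anchored at physical locations, each holding a menu of media clips, with hyperlinks connecting related clips across nodes. The following sketch illustrates one plausible shape for such a structure in Python; all class, field, and file names here are hypothetical, not the tool's actual data model.

    from dataclasses import dataclass, field

    @dataclass
    class Clip:
        """A media snippet (video, image, or audio) presented at a story node."""
        title: str
        media_uri: str
        links: list["Clip"] = field(default_factory=list)  # hyperlinks to related clips

    @dataclass
    class StoryNode:
        """A 3D icon anchored at a physical location, holding a menu of clips."""
        name: str
        position: tuple[float, float, float]  # world coordinates of the icon
        clips: list[Clip] = field(default_factory=list)

    # Example: two linked nodes, loosely modeled on the campus documentaries.
    revolt = StoryNode("Revolt", (10.0, 0.0, -4.0))
    revolt2 = StoryNode("Revolt2", (25.0, 0.0, 12.0))
    intro = Clip("Revolt intro", "revolt_intro.mpg")
    followup = Clip("Revolt, part 2", "revolt2.mpg")
    intro.links.append(followup)   # following a link leads to a clip at another node
    revolt.clips.append(intro)
    revolt2.clips.append(followup)
    print(revolt.name, "->", intro.links[0].title)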


Example Pictures

Each thumbnail image links to the originally captured image (mostly VGA resolution) in JPEG format. Note that the gamma values of the images have not been adjusted, so the images may need processing (gamma, contrast) before use. Please contact Prof. Steven Feiner if you are interested in obtaining permission to use the images.

© Computer Graphics and User Interfaces Lab, Columbia University


  • A clip being previewed on the desktop via the presentation component.
  • A clip being presented in AR by the presentation component.
  • Initial view of a documentary in AR.
  • The selected icon is highlighted and presents the user with its contents. The arrow next to a clip provides access to linked clips.
  • A screen-stabilized image snippet remains fixed at its screen position as the user looks around.
  • A world-stabilized image snippet remains fixed at its world location as the user looks around.
  • A screen-stabilized world in miniature indicates the user's current location and the Revolt2 icon containing the linked clip. The user has turned toward the Revolt2 icon, but hasn't selected it, so the clip menu for Revolt is still displayed.
  • A scene from the Armstrong documentary in VR.
  • Three 3D icons representing three different locations of interest.
  • An image snippet associated with the second floor of Low Library.
  • The author can zoom in and navigate within the 3D environment of the MARS Authoring Tool for precise positioning of snippets. Snippets and 3D icons are billboarded to always face the user (see the sketch below).
  • Linking clips associated with icons representing locations in the authoring component.
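Billboarding, mentioned in the captions above, keeps a flat snippet or icon legible from any viewpoint by rotating it about its anchor point so that it always faces the viewer. Below is a minimal sketch of the idea in Python with NumPy; it illustrates the general technique, not the tool's actual code, and the function and variable names are our own.

    import numpy as np

    def billboard_matrix(obj_pos, cam_pos, world_up=(0.0, 1.0, 0.0)):
        """Rotation turning a quad's local +Z axis toward the camera.

        Returns a 3x3 matrix whose columns are the quad's right, up, and
        forward axes in world space (a "spherical" billboard). Assumes the
        view direction is not parallel to world_up.
        """
        forward = np.asarray(cam_pos, float) - np.asarray(obj_pos, float)
        forward /= np.linalg.norm(forward)      # unit vector toward the viewer
        right = np.cross(world_up, forward)
        right /= np.linalg.norm(right)          # orthogonal to up and forward
        up = np.cross(forward, right)           # completes the orthonormal frame
        return np.column_stack((right, up, forward))

    # A snippet 5 m ahead and 2 m to the right of a user whose eyes are at 1.7 m.
    print(billboard_matrix((2.0, 0.0, -5.0), (0.0, 1.7, 0.0)))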

Example Movie

The following video demonstrates our system in action. It shows how an indoor user, placed within a backdrop sphere texture-mapped with an omnidirectional image of Columbia's College Walk, experiences hyperlinked multimedia presentations. Interlinked virtual fist icons denote story nodes; they can be selected with a wireless handheld trackball and explored in detail to learn about the (in this case, historic) information associated with them. The video was captured by a camera pointing through the system's VR glasses, and requires the DivX codec for viewing.
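Surrounding the indoor user with the campus imagery amounts to texture-mapping the inside of a large sphere centered on the viewer with the omnidirectional image. The sketch below generates the geometry for such a backdrop sphere with NumPy, assuming an equirectangular image; it is a generic illustration rather than the system's implementation.

    import numpy as np

    def inward_sphere(radius=50.0, stacks=16, slices=32):
        """Vertices, texture coordinates, and inward-facing triangles for a backdrop sphere."""
        verts, uvs = [], []
        for i in range(stacks + 1):
            theta = np.pi * i / stacks               # polar angle, 0 (top) .. pi (bottom)
            for j in range(slices + 1):
                phi = 2.0 * np.pi * j / slices       # azimuth, 0 .. 2*pi
                verts.append((radius * np.sin(theta) * np.cos(phi),
                              radius * np.cos(theta),
                              radius * np.sin(theta) * np.sin(phi)))
                uvs.append((j / slices, 1.0 - i / stacks))  # equirectangular mapping
        tris = []
        for i in range(stacks):
            for j in range(slices):
                a = i * (slices + 1) + j             # index on the current ring
                b = a + slices + 1                   # same slice on the ring below
                # Winding chosen so the faces are visible from inside the sphere.
                tris += [(a, b, a + 1), (a + 1, b, b + 1)]
        return np.asarray(verts), np.asarray(uvs), np.asarray(tris)

    v, uv, f = inward_sphere()
    print(len(v), "vertices,", len(f), "triangles")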


Publications

  • S. Güven and S. Feiner, Authoring 3D Hypermedia for Wearable Augmented and Virtual Reality, In Proc. ISWC '03 (Seventh International Symposium on Wearable Computers), White Plains, NY, October 21-23, 2003, pp. 118-126. (2 MB PDF version of the paper)

  • S. Güven and S. Feiner, A Hypermedia Authoring Tool for Augmented and Virtual Reality, In NRHM 2003 (The New Review of Hypermedia and Multimedia), Special Issue on Hypermedia Beyond the Desktop, Vol. 9, Taylor & Francis, London, UK, 2003.


Acknowledgments

This research is supported in part by Office of Naval Research Contracts N00014-99-1-0394, N00014-99-1-0683, and N00014-99-1-0249, NSF Grants IIS-00-82961 and IIS-01-21239, and gifts from Microsoft Corporation.

Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the NSF or any other organization supporting this work.

Please send comments to Sinem Güven at <sinem@cs.columbia.edu>.