We are looking for subjects for an experiment that involves trying out different augmented reality user interfaces outdoors on a mobile phone. The experiment will take about an hour to complete, and you will receive $15 for your participation.
If you are interested, please contact Nicolas Dedual to make an appointment. His email address is firstname.lastname@example.org.
Thank you in advance for your help.
The award includes funding to support her dissertation on “User Interfaces for Communicating Patient Status and Progress.”
Lauren’s work addresses an important gap in health information technology: to date, there has been limited research exploring the impact of giving hospitalized patients direct access to health information throughout their care. By developing tablet-computer-based user interfaces with which hospitalized patients and their families can review clinical and health-related information, her research will yield new insights into how such technology can be used to educate and engage them. It will advance scientific knowledge in the field of patient-clinician communication, demonstrate new technical capabilities for sharing information among patients and their care team, and explore potential improvements to patient engagement, knowledge, and satisfaction.
IEEE 3DUI (7th Symposium on 3D User Interfaces), which took place March 4-5, 2012, in Costa Mesa, California, focuses on the design and development of 3D user interfaces. The poster, “Manipulating Virtual Objects in Hand-Held Augmented Reality using Stored Snapshots,” was the work of Ph.D. student Mengu Sukan, with M.S. student Semih Energin and Prof. Steve Feiner. Their work is an example of augmented reality, in which camera imagery is overlaid with live 3D graphics. The poster presents a set of interaction techniques that allow a user to first take snapshots of a scene using a tablet computer, and then jump back and forth between the snapshots, revisiting them virtually for interaction. By storing for each snapshot a still image of the scene, along with the camera position and orientation determined by computer vision software, this approach allows the overlaid 3D graphics to remain dynamic and interactive. This makes it possible for the user to move and rotate virtual 3D objects from the vantage points of different locations, without the overhead of physically traveling between those locations. 3DUI attendees tried a real-time demo in which they laid out virtual furniture. They could rapidly transition between the live view and the viewpoints of multiple snapshots as they moved and rotated items of virtual furniture, iteratively designing a desired layout.
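The core idea, storing a still image together with its recovered camera pose so that virtual objects can later be re-rendered from that viewpoint, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the `Snapshot` class, the simple pinhole projection, and all parameter values are assumptions.

```python
# Illustrative sketch of snapshot-based AR viewpoints (hypothetical names,
# simple pinhole camera model; not the implementation from the poster).
from dataclasses import dataclass
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class Snapshot:
    image: bytes                  # still image of the scene at capture time
    position: Vec3                # camera position in world coordinates
    rotation: List[List[float]]   # 3x3 world-to-camera rotation matrix
    focal_length: float           # pinhole focal length, in pixels

def project(snap: Snapshot, point: Vec3) -> Tuple[float, float]:
    """Project a world-space point into the snapshot's image plane,
    so a virtual object can be drawn over the stored still image."""
    # Transform into camera coordinates: p_cam = R * (p - t)
    d = [point[i] - snap.position[i] for i in range(3)]
    cam = [sum(snap.rotation[r][c] * d[c] for c in range(3)) for r in range(3)]
    if cam[2] <= 0:
        raise ValueError("point is behind the snapshot camera")
    # Pinhole projection onto the image plane
    return (snap.focal_length * cam[0] / cam[2],
            snap.focal_length * cam[1] / cam[2])

# Because only the object's world coordinates change when the user moves it,
# every stored snapshot can re-project it from its own frozen viewpoint.
identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
snap = Snapshot(image=b"", position=(0.0, 0.0, 0.0),
                rotation=identity, focal_length=400.0)
print(project(snap, (0.5, 0.25, 2.0)))  # → (100.0, 50.0)
```

The key property the blurb describes falls out of this structure: the still image is frozen, but the overlay is recomputed per frame from the stored pose, so the virtual content stays interactive in every snapshot.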
The paper, “Augmented Reality in the Psychomotor Phase of a Procedural Task,” reports on a key part of Steve Henderson’s spring 2011 dissertation. It presents the design and evaluation of a prototype augmented reality user interface that assists users in performing an aircraft maintenance assembly task. The prototype tracks the user and multiple physical task objects, and provides dynamic, prescriptive, overlaid instructions on a tracked, see-through, head-worn display in response to the user’s ongoing activity. A user study showed that, in the portions of the assembly task in which participants physically manipulated task objects, they were significantly faster and significantly more accurate when using augmented reality than when using 3D-graphics-based assistance presented on a stationary LCD panel.
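The "dynamic, prescriptive instructions in response to ongoing activity" pattern can be sketched as a loop that advances to the next overlay once a tracked object reaches its target pose. This is a hypothetical sketch under simple assumptions (position-only tracking, a fixed distance tolerance), not the paper's implementation.

```python
# Hypothetical sketch of activity-driven instruction selection for a
# procedural AR task (illustrative names and tolerance, not the paper's code).
import math
from dataclasses import dataclass
from typing import Dict, List, Optional, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class Step:
    object_id: str      # which tracked physical object this step concerns
    target: Vec3        # target position for that object, in meters
    instruction: str    # overlay to display while this step is incomplete

def next_instruction(steps: List[Step], poses: Dict[str, Vec3],
                     tolerance: float = 0.02) -> Optional[str]:
    """Return the overlay for the first step whose tracked object is not
    yet within `tolerance` meters of its target; None if all are done."""
    for step in steps:
        pos = poses.get(step.object_id)
        if pos is None or math.dist(pos, step.target) > tolerance:
            return step.instruction  # object missing or not yet in place
    return None  # every step satisfied: task complete

steps = [Step("bracket", (0.0, 0.0, 0.0), "Attach the bracket"),
         Step("bolt", (0.1, 0.0, 0.0), "Insert the bolt")]
poses = {"bracket": (0.001, 0.0, 0.0)}  # bracket placed; bolt not tracked yet
print(next_instruction(steps, poses))   # → Insert the bolt
```

Re-evaluating this selection every frame against live tracker output is what makes the overlaid guidance respond to the user's ongoing activity rather than requiring manual step advancement.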
This year, Prof. Steven Feiner was honored at CHI 2011 by being elected to the CHI Academy. The CHI Academy is an honorary group of individuals who have made extensive contributions to the study of HCI and who have led the shaping of the field.
The official press release can be found here.