2016
Smith, B. A. & Nayar, S. K., Mining Controller Inputs to Understand Gameplay. Proc. ACM UIST, Tokyo, Japan, 2016, 157 - 168.
Today's game analytics systems are powered by event logs, which reveal information about what players are doing but offer little insight into the types of gameplay that games foster. Moreover, the concept of gameplay itself is difficult to define and quantify. In this paper, we show that analyzing players' controller inputs using probabilistic topic models allows game developers to describe the types of gameplay, or action, in games in a quantitative way. More specifically, developers can discover the types of action that a game fosters and the extent to which each game level fosters each type of action, all in an unsupervised manner. They can use this information to verify that their levels feature the appropriate style of gameplay and to recommend levels with gameplay that is similar to levels that players like. We begin with latent Dirichlet allocation (LDA), the simplest topic model, then develop the player-gameplay action (PGA) model to make the same types of discoveries about gameplay in a way that is independent of each player's play style. We train a player recognition system on the PGA model's output to verify that its discoveries about gameplay are in fact independent of each player's play style. The system recognizes players with over 90% accuracy in about 20 seconds of playtime.
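Illustrative sketch (not from the paper): fitting an off-the-shelf LDA topic model to controller-input "documents" with scikit-learn, assuming the input stream has already been segmented into windows and tokenized into symbols such as jump or dash (all names and data below are hypothetical). The PGA model described above goes further by factoring out each player's play style.

```python
# Hypothetical sketch: windows of controller inputs as "documents",
# input symbols as "words", LDA topics as candidate "types of action".
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

windows = [                                  # illustrative tokenized gameplay windows
    "jump jump dash right right attack",
    "left left crouch block block attack",
    "right dash jump attack attack dash",
]

X = CountVectorizer().fit_transform(windows)            # window-by-symbol count matrix
lda = LatentDirichletAllocation(n_components=2, random_state=0)
theta = lda.fit_transform(X)                            # per-window mixture over action types

# Aggregating theta over all windows recorded in a level estimates the extent to
# which that level fosters each discovered type of action.
print(theta.round(2))
```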
Miau, D. & Feiner, S., Personalized Compass: A Compact Visualization for Direction and Location. Proc. ACM CHI, San Jose, CA, USA, May 7-12, 2016, 5114 - 5125.
Maps on mobile/wearable devices often make it difficult to determine the location of a point of interest (POI). For example, a POI may exist outside the map or on a background with no meaningful cues. To address this issue, we present Personalized Compass, a self-contained compact graphical location indicator. Personalized Compass uses personal a priori POIs to establish a reference frame, within which a POI in question can then be localized. Graphically, a personalized compass combines a multi-needle compass with an abstract overview map. We analyze the characteristics of Personalized Compass and the existing Wedge technique, and report on a user study comparing them. Personalized Compass performs better for four inference tasks, while Wedge is better for a locating task. Based on our analysis and study results, we suggest the two techniques are complementary and offer design recommendations.
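Illustrative sketch (not from the paper): the geometric core of a multi-needle compass is computing a bearing from the user's location to each personal a priori POI and to the target POI. The coordinates and POI names below are hypothetical.

```python
# Hypothetical sketch: compute one compass-needle angle per personal POI plus the target.
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from (lat1, lon1) to (lat2, lon2), in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

user = (40.8075, -73.9626)                              # hypothetical current location
personal_pois = {"home": (40.7300, -73.9900), "work": (40.8100, -73.9600)}
target = (40.7484, -73.9857)                            # the POI to be localized

needles = {name: bearing_deg(*user, *p) for name, p in personal_pois.items()}
needles["target"] = bearing_deg(*user, *target)
print(needles)                                          # needle angles that establish the reference frame
```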
Miau, D. & Feiner, S., Personalized Compass: A Demonstration of a Compact Visualization for Direction and Location. CHI 2016 Extended Abstracts, San Jose, CA, USA, May 7-12, 2016, 3731 - 3734.
Maps on mobile/wearable devices often make it difficult to determine the location of a point of interest (POI). For example, a POI may exist outside the map or on a background with no meaningful cues. To address this issue, we present Personalized Compass, a self-contained compact graphical location indicator. Personalized Compass uses personal a priori POIs to establish a reference frame, within which a POI in question can then be localized. Graphically, a personalized compass combines a multi-needle compass with an abstract overview map. We analyze the characteristics of Personalized Compass and the existing Wedge technique, and report on a user study comparing them. Personalized Compass performs better for four inference tasks, while Wedge is better for a locating task. Based on our analysis and study results, we suggest the two techniques are complementary and offer design recommendations. In this demonstration, we present an iOS application comparing Personalized Compass with Wedge for map-based location and direction tasks.
Elvezio, C., Sukan, M., & Feiner, S., A Framework to Facilitate Reusable, Modular Widget Design for Real-Time Interactive Systems. IEEE 9th Workshop on Software Engineering and Architectures for Realtime Interactive Systems (SEARIS), Greenville, SC, USA, March 20, 2016.
Game engines have become popular development platforms for real-time interactive systems. Contemporary game engines, such as Unity and Unreal, feature component-based architectures, in which an object's appearance and behavior are determined by a collection of component scripts added to that object. This design pattern allows common functionality to be contained within component scripts and shared among different types of objects. In this paper, we describe a flexible framework that enables programmers to design modular, reusable widgets for real-time interactive systems using a collection of component scripts.
We provide a reference implementation written in C# for the Unity game engine. Making an object, or a group of objects, part of our managed widget framework can be accomplished with just a few drag-and-drop operations in the Unity Editor. While our framework provides hooks and default implementations for common widget behavior (e.g., initialization, refresh, and toggling visibility), programmers can also define custom behavior for a particular widget or combine simple widgets into a hierarchy and build arbitrarily rich ones. Finally, we provide an overview of an accompanying library of scripts that support functionality for testing and networking.
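Illustrative sketch (not from the paper): the component/hook pattern the framework describes, reduced to plain Python for brevity. The actual reference implementation consists of C# component scripts for Unity, and the class and hook names below (on_init, refresh, set_visible) are hypothetical.

```python
class WidgetComponent:
    """Default hooks a managed widget responds to; override as needed."""
    def on_init(self): pass
    def refresh(self): pass
    def set_visible(self, visible: bool): pass

class Label(WidgetComponent):
    def __init__(self, text): self.text, self.visible = text, True
    def refresh(self): print(f"redraw label '{self.text}'")
    def set_visible(self, visible): self.visible = visible

class CompositeWidget(WidgetComponent):
    """Simple widgets combine into a hierarchy; hooks fan out to children."""
    def __init__(self, *children): self.children = list(children)
    def on_init(self):
        for c in self.children: c.on_init()
    def refresh(self):
        for c in self.children: c.refresh()
    def set_visible(self, visible):
        for c in self.children: c.set_visible(visible)

# Arbitrarily rich widgets are built by composing simple ones.
hud = CompositeWidget(Label("score"), Label("timer"))
hud.on_init(); hud.refresh(); hud.set_visible(False)
```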
2015
Oda, O., Elvezio, C., Sukan, M., Feiner, S., & Tversky, B., Virtual Replicas for Remote Assistance in Virtual and Augmented Reality. Proc. ACM UIST, Charlotte, NC, USA, November 8-11, 2015, 405 - 415.
In many complex tasks, a remote subject-matter expert may need to assist a local user to guide actions on objects in the local user's environment. However, effective spatial referencing and action demonstration in a remote physical environment can be challenging. We introduce two approaches that use Virtual Reality (VR) or Augmented Reality (AR) for the remote expert, and AR for the local user, each wearing a stereo head-worn display. Both approaches allow the expert to create and manipulate virtual replicas of physical objects in the local environment to refer to parts of those physical objects and to indicate actions on them. This can be especially useful for parts that are occluded or difficult to access. In one approach, the expert points in 3D to portions of virtual replicas to annotate them. In another approach, the expert demonstrates actions in 3D by manipulating virtual replicas, supported by constraints and annotations. We performed a user study of a 6DOF alignment task, a key operation in many physical task domains, comparing both approaches to an approach in which the expert uses a 2D tablet-based drawing system similar to ones developed for prior work on remote assistance. The study showed the 3D demonstration approach to be faster than the others. In addition, the 3D pointing approach was faster than the 2D tablet in the case of a highly trained expert.
Elvezio, C., Sukan, M., Feiner, S., & Tversky, B., POSTER: Interactive Visualizations for Monoscopic Eyewear to Assist in Manually Orienting Objects in 3D, Proc. ISMAR 2015 (IEEE Int. Symp. on Mixed and Augmented Reality), 180 - 181.
Assembly or repair tasks often require objects to be held in specific orientations to view or fit together. Research has addressed the use of AR to assist in these tasks, delivered as registered overlaid graphics on stereoscopic head-worn displays. In contrast, we are interested in using monoscopic head-worn displays, such as Google Glass. To accommodate their small monoscopic field of view, off center from the user's line of sight, we are exploring alternatives to registered overlays. We describe four interactive rotation guidance visualizations for tracked objects intended for these displays.
2014
Sukan, M., Elvezio, C., Oda, O., Feiner, S., & Tversky, B., ParaFrustum: Visualization Techniques for Guiding a User to a Constrained Set of Viewing Positions and Orientations, Proc. ACM UIST, 2014, 331 - 340.
Many tasks in real or virtual environments require users to view a target object or location from one of a set of strategic viewpoints to see it in context, avoid occlusions, or view it at an appropriate angle or distance. We introduce ParaFrustum, a geometric construct that represents this set of strategic viewpoints and viewing directions. ParaFrustum is inspired by the look-from and look-at points of a computer graphics camera specification, which precisely delineate a location for the camera and a direction in which it looks. We generalize this approach by defining a ParaFrustum in terms of a look-from volume and a look-at volume, which establish constraints on a range of acceptable locations for the user's eyes and a range of acceptable angles in which the user's head can be oriented. Providing tolerance in the allowable viewing positions and directions avoids burdening the user with the need to assume a tightly constrained 6DoF pose when it is not required by the task. We describe two visualization techniques for virtual or augmented reality that guide a user to assume one of the poses defined by a ParaFrustum, and present the results of a user study measuring the performance of these techniques. The study shows that the constraints of a tightly constrained ParaFrustum (e.g., approximating a conventional camera frustum) require significantly more time to satisfy than those of a loosely constrained one. The study also reveals interesting differences in participant trajectories in response to the two techniques.
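Illustrative sketch (not from the paper): one way to test whether a head pose satisfies a ParaFrustum-style constraint, approximating both the look-from and look-at volumes as spheres. The volume shapes, tolerances, and numbers are hypothetical.

```python
# Hypothetical sketch: a pose satisfies the constraint if the eye lies in the
# look-from volume and the gaze ray passes through the look-at volume.
import numpy as np

def satisfies_parafrustum(eye, gaze_dir,
                          look_from_center, look_from_radius,
                          look_at_center, look_at_radius):
    eye = np.asarray(eye, float)
    d = np.asarray(gaze_dir, float)
    d = d / np.linalg.norm(d)

    # 1) Eye position must lie inside the look-from volume.
    if np.linalg.norm(eye - look_from_center) > look_from_radius:
        return False

    # 2) The gaze ray from the eye must pass through the look-at volume:
    #    closest approach to its center must be within its radius.
    to_target = np.asarray(look_at_center, float) - eye
    t = np.dot(to_target, d)                  # closest approach along the ray
    if t < 0:                                 # target is behind the viewer
        return False
    closest = eye + t * d
    return np.linalg.norm(closest - look_at_center) <= look_at_radius

print(satisfies_parafrustum(eye=[0, 1.7, 2.0], gaze_dir=[0, -0.2, -1.0],
                            look_from_center=[0, 1.7, 2.0], look_from_radius=0.5,
                            look_at_center=[0, 1.2, -1.0], look_at_radius=0.3))
```

Loosening look_from_radius and look_at_radius corresponds to the loosely constrained ParaFrustum in the study; shrinking them toward zero approximates a conventional camera frustum pose.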
Prey, J. E., Woollen, J., Wilcox, L., Sackeim, A. D., Hripcsak, G., Bakken, S., Restaino, S., Feiner, S., & Vawdrey, D. K., Patient engagement in the inpatient setting: a systematic review. Journal of the American Medical Informatics Association, 21(4), 742 - 750.
Paper abstract to appear here.
Wilcox, L., Feiner, S., Elhadad, N., Vawdrey, D., & Tran, T. H., Patient-centered tools for medication information search. Proc. Pervasive Computing Technologies for Healthcare, 49 - 56.
Paper abstract to appear here.
2013
Oda, O., Sukan, M., Feiner, S., and Tversky, B., Poster: 3D referencing for remote task assistance in augmented reality. Proc. 3DUI 2013 (IEEE Symp. on 3D User Interfaces), Orlando, FL, March 16-17, 2013. (Runner-up for 3DUI 2013 Best Poster Award.)
We present a 3D referencing technique tailored for remote maintenance tasks in augmented reality. The goal is to improve the accuracy and efficiency with which a remote expert can point out a real physical object at a local site to a technician at that site. In a typical referencing task, the remote expert instructs the local technician to navigate to a location from which a target object can be viewed, and then to attend to that object. The expert and technician both wear head-tracked, stereo, see-through, head-worn displays, and the expert's hands are tracked by a set of depth cameras. The remote expert first selects one of a set of prerecorded viewpoints of the local site, and a representation of that viewpoint is presented to the technician to help them navigate to the correct position and orientation. The expert then uses hand gestures to indicate the target.
Smith, B. A., Yin, Q., Feiner, S. K., & Nayar, S. K., Gaze locking: Passive eye contact detection for human-object interaction, Proc. ACM UIST, 2013, 271 - 280.
Eye contact plays a crucial role in our everyday social interactions. The ability of a device to reliably detect when a person is looking at it can lead to powerful human–object interfaces. Today, most gaze-based interactive systems rely on gaze tracking technology. Unfortunately, current gaze tracking techniques require active infrared illumination or calibration, or are sensitive to distance and pose. In this work, we propose a different solution: a passive, appearance-based approach for sensing eye contact in an image. By focusing on gaze locking rather than gaze tracking, we exploit the special appearance of direct eye gaze, achieving a Matthews correlation coefficient (MCC) of over 0.83 at long distances (up to 18 m) and large pose variations (up to ±30° of head yaw rotation) using a very basic classifier and without calibration. To train our detector, we also created a large publicly available gaze data set: 5,880 images of 56 people over varying gaze directions and head poses. We demonstrate how our method facilitates human–object interaction, user analytics, image filtering, and gaze-triggered photography.
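Illustrative sketch (not from the paper): training a basic binary classifier to detect eye contact from appearance features of a cropped eye region and scoring it with the Matthews correlation coefficient. The synthetic features below stand in for the real eye-image descriptors and the published data set.

```python
# Hypothetical sketch: "gaze locking" as a binary appearance classification problem.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import matthews_corrcoef

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 64))                  # stand-in for eye-appearance features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)    # 1 = looking at the camera

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = LinearSVC().fit(X_tr, y_tr)                # a very basic classifier
print("MCC:", matthews_corrcoef(y_te, clf.predict(X_te)))
```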
Wilcox, L., Feiner, S., Elhadad, N., Vawdrey, D., & Tran, T. H., Remedy: supporting consumer-centered medication information search. In Pervasive Computing Technologies for Healthcare (PervasiveHealth), 2013, 317 - 318.
Paper abstract to appear here.
Dedual, N. J. & Feiner, S. K., Addressing Information Overload in Urban Augmented Reality Applications. In Proc. GeoHCI 2013 (CHI 2013 Workshop on Geographic Human-Computer Interaction).
Paper abstract to appear here.
2012
Lu, W., Duh, H. B.-L., and Feiner, S. Subtle cueing for visual search in augmented reality. Proc. ISMAR 2012 (IEEE Int. Symp. on Mixed and Augmented Reality), Atlanta, GA, November 5-8, 2012, 161-166.
Paper abstract to appear here.
Oda, O. and Feiner, S. 3D referencing techniques for shared augmented reality environments. Proc. ISMAR 2012 (IEEE Int. Symp. on Mixed and Augmented Reality), Atlanta, GA, November 5-8, 2012, 208 - 215.
Paper abstract to appear here.
Sukan, M., Feiner, S., Tversky, B., and Energin, S. Quick viewpoint switching for manipulating virtual objects in hand-held augmented reality using stored snapshots. Proc. ISMAR 2012 (IEEE Int. Symp. on Mixed and Augmented Reality), Atlanta, GA, November 5-8, 2012, 217-226.
Magic-lens style augmented reality applications allow users to control camera pose easily by manipulating a portable hand-held device and provide immediate visual feedback. However, strategic vantage points must often be revisited repeatedly, adding time and error and taxing memory. We describe a new approach that allows users to take snapshots of augmented scenes that can be virtually revisited at later times. The system stores still images of scenes along with camera poses, so that augmentations remain dynamic and interactive. Users can manipulate virtual objects while viewing snapshots, instead of moving to real-world views. We present a study comparing performance in snapshot and live mode conditions in a task in which a virtual object must be aligned with two pairs of physical objects. Proper alignment requires sequentially visiting two viewpoints. Participants completed the alignment task significantly faster and more accurately using snapshots than when using the live mode. Moreover, participants preferred manipulating virtual objects using snapshots to the live mode.
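Illustrative sketch (not from the paper): the minimal data a stored snapshot needs so the background stays frozen while augmentations remain live and manipulable. The field names and renderer interface below are hypothetical.

```python
# Hypothetical sketch of a snapshot record and how it would be re-rendered.
from dataclasses import dataclass
import numpy as np

@dataclass
class Snapshot:
    background: np.ndarray      # still camera image captured at snapshot time
    view_matrix: np.ndarray     # 4x4 camera pose (extrinsics) at snapshot time
    projection: np.ndarray      # 4x4 projection matrix (intrinsics)

def render_snapshot_view(snapshot, virtual_objects, renderer):
    """Re-render live, interactive augmentations over the frozen background.

    `renderer` is a placeholder for whatever drawing API the application uses.
    """
    renderer.draw_image(snapshot.background)
    for obj in virtual_objects:                 # virtual objects remain manipulable
        renderer.draw(obj, snapshot.view_matrix, snapshot.projection)
```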
Sackeim, A., Wilcox, L., Restaino, S., Stein, D., Hripcsak, G., Bakken, S., Feiner, S., and Vawdrey, D. Using an inpatient personal health record to enhance patient-provider communication. Poster. Proc. AMIA 2012 (Amer. Med. Informatics Assoc. Ann. Symp.), Chicago, IL, November 3-7, 2012.
Paper abstract to appear here.
Rosenblum, L., Feiner, S., Julier, S., Swan, J., and Livingston, M. The development of mobile augmented reality. In Dill, J., Earnshaw, R., Kasik, D., Vince, J., and Wong, P. (eds.), Expanding the Frontiers of Visual Analytics and Visualization, Springer-Verlag, Berlin, Germany, 2012, 431-448.
Paper abstract to appear here.
Baur, D., Boring, S., and Feiner, S. Virtual projection: Exploring optical projection as a metaphor for multi-device interaction. Proc. CHI 2012, Austin, TX, May 5-10, 2012, 1693-1702.
Paper abstract to appear here.
Sukan, M., Feiner, S., and Energin, S. Poster: Manipulating virtual objects in hand-held augmented reality using stored snapshots. Proc. 3DUI 2012 (IEEE Symp. on 3D User Interfaces), Orange County, CA, March 4-8, 2012, 165-166. (Recipient of 3DUI 2012 Best Poster Award.)
We describe a set of interaction techniques that allow a user of a magic-lens style augmented reality application to take snapshots of an augmented scene and revisit them virtually for interaction at a later time. By storing a still image of the background along with the camera pose, this approach allows augmentations to remain dynamic and interactive. This makes it possible for the user to manipulate virtual objects from the vantage points of different locations without the overhead of physically traveling between those locations. Preliminary results from a user study show that participants were able to complete an alignment task significantly faster and as accurately when using snapshots as opposed to physical travel. Qualitative questionnaire answers showed that participants preferred using snapshots over walking and found it less demanding.
Wilcox, L., Feiner, S., Liu, A., Collins, S., Restaino, S., and Vawdrey, D. Designing inpatient technology to meet the medication information needs of cardiology patients. Proc. IHI 2012 (ACM SIGHIT Int. Health Informatics Symp.), Miami, FL, January 28-30, 2012.
As patients are encouraged to become active participants in their own care, recent research has begun to explore the direct sharing of electronic health information with patients during hospital visits. The design of patient-facing views of clinical information is, however, a relatively recent line of inquiry. Research is needed to further understand guidelines for communicating specific types of information to hospital patients. In this work, we focus on cardiology patients' information needs related to their hospital medications. We assessed these needs to inform the design of interactive, electronic views of medication information for cardiology inpatients. We present results of in-situ interviews with 11 inpatients and 6 nurses in a cardiology step-down unit. Our findings suggest that cohesive trends in medication information needs exist across cardiology inpatients. We discuss interview results and their implications for the design of inpatient-facing information technology. We also discuss key ways in which electronic medication information, formatted for inpatient use, differs from that formatted for outpatient or transitional medication-management use.
2011
Henderson, S. and Feiner, S. Exploring the benefits of augmented reality documentation for maintenance and repair. IEEE Transactions on Visualization and Computer Graphics, 17(10), October 2011, 1355-1368.
Paper abstract to appear here.
Henderson, S. and Feiner, S. Augmented reality in the psychomotor phase of a procedural task. Proc. ISMAR 2011 (IEEE Int. Symp. on Mixed and Augmented Reality), Basel, Switzerland, October 26-29, 2011, 191-200. (Recipient of ISMAR 2011 Best Science and Technology Student Paper Award.)
Procedural tasks are common to many domains, ranging from maintenance and repair, to medicine, to the arts. We describe and evaluate a prototype augmented reality (AR) user interface designed to assist users in the relatively under-explored psychomotor phase of procedural tasks. In this phase, the user begins physical manipulations, and thus alters aspects of the underlying task environment. Our prototype tracks the user and multiple components in a typical maintenance assembly task, and provides dynamic, prescriptive, overlaid instructions on a see-through head-worn display in response to the user’s ongoing activity. A user study shows participants were able to complete psychomotor aspects of the assembly task significantly faster and with significantly greater accuracy than when using 3D-graphics–based assistance presented on a stationary LCD. Qualitative questionnaire results indicate that participants overwhelmingly preferred the AR condition, and ranked it as more intuitive than the LCD condition.
Dedual, N., Oda O. and Feiner, S. Creating hybrid user interfaces with a 2D multi-touch tabletop and a 3D see-through head-worn display. Poster. Proc. ISMAR 2011 (IEEE Int. Symp. on Mixed and Augmented Reality), Basel, Switzerland, October 26-29, 2011, 231-232.
How can multiple different display and interaction devices be used together to create an effective augmented reality environment? We explore the design of several prototype hybrid user interfaces that combine a 2D multi-touch tabletop display with a 3D head-tracked video–see-through display. We describe a simple modeling application and an urban visualization tool in which the information presented on the head-worn display supplements the information displayed on the tabletop, using a variety of approaches to track the head-worn display relative to the tabletop. In all cases, our goal is to allow users who can see only the tabletop to interact effectively with users wearing head-worn displays.
Vawdrey, D., Wilcox, L., Collins, S., Bakken, S., Feiner, S., Boyer, A., and Restaino, S. A tablet computer application for patients to participate in their hospital care. Proc. AMIA 2011 (Amer. Med. Informatics Assoc. Ann. Symp.), Washington, DC, October 22-26, 2011, 1428-1435.
Building on our institution’s commercial electronic health record and custom personal health record Web portal, we developed a tablet computer application to provide interactive information to hospital patients. Using Apple iPad devices, the prototype application was provided to five patients in a cardiology step-down unit. We conducted detailed interviews to assess patients’ knowledge of their inpatient care, as well as their perceptions of the usefulness of the application. While patients exhibited varying levels of comfort with using the tablet computer, they were highly enthusiastic about the application’s ability to supply health information such as their inpatient medication histories and photographs of their care providers. Additional research is warranted to assess the benefit such applications may have for addressing inpatient information needs, enhancing patient-provider communication and improving patient satisfaction.
Vawdrey, D., Wilcox, L., Collins, S., Feiner, S., Mamykina, O., Stein, D., Bakken, S., Fred, M., and Stetson, P. Awareness of the care team in electronic health records. Applied Clinical Informatics, 2(4), 2011, 395-405.
Objective: To support collaboration and clinician-targeted decision support, electronic health records (EHRs) must contain accurate information about patients’ care providers. The objective of this study was to evaluate two approaches for care provider identification employed within a commercial EHR at a large academic medical center.
Methods: We performed a retrospective review of EHR data for 121 patients in two cardiology wards during a four-week period. System audit logs of chart accesses were analyzed to identify the clinicians who were likely participating in the patients’ hospital care. The audit log data were compared with two functions in the EHR for documenting care team membership: 1) a vendor-supplied module called “Care Providers”, and 2) a custom “Designate Provider” order that was created primarily to improve accuracy of the attending physician of record documentation.
Results: For patients with a 3–5 day hospital stay, an average of 30.8 clinicians accessed the electronic chart, including 10.2 nurses, 1.4 attending physicians, 2.3 residents, and 5.4 physician assistants. The Care Providers module identified 2.7 clinicians/patient (1.8 attending physicians and 0.9 nurses). The Designate Provider order identified 2.1 clinicians/patient (1.1 attending physicians, 0.2 resident physicians, and 0.8 physician assistants). Information about other members of patients’ care teams (social workers, dietitians, pharmacists, etc.) was absent.
Conclusions: The two methods for specifying care team information failed to identify numerous individuals involved in patients’ care, suggesting that commercial EHRs may not provide adequate tools for care team designation. Improvements to EHR tools could foster greater collaboration among care teams and reduce communication-related risks to patient safety.
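Illustrative sketch (not from the paper): the kind of audit-log tabulation behind the figures above, counting distinct chart-accessing clinicians per patient and role. The column names and rows are hypothetical.

```python
# Hypothetical sketch: distinct clinicians per patient, by role, from an audit log.
import pandas as pd

audit = pd.DataFrame({
    "patient_id":   [1, 1, 1, 2, 2],
    "clinician_id": ["n01", "n02", "md7", "n01", "pa3"],
    "role":         ["nurse", "nurse", "attending", "nurse", "physician assistant"],
})

per_patient_role = (audit.drop_duplicates(["patient_id", "clinician_id"])
                         .groupby(["patient_id", "role"])["clinician_id"]
                         .nunique())
print(per_patient_role)
print("mean clinicians/patient:",
      audit.groupby("patient_id")["clinician_id"].nunique().mean())
```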
Wilcox, L., Morris, D., Gatewood, J., Tan, D., and Horvitz, E. Characterizing patient-friendly 'micro-explanations' of medical events. Proc. CHI 2011, Vancouver, BC, Canada, May 7-12, 2011, 29-32.
Patients’ basic understanding of clinical events has been shown to dramatically improve patient care. We propose that the automatic generation of very short micro-explanations, suitable for real-time delivery in clinical settings, can transform patient care by giving patients greater awareness of key events in their electronic medical record. We present results of a survey study indicating that it may be possible to automatically generate such explanations by extracting individual sentences from consumer-facing Web pages. We further inform future work by characterizing physician and non-physician responses to a variety of Web-extracted explanations of medical lab tests.
Veas, E., Mendez, E., Feiner, S., and Schmalstieg, D. Directing attention and influencing memory with visual saliency modulation. Proc. CHI 2011, Vancouver, BC, Canada, May 7-12, 2011, 1471-1480.
Paper abstract to appear here.
von Kapri, A., Rick, T., and Feiner, S. Comparing steering-based travel techniques for search tasks in a CAVE. Proc. IEEE Virtual Reality 2011, Singapore, March 19-23, 2011, 91-94.
Paper abstract to appear here.
White, S. and Feiner, S. Dynamic, abstract representations of audio in a mobile augmented reality conferencing system. In Alem, L. and Huang, W. (eds), Recent Trends of Mobile Collaborative Augmented Reality Systems, Springer Verlag, 2011, 149-161.
Paper abstract to appear here.
2010
Wilcox, L., Gatewood, J., Morris, D., Tan, D., Feiner, S., and Horvitz, E. Physician attitudes about patient-facing information displays at an urban emergency department. Proc. AMIA 2010 (Amer. Med. Informatics Assoc. Ann. Symp), Washington, DC, November 13-17, 2010, 887-891.
Hospital information systems have primarily been designed to support physicians and administrators, though recent research has explored the value of patient-facing information displays. Electronic systems can be designed to provide tailored information to patients on their health, their care teams, the status of their hospital stays, and their expected care plans. However, this direct delivery of information from database to patient represents a fundamental change to the traditional flow of clinical information. We therefore explore physician attitudes toward a proposed patient-facing display of information abstracted from a hospital EHR, in the context of an urban emergency department. We find that physicians generally support direct delivery of electronic information to patients, and uncover important concerns to consider in the design of patient-facing information systems.
Kruijff, E., Swan, E., and Feiner, S. Perceptual issues in augmented reality revisited. Proc. ISMAR 2010 (IEEE Int. Symp. on Mixed and Augmented Reality), Seoul, Korea, October 13-16, 2010, 3-12.
Paper abstract to appear here.
Sukan, M. and Feiner, S. SnapAR: Storing snapshots for quick viewpoint switching in hand-held augmented reality. Poster. Proc. ISMAR 2010 (IEEE Int. Symp. on Mixed and Augmented Reality), Seoul, Korea, October 13-16, 2010, 273-274.
Paper abstract goes here.
White, S. and Feiner, S. Dynamic, abstract representations of audio in a mobile augmented reality conferencing system. Int. Workshop on Mobile Collaborative Augmented Reality at IEEE Int. Symp. on Mixed and Augmented Reality, Seoul, Korea, October 13, 2010.
Paper abstract goes here.
Sukan, M., Oda, O., Shi, X., Entrena, M., Qi, J., Sadalgi, S., and Feiner, S. ARmonica: A collaborative sonic environment. Adjunct Proc. of UIST 2010, New York, NY, October 3-6, 2010, 401-402
Paper abstract goes here.
Mendez, E., Feiner, S., and Schmalstieg, D. Focus and context in mixed reality by modulating first order salient features. Proc. Smart Graphics 2010 (10th Int. Symp. on Smart Graphics), June 24-26, 2010, Banff, Canada. (In R. Taylor et al., eds., Lecture Notes in Computer Science, Springer, 2010, Vol. 6133), 232-243.
Paper abstract goes here.
Wilcox, L., Lu, J., Lai, J., Feiner, S., and Jordan, D. Physician-driven management of patient progress notes in an intensive care unit. Proc. CHI 2010, Atlanta, GA, April 10-15, 2010, 1879-1888.
We describe fieldwork in which we studied hospital ICU physicians and their strategies and documentation aids for composing patient progress notes. We then present a clinical documentation prototype, activeNotes, that supports the creation of these notes, using techniques designed based on our fieldwork. ActiveNotes integrates automated, context-sensitive patient data retrieval, and user control of automated data updates and alerts via tagging, into the documentation process. We performed a qualitative study of activeNotes with 15 physicians at the hospital to explore the utility of our information retrieval and tagging techniques. The physicians indicated their desire to use tags for a number of purposes, some of them extensions to what we intended, and others new to us and unexplored in other systems of which we are aware. We discuss the physicians’ responses to our prototype and distill several of their proposed uses of tags: to assist in note content management, communication with other clinicians, and care delivery.
Wilcox, L., Morris, D., Tan, D., and Gatewood, J. Designing Patient-Centric Information Displays for Hospitals. Proc. CHI 2010, Atlanta, GA, April 10-15, 2010, 2123-2132.
Electronic medical records are increasingly comprehensive, and this vast repository of information has already contributed to medical efficiency and hospital procedure. However, this information is not typically accessible to patients, who are frequently under-informed and unclear about their own hospital courses. In this paper, we propose a design for in-room, patient-centric information displays, based on iterative design with physicians. We use this as the basis for a Wizard-of-Oz study in an emergency department, to assess patient and provider responses to in-room information displays. 18 patients were presented with real-time information displays based on their medical records. Semi-structured interviews with patients, family members, and hospital staff reveal that subjective response to in-room displays was overwhelmingly positive, and through these interviews we elicited guidelines regarding specific information types, privacy, use cases, and information presentation techniques. We describe these findings, and we discuss the feasibility of a fully-automatic implementation of our design.
Mamykina, L., Wilcox, L., Vawdrey, D., Stein, D., Collins, S., Camargo, S., Fred, M., Hripcsak, G., and Feiner, S. Designing for Adoption: A Living Laboratory for Health IT. Proc. ACM WISH 2010 (Workshop on Interactive Systems in Healthcare), Atlanta, GA, April 11, 2010, 181-184.
We describe how an interdisciplinary collaboration has created a “living laboratory” in which researchers maintain a direct and ongoing loop between innovation and production and study true adoption of technology in real world settings. The collaborators include the Department of Biomedical Informatics and the Department of Computer Science at Columbia University, industrial partners developing commercial health IT applications, and New York-Presbyterian Hospital’s Columbia University Medical Center. In this paper we discuss our current projects, and mention some of the unique benefits and challenges of building a living laboratory for health information technology.
White, S. and Feiner, S. Exploring interfaces to botanical species classification. Demo. CHI 2010 Extended Abstracts, Atlanta, GA, April 10-15, 2010, 3051-3055.
Oda, O. and Feiner, S. Rolling and shooting: Two augmented reality games. Demo. CHI 2010 Extended Abstracts, Atlanta, GA, April 10-15, 2010, 3041-3044.
St-Aubin B., Mostafavi M., Roche S., Dedual N., A 3D Collaborative Geospatial Augmented Reality System for Urban Design and Planning Purposes, Canadian Geomatics Conference 2010, Apr 2010.
In typical urban planning and design scenarios, good collaboration between implicated participants is fundamental but not necessarily easy to achieve. This can be due to a variety of factors ranging from the different backgrounds of the people involved to the complexity of the tools used by the experts. Furthermore, the collaborative aspect of urban planning and design is often limited to the meetings where designs and plans are assembled and discussed. This paper presents a system we designed to resolve those problems through a collaborative geospatial augmented reality application acting as an intuitive multi-user interactive scale model aiming to facilitate decision-making in urban planning and design projects. The proposed system integrates innovative technologies of geospatial augmented reality with tools for 3D modelling and spatial analysis in order to establish a collaborative augmented reality based urban planning and design system. Early results from our experimentations demonstrate the interesting potential of such a system for interactive and collaborative urban planning and design. The paper presents a comparison of the proposed approach with the existing ones and proposes further development as future work.
Mendez, E., Schmalstieg, D., and Feiner, S. Experiences on attention direction through manipulation of salient features. Proc. PIVE 2010 (2nd IEEE Virtual Reality Workshop on Perceptual Illusions in Virtual Environments), Waltham, MA, March 21, 2010.
Paper abstract to appear here.
Henderson, S. and Feiner, S. Opportunistic tangible user interfaces for augmented reality. IEEE Transactions on Visualization and Computer Graphics, 16(1), January/February 2010, 4-16.
Paper abstract to appear here.
Lonsdale H., Jensen C., Wynn E., Dedual N., Cutting and Pasting Up: 'Documents' and Provenance in a Complex Work Environment, Hawaii International Conference on System Sciences (HICSS), Jan 2010.
This paper explores how historical models of documents as stable information artifacts should be replaced with a new model of information objects that exist around and between document boundaries. The new model is information-centered; files and documents are seen as snapshots in time, part of individual and group information flows. The flows are versioned across multiple documents and applications. This model is based on new fine-grained tracking and analysis capabilities derived from machine learning research. Using these capabilities, we outline a view of documents based on results from an experiment that tracked the activities of 17 information workers doing their regular work over 8 weeks. The research supports certain postmodern theories of work, specifically the notion of "pasting up." The construct of provenance describes information flows and networks and is the core theoretical base of the paper. The research had two goals: to understand what users do moment to moment and to provide insight for information management. There are further implications for information design in managing multi-tasking workloads, for methods of studying computer-based work, and for an updated desktop user interface.
2009
Henderson, S. and Feiner, S. Evaluating the benefits of augmented reality for task localization in maintenance of an armored personnel carrier turret. Proc. ISMAR 2009 (IEEE Int. Symp. on Mixed and Augmented Reality), Orlando, FL, October 19-22, 2009, 135-144. (Recipient of ISMAR 2009 Best Paper Award.)
Paper abstract goes here.
White, S., Feng, D., and Feiner, S. Interaction and presentation techniques for shake menus in tangible augmented reality. Proc. ISMAR 2009 (IEEE Int. Symp. on Mixed and Augmented Reality), Orlando, FL, October 19-22, 2009, 39-48.
Paper abstract goes here.
Oda, O. and Feiner, S. Interference avoidance in multi-user hand-held augmented reality. Proc. ISMAR 2009 (IEEE Int. Symp. on Mixed and Augmented Reality), Orlando, FL, October 19-22, 2009, 13-22.
Paper abstract goes here.
Tokusho, Y. and Feiner, S. Prototyping an outdoor mobile augmented reality street view application. Workshop: Let's Go Out: Research in Outdoor Mixed and Augmented Reality, ISMAR 2009 (IEEE Int. Symp. on Mixed and Augmented Reality), Orlando, FL, October 19, 2009.
Paper abstract goes here.
Holz, C. and Feiner, S. Relaxed selection techniques for querying time-series graphs. Proc. UIST 2009 (ACM Symposium on User Interface Software and Technology), Victoria, BC, Canada, October 4-7, 2009, 213-222.
Paper abstract goes here.
White, S. and Feiner, S. SiteLens: Situated visualization techniques for urban site visits. Proc. CHI 2009, Boston, MA, April 4-9, 2009, 1117-1120.
Paper abstract goes here.
Wilcox, L., Lu, J., Lai, J., Feiner, S., and Jordan, D. activeNotes: Computer-assisted creation of patient progress notes. CHI 2009 Extended Abstracts, Boston, MA, April 4-9, 2009, 3323-3328.
We present activeNotes, a prototype application that supports the creation of Critical Care Notes by physicians in a hospital intensive care unit. activeNotes integrates automated, context-sensitive patient data retrieval and user control of automated data updates and alerts into the note-creation process. In a user study at New York Presbyterian Hospital, we gathered qualitative feedback on the prototype from 15 physicians. The physicians found activeNotes to be valuable and said they would use it to create both formal notes for medical records and informal notes. One surprising finding is that while physicians have rejected template-based clinical documentation systems in the past, they expressed a desire to use activeNotes to create personalized, physician-specific note templates to be reused with a given patient, or for a given condition.
White, S., Feng, D., and Feiner, S. Poster: Shake menus: Towards activation and placement techniques for prop-based 3D graphical menus. Proc. 3DUI 2009 (Fourth IEEE Symp. on 3D User Interfaces), Lafayette, LA, March 14-15, 2009, 129-130.
Paper abstract goes here.
Olwal, A. and Feiner, S. Spatially aware handhelds for high-precision tangible interaction with large displays. Proc. TEI 09 (Third Int. Conf. on Tangible and Embedded Interaction), Cambridge, UK, February 16-18, 2009, 181-188.
Paper abstract goes here.
2008
Henderson, S. and Feiner, S. Opportunistic controls: Leveraging natural affordances as tangible user interfaces for augmented reality. Proc. ACM VRST 2008 (ACM Symp. on Virtual Reality Software and Technology), Bordeaux, France, October 27-29, 2008, 211-218. (Recipient of VRST 2008 Best Paper Award.)
Paper abstract to appear here.
Belhumeur, P., Chen, D., Feiner, S., Jacobs, D., Kress, W., Ling, H., Lopez, I., Ramamoorthi, R., Sheorey, S., White, S., and Zhang, L. Searching the world's herbaria: A system for visual identification of plant species. Proc. ECCV 2008 (European Conf. on Computer Vision), Part IV, LNCS 5305, Marseille, France, October 12-18, 2008, 116-129.
Paper abstract to appear here.
Murphy, C., Sheth, S., Kaiser, G., and Wilcox, L. genSpace: Exploring Social Networking Metaphors for Knowledge Sharing and Scientific Collaborative Work. In Proc. ACM/IEEE SoSEA 2008, L'Aquila, Italy, September 2008, 29-36.
Many collaborative applications, especially in scientific research, focus only on the sharing of tools or the sharing of data. We seek to introduce an approach to scientific collaboration that is based on the sharing of knowledge. We do this by automatically building organizational memory and enabling knowledge sharing by observing what users do with a particular tool or set of tools in the domain, through the addition of activity and usage monitoring facilities to standalone applications. Once this knowledge has been gathered, we apply social networking models to provide collaborative features to users, such as suggestions on tools to use, and automatically-generated sequences of actions based on past usage amongst the members of a social network or the entire community. In this work, we investigate social networking models as an approach to scientific knowledge sharing, and present an implementation called genSpace, which is built as an extension to the geWorkbench platform for computational biologists. Last, we discuss the approach from the viewpoint of social software engineering.
Olwal, A., Feiner, S., and Heyman, S. Rubbing and tapping for precise and rapid selection on touch-screen displays. Proc. ACM CHI 2008, Florence, Italy, April 5-10, 2008, 295-304. (Recipient of CHI 2008 Best Paper Honorable Mention.)
Paper abstract to appear here.
Oda, O., Lister, L., White, S., and Feiner, S. Developing an augmented reality racing game. Proc. INTETAIN 2008 (Int. Conf. on Intelligent Technologies for Interactive Entertainment), Playa del Carmen, Mexico, January 8-10, 2008.
Paper abstract to appear here.
Henderson, S. and Feiner, S. Mixed and augmented reality for training. In Schmorrow, D., Cohen, J., and Nicholson, D. (eds.), The PSI Handbook of Virtual Environments for Training and Education, vol. 2. Praeger Security International, Westport, CT, 2008, 135-156.
Paper abstract to appear here.
2007
White, S., Lister, L., and Feiner, S. Visual hints for tangible gestures in augmented reality. Proc. ISMAR 2007 (IEEE and ACM Int. Symp. on Mixed and Augmented Reality), Nara, Japan, November 13-16, 2007, 47-50.
Paper abstract to appear here.
White, S., Morozov, P., and Feiner, S. Site visit by situated visualization. In Calabrese, F. et al., Urban computing and mobile devices, IEEE Pervasive Computing, 6(3), July-September 2007, 52-57.
Paper abstract to appear here.
Benko, H. and Feiner, S. Pointer warping in heterogeneous multi-monitor environments. Proc. Graphics Interface 2007, Montreal, Canada, May 28-30, 2007, 111-117.
Paper abstract to appear here.
White, S., Marino, D., and Feiner, S. Designing a mobile user interface for automated species identification. Proc. ACM CHI 2007, San Jose, CA, April 28-May 3, 2007, 291-294. (Recipient of CHI 2007 Best Note Award.)
Paper abstract to appear here.
Ishak, E. and Feiner, S. Content-aware layout. ACM CHI 2007 Extended Abstracts, San Jose, CA, April 28-May 3, 2007, 2459-2464.
Paper abstract to appear here.
Benko, H. and Feiner, S. Balloon selection: A multi-finger technique for accurate low-fatigue 3D selection. Proc. 3DUI 2007 (Second IEEE Symp. on 3D User Interfaces), Charlotte, NC, March 10-11, 2007, 79-86.
Paper abstract to appear here.
Yee, B., Sturman, D., and Feiner, S. Integrating video game development experience in an academic framework. Proc. Microsoft Academic Days on Game Development in Computer Science Education, February 22-25, 2007, 28-32.
Paper abstract to appear here.
Bell, B. and Feiner, S. Representing and processing display screen space in augmented reality. In Haller, M., Billinghurst, M., and Thomas, B. (eds.), Emerging Technologies of Augmented Reality: Interfaces and Design, Idea Group, Hershey, PA, 2007, 110-136.
Paper abstract to appear here.
2006
Guven, S. and Feiner, S. Visualizing and navigating complex situated hypermedia in augmented and virtual reality. Proc. ISMAR 2006 (IEEE and ACM Int. Symp. on Mixed and Augmented Reality), Santa Barbara, CA, October 22-25, 2006, 155-158.
Paper abstract to appear here.
Guven, S., Feiner, S., and Oda, O. Mobile augmented reality interaction techniques for authoring situated media on-site. Poster. Proc. ISMAR 2006 (IEEE and ACM Int. Symp. on Mixed and Augmented Reality), Santa Barbara, CA, October 22-25, 2006, 235-236.
Paper abstract to appear here.
Ishak, E. and Feiner, S. Content-aware scrolling. Proc. UIST 2006 (ACM Symposium on User Interface Software and Technology), Montreux, Switzerland, October 15-18, 2006, 155-158.
Paper abstract to appear here.
White, S., Marino, D., and Feiner, S. LeafView: A user interface for automated botanical species identification and data collection. Poster. Adjunct Proc. of ACM UIST 2006, Montreux, Switzerland, October 15-18, 2006, 101-102.
Paper abstract to appear here.
Blasko, G. and Feiner, S. Evaluation of an eyes-free cursorless numeric entry system for wearable computers. Proc. ISWC 2006 (IEEE Int. Symp. on Wearable Computers), Montreux, Switzerland, October 11-14, 2006, 21-28. (Nominee for IEEE ISWC 2006 Best Paper Award.)
Paper abstract to appear here.
Agarwal, G., Belhumeur, P., Feiner, S., Jacobs, D., Kress, W.J., Ramamoorthi, R., Bourg, N., Dixit, N., Ling, H., Mahajan, D., Russell, R., Shirdhonkar, S., Sunkavalli, K., and White, S. First steps toward an electronic field guide for plants. Taxon, 55(3), August 2006, 597-610.
Paper abstract to appear here.
Blasko, G., Narayanaswami, C., and Feiner, S. Prototyping retractable string-based interaction techniques for dual-display mobile devices. Proc. ACM CHI 2006, Montreal, Canada, April 22-27, 2006, 369-372.
Paper abstract to appear here.
Livingston, M., Lederer, A., Ellis, S., White, S., and Feiner, S. Virtual vergence calibration for augmented reality displays. Poster. Proc. IEEE Virtual Reality 2006, Alexandria, VA, March 25-29, 2006, 287-288.
Paper abstract to appear here.
White, S., Feiner, S., and Kopylec, J. Virtual vouchers: Prototyping a mobile augmented reality user interface for botanical species identification. Proc. 3DUI 2006 (First IEEE Symp. on 3D User Interfaces), Alexandria, VA, March 25-26, 2006, 119-126 & 181.
Paper abstract to appear here.
Guven, S. and Feiner, S. Interaction techniques for exploring historic sites through situated media. Proc. 3DUI 2006 (First IEEE Symp. on 3D User Interfaces), Alexandria, VA, March 25-26, 2006 111-118 & 180.
Paper abstract to appear here.
2005
Güven, S., Podlaseck, M., and Pingali, G. PICASSO: Pervasive Information Chronicling, Access, Search and Sharing in Organizations. International Conference on Pervasive Computing and Communications 2005. Kauai Island, Hawaii, March 8-12, 2005, p. 341-350.
Paper abstract to appear here.
Olivier, P. and Feiner, S. Editorial: Special issue on language, speech and gesture for VR. Virtual Reality, 8(4), 2005, 199-200.
Paper abstract to appear here.
Olwal, A. and Feiner, S. Using Prosodic Features of Speech and Audio Localization in Graphical User Interfaces. IUI 2005. San Diego, CA, Jan 9-12, 2005, p. 284-286
Paper abstract to appear here.
Blasko, G., Coriand, F., and Feiner, S. Exploring interaction with a simulated wrist-worn projection display. Proc. ISWC 2005 (IEEE Int. Symp. on Wearable Computers), Osaka, Japan, October 18-21, 2005, 2-9.
Paper abstract to appear here.
Sandor, C., Olwal, A., Bell, B., and Feiner, S. Immersive mixed-reality configuration of hybrid user interfaces. Proc. ISMAR 2005 (IEEE and ACM Int. Symp. on Mixed and Augmented Reality), Vienna, Austria, October 5-8, 2005, 110-113.
Paper abstract to appear here.
Yao, Y., Cheng, G., Rajurkar, K., Kovacevic, R., Feiner, S., and Zhang, W. Combined research and curriculum development of nontraditional manufacturing. European Journal of Engineering Education, 30(3), September 2005, 363-376.
Paper abstract to appear here.
Blasko, G. and Feiner, S. Input devices and interaction techniques to minimize visual feedback requirements in augmented and virtual reality. Proc. HCI International 2005 (11th Int. Conf. on Human-Computer Interaction), Las Vegas, NV, July 22-27, 2005.
Paper abstract to appear here.
Yao, Y., Cheng, G., Feiner, S., Zhang, W., Rajurkar, K., and Kovacevic, R. A web-based curriculum development on nontraditional manufacturing with interactive features. International Journal of Engineering Education, 21(3), 2005, 546-554.
Paper abstract to appear here.
Benko, H. and Feiner, S. Multi-monitor mouse. ACM CHI 2005 Extended Abstracts, Portland, Oregon, April 2-7, 2005, 1208-1211.
Paper abstract to appear here.
Bell, B., Feiner, S., and Hollerer, T. Maintaining visibility constraints for view management in 3D user interfaces. In Stock, O. and Zancanaro, M. (eds.), Multimodal Intelligent Information Presentation (Text, Speech and Language Technology, Vol. 27), Dordrecht, The Netherlands, Springer, 2005, 255-277.
Paper abstract to appear here.
Benko, H., Ishak, E., and Feiner, S. Cross-Dimensional Gestural Interaction Techniques for Hybrid Immersive Environments. VR 2005. March 10-12, 2005, p. 209-216.
Paper abstract to appear here.
2004
Höllerer, T. and Feiner, S. Mobile augmented reality. In Karimi, H. and Hammad, A. (eds.), Telegeoinformatics: Location-Based Computing and Services, Taylor and Francis, CRC Press, 2004, 221-260.
Paper abstract to appear here.
Goose, S., Güven, S., Zhang, X., Sudarsky, S., and Navab, N. PARIS: Fusing Vision-based Location Tracking with Standards-based 3D Visualization and Speech Interaction on a PDA. International Conference on Distributed Multimedia Systems 2004. San Francisco, CA, USA, Sept 8-10, 2004, p. 75-80.
Lok S., Feiner S., Ngai G., Evaluation of Visual Balance for Automated Layout. IUI 2004. Madeira, Funchal, Portugal, Jan 13-16, 2004, p. 101-108.
Paper abstract to appear here.
Eaddy M., Blaskó G., Babcock J., Feiner S., My Own Private Kiosk: Privacy-Preserving Public Displays. ISWC 2004. Arlington, VA, USA Oct 31-Nov 3, 2004, p.132-135
Paper abstract to appear here.
Blaskó G., Beaver W., Kamvar M., Feiner S., Workplane-Orientation Sensing Techniques for Tablet PCs (poster). UIST 2004. Santa Fe, NM, USA 24-27 October, 2004, p.1-2
Paper abstract to appear here.
Blaskó G., Feiner S., Single-Handed Interaction Techniques for Multiple Pressure-Sensitive Strips. CHI 2004. Vienna, Austria, April 24-29, 2004, p.1461-1464
Paper abstract to appear here.
Hallaway D., Höllerer T., Feiner S., Bridging the Gaps: Hybrid Tracking for Adaptive Mobile Augmented Reality. AAI-AIMS 2004. Applied Artificial Intelligence, Special Edition on Artificial Intelligence in Mobile Systems, vol. 18(6), July, 2004.
Paper abstract to appear here.
Sandor C., Bell B., Olwal A., Temiyabutr N., Feiner S. Visual End User Configuration of Hybrid User Interfaces. ACM SIGMM 2004 Workshop on Effective Telepresence 2004. New York, NY, USA, Oct 15, 2004, p. 67-68
Paper abstract to appear here.
Olwal A., Feiner S., Unit: Modular Development of Distributed Interaction Techniques for Highly Interactive User Interfaces. GRAPHITE 2004. Singapore, Singapore, June 15-18, 2004, p. 131-138
Paper abstract to appear here.
Allen P., Feiner S., Meskell L., Ross K., Troccoli A., Smith B., Benko H., Ishak E., and Conlon J., Digitally Modeling, Visualizing and Preserving Archaeological Sites (poster). JCDL 2004. Tucson, AZ, USA, June 7-11, 2004, p.389
Paper abstract to appear here.
Allen P., Feiner S., Troccoli A., Benko H., Ishak E., and Smith B., Seeing into the Past: Creating a 3D Modeling Pipeline for Archaeological Visualization. 3DPVT 2004. p.751-758
Paper abstract to appear here.
Benko H., Ishak E., Feiner S., Collaborative Mixed Reality Visualization of an Archaeological Excavation, ISMAR 2004. Nov 2-5, 2004 p.132-140
Paper abstract to appear here.
Blaskó G., Feiner S., An Interaction System for Watch Computers Using Tactile Guidance and Bidirectional Segmented Strokes. ISWC 2004. Arlington, VA, USA Oct 31-Nov 3, 2004, p.120-123
Paper abstract to appear here.
Ishak E., Feiner S., Interacting with Hidden Content Using Content-Aware Free-Space Transparency. UIST 2004. Santa Fe, NM, USA, Oct 24-27, 2004, p.189-192
Paper abstract to appear here.
2003
Güven S., Feiner S., A Hypermedia Authoring Tool for Augmented and Virtual Reality. New Review of Hypermedia and Multimedia 2003. Special Issue on Hypermedia Beyond the Desktop, Taylor and Francis Group Publishing, London, United Kingdom, 2004, Vol. 9, p. 89-116
Paper abstract to appear here.
Güven S., Feiner S., Authoring 3D Hypermedia for Wearable Augmented and Virtual Reality. ISWC 2003. White Plains, NY, October 21-23, p. 118-126.
Paper abstract to appear here.
Lok S., Kan M., Employing Natural Language Summarization and Automated Layout for Effective Presentation and Navigation of Information Retrieval Results. Proceedings of the 12th International World Wide Web Conference 2003.
Paper abstract to appear here.
Blaskó G., Feiner S., An Extended Menu Navigation Interface Using Multiple Pressure-Sensitive Strips (poster). ISWC 2003. White Plains, NY, USA, 21-23 October, 2003. p.128-129.
Paper abstract to appear here.
Ishak E., Feiner S., Free-Space Transparency: Exposing Hidden Content Through Unimportant Screen Space. UIST 2003. (Conference Supplement), Vancouver, BC, November 2-5.
Paper abstract to appear here.
Kaiser E., Olwal A., McGee D., Benko H., Corradini A., Li X., Feiner S., and Cohen P., Mutual Disambiguation of 3D Multimodal Interaction in Augmented and Virtual Reality. ICMI 2003. Vancouver, BC. November 5-7, p. 12-19.
Paper abstract to appear here.
Hallaway D., Höllerer T., Feiner S., Coarse, Inexpensive, Infrared Tracking for Wearable Computing. ISWC 2003. White Plains, NY, October 21-23, pp. 69-78
Paper abstract to appear here.
Olwal A., Feiner S., The Flexible Pointer: An Interaction Technique for Augmented and Virtual Reality. UIST 2003. (Conference Supplement), Vancouver, BC, November 2-5, p. 81-82.
Paper abstract to appear here.
Olwal A., Feiner S., Rubbing the Fisheye: Precise Touch-Screen Interaction with Gestures and Fisheye Views. UIST 2003. (Conference Supplement), Vancouver, BC, November 2-5, p. 83-84.
Paper abstract to appear here.
Olwal A., Benko H., Feiner S. SenseShapes: Using Statistical Geometry for Object Selection in a Multimodal Augmented Reality System. ISMAR 2003. Tokyo, Japan, October 7-10, p. 300-301.
Paper abstract to appear here.
2002
Lok S., Feiner S., Chiong W., Hirsch Y., A Graphical User Interface Toolkit Approach to Thin Client Computing. Proceedings of the 11th International World Wide Web Conference 2002.
Paper abstract to appear here.
Lok S., Feiner S., The AIL Automated Interface Layout System. Proceedings of the Conference on Intelligent User Interfaces 2002.
Paper abstract to appear here.
Blaskó G., Feiner S., A Menu Interface for Wearable Computing. ISWC 2002. Seattle, WA, USA, 7-10 October, 2002. p.164-165.
Paper abstract to appear here.
Bell B., Höllerer T., Feiner S., An Annotated Situation-Awareness Aid for Augmented Reality. UIST 2002. Paris, France.
Paper abstract to appear here.
Bell B., Feiner S., Höllerer T., Information at a Glance. Computer Graphics and Applications, IEEE 2002. vol. 22, pp. 6-9.
Paper abstract to appear here.
2001
Bell B., Feiner S., Höllerer T., View Management for Virtual and Augmented Reality. UIST 2001. (ACM Symp. on User Interface Software and Technology), Orlando, FL. November 11-14, 2001. p. 101-110.
Paper abstract to appear here.
Broll W., Schäfer L., Höllerer T., Bowman D., Interface with angels: The future of VR and AR interfaces. IEEE 2001. IEEE Computer Graphics and Applications, 21(6):14-17, November/December 2001.
Paper abstract to appear here.
Höllerer T., Hallaway D., Tinna N., Feiner S., Steps toward accommodating variable position tracking accuracy in a mobile augmented reality system. AIMS 2001. In 2nd Int. Worksh. on Artificial Intelligence in Mobile Systems (AIMS '01), pages 31-37, 2001.
Paper abstract to appear here.
Höllerer T., Feiner S., Hallaway D., Bell B., Lanzagorta M., Brown D., Julier S., Baillot Y., and Rosenblum L. User interface management techniques for collaborative mobile augmented reality. Computers and Graphics 2001. 25(5):799-810, October 2001
Paper abstract to appear here.
Lok S., Feiner S., A Survey of Automated Layout Techniques for Information Presentations. Proceedings of the SmartGraphics Symposium 2001.
Paper abstract to appear here.
2000
Bell B., Feiner S. Dynamic Space Management for User Interfaces. UIST 2000. (ACM Symp. on User Interface Software and Technology), San Diego, CA. November 5-8, 2000. p. 238-248.
Paper abstract to appear here.
1999
Feiner, S. The Importance of Being Mobile: Some Social Consequences of Wearable Augmented Reality Systems. IWAR 1999. (Int. Workshop on Augmented Reality), San Francisco, CA, October 20-21, 1999, pp. 145-148.
Paper abstract to appear here.
Butz A., Höllerer T., Feiner S., MacIntyre B., Beshers C. Enveloping Users and Computers in a Collaborative 3D Augmented Reality. IWAR 1999. (Int. Workshop on Augmented Reality), San Francisco, CA, October 20-21, 1999, pp. 35-44.
Paper abstract to appear here.
Höllerer T., Feiner S., Terauchi T., Rashid G., Hallaway D. Exploring MARS: Developing indoor and outdoor user interfaces to a mobile augmented reality system. Computers and Graphics 1999. 23(6):779-785
Paper abstract to appear here.
Höllerer T., Feiner S., Pavlik J.. Situated Documentaries: Embedding Multimedia Presentations in the Real World. ISWC 1999. (Third Int. Symp. on Wearable Computers), San Francisco, CA, October 18-19, 1999, pp. 79-86
Paper abstract to appear here.
1997
Feiner S., MacIntyre B., Höllerer T., and Webster A. A touring machine: Prototyping 3D mobile augmented reality systems for exploring the urban environment. ISWC 1997. (Int. Symp. on Wearable Computers), October 13-14, 1997, Cambridge, MA.
Paper abstract to appear here.
1996
Seligmann D., Feiner S., MacIntyre B., Massie W., Krueger T., Augmented reality in architectural construction, inspection and renovation, ASCE 1996. Proc. ASCE Third Congress on Computing in Civil Engineering, Anaheim, CA, June 17-19, 1996, 913-919.
Paper abstract to appear here.
MacIntyre B., Feiner S., Language-level support for exploratory programming of distributed virtual environments, UIST 1996. In Proc. UIST '96 (ACM Symp. on User Interface Software and Technology), Seattle, WA, November 6-8, 1996, 83-95.
Paper abstract to appear here.
MacIntyre B., Feiner S. Future Multimedia User Interfaces, Multimedia Systems 1996. 4(5)
Paper abstract to appear here.
1995
Feiner S., Webster A., Krueger T., MacIntyre B., Keller E., Architectural anatomy, Presence 1995. In Presence, 4(3), Summer 1995, 318-325.
Paper abstract to appear here.
Crutcher L., Lazar A., Feiner S., Zhou M., Managing networks through a virtual world, IEEE 1995. IEEE Parallel and Distributed Technology, 3(2), Summer 1995, 4-13.
Paper abstract to appear here.
1993
Feiner S., MacIntyre B., Haupt M., Solomon E., Windows on the world: 2D windows for 3D augmented reality. UIST 1993. Proc. UIST '93 (ACM Symp. on User Interface Software and Technology), Atlanta, GA, November 3-5, 1993, 145-155.
Paper abstract to appear here.
Feiner S., MacIntyre B., Seligmann D., Knowledge-based augmented reality. Communications of the ACM 1993. 36(7), July 1993, 52-62.
Paper abstract to appear here.
Beshers C., Feiner S., AutoVisual: Rule-based design of interactive multivariate visualizations. IEEE 1993. IEEE Computer Graphics and Applications, 13(4), July 1993, 41-49.
Paper abstract to appear here.
Karp P., Feiner S., Automated presentation planning of animation using task decomposition with heuristic reasoning. Graphics Interface 1993. Toronto, Canada, May 17-21, 1993, 118-127.
Paper abstract to appear here.
1991
Feiner S., McKeown K. Automating the generation of coordinated multimedia explanations. IEEE 1991. In IEEE Computer, 24(10), October 1991, 33-41
Paper abstract to appear here.
Feiner S., Shamash A., Hybrid user interfaces: Breeding virtually bigger interfaces for physically smaller computers. UIST 1991. Proc. UIST '91 (ACM Symp. on User Interface Software and Technology), Hilton Head, SC, November 11-13, 1991, 9-17.
Paper abstract to appear here.
Duchamp D., Feiner S., Maguire S. Software technology for wireless mobile computing. IEEE 1991. In IEEE Network, 5(6), November 1991, 12-18.
Paper abstract to appear here.
Seligmann D., Feiner S., Automated generation of intent-based 3D illustrations. SIGGRAPH 1991. (Proc. ACM SIGGRAPH '91, Las Vegas, NV, July 28-August 2, 1991), 123-132
Paper abstract to appear here.
1990
Feiner S., Beshers C., Worlds within worlds: Metaphors for exploring n-dimensional virtual worlds. UIST 1990. Proc. UIST '90 (ACM Symp. on User Interface Software and Technology), Snowbird, UT, October 3-5, 1990, 76-83.
Paper abstract to appear here.
Feiner S., Beshers C., Visualizing n-dimensional virtual worlds with n-Vision. I3D 1990. Computer Graphics, 24(2), March 1990 (Proc. 1990 Symp. on Interactive 3D Graphics, Snowbird, UT, March 25-28, 1990), 37-38.
Paper abstract to appear here.