Results 1 - 10 of 20
360° panoramic overviews for location-based services
- In ACM CHI, 2012
"... Figure 1. Left: 360 ° panoramic overviews of a laboratory room. We consider three different shapes for visualizing the panorama. Right: the setup for our user studies consisted of a static display and a tracked wand (used for pointing in the environment). We investigate 360 ° panoramas as overviews ..."
Abstract - Cited by 5 (1 self)
Figure 1. Left: 360° panoramic overviews of a laboratory room. We consider three different shapes for visualizing the panorama. Right: the setup for our user studies consisted of a static display and a tracked wand (used for pointing in the environment). We investigate 360° panoramas as overviews to support users in the task of locating objects in the surrounding environment. Panoramas are typically visualized as rectangular photographs, but this does not provide clear cues for physical directions in the environment. In this paper, we conduct a series of studies with three different shapes: Frontal, Top-Down and Bird's Eye; the last two shapes are chosen because they provide a clearer representation of the spatial mapping between panorama and environment. Our results show that good readability of the panorama is most important and that a clear representation of the spatial mapping plays a secondary role. This paper is the first to provide an understanding of how users exploit 360° panoramic overviews to locate objects in the surrounding environment and of how different design factors affect user performance. Author Keywords: Panorama; visualization; location-based service; magic lens
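The abstract names the three overview shapes but not their screen mappings; the sketch below shows one plausible way to project a panorama direction (azimuth) into each shape. The function names, radii and tilt angle are illustrative assumptions, not taken from the paper.

    import math

    def frontal(azimuth_deg, v, width=360, height=90):
        """Frontal: flat rectangle; azimuth maps linearly to x, v (0..1) to y."""
        return azimuth_deg / 360.0 * width, v * height

    def top_down(azimuth_deg, v, r_inner=60, r_outer=120):
        """Top-Down: annulus seen from above; azimuth becomes the angle around
        the ring, v interpolates between the inner and outer radius."""
        theta = math.radians(azimuth_deg)
        r = r_inner + v * (r_outer - r_inner)
        return r * math.sin(theta), -r * math.cos(theta)  # 0 deg = straight ahead

    def birds_eye(azimuth_deg, v, tilt_deg=55, **ring):
        """Bird's Eye: the top-down annulus foreshortened by a viewing tilt."""
        x, y = top_down(azimuth_deg, v, **ring)
        return x, y * math.cos(math.radians(tilt_deg))

    if __name__ == "__main__":
        for az in (0, 90, 180, 270):
            print(az, frontal(az, 0.5), top_down(az, 0.5), birds_eye(az, 0.5))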
Precise pointing techniques for handheld Augmented Reality
"... Abstract. We propose two techniques that improve accuracy of pointing at physical objects for handheld Augmented Reality (AR). In handheld AR, pointing accuracy is limited by both touch input and camera viewpoint instability due to hand jitter. The design of our techniques is based on the relationsh ..."
Abstract - Cited by 4 (4 self)
We propose two techniques that improve the accuracy of pointing at physical objects in handheld Augmented Reality (AR). In handheld AR, pointing accuracy is limited both by touch input and by camera viewpoint instability due to hand jitter. The design of our techniques is based on the relationship between the touch input space and two visual reference frames for on-screen content, namely the screen and the physical object being pointed at. The first technique combines Shift, a touch-based pointing technique, with video freeze in order to merge the two reference frames for precise pointing. In contrast, without freezing the video, the second technique offers a precise mode with a cursor that is stabilized on the physical object and controlled with relative touch input on the screen. Our experimental results show that our techniques are more accurate than the baseline techniques, namely direct touch on the video and screen-centered crosshair pointing.
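As a rough illustration of the second technique (an object-stabilized cursor driven by relative touch input), the sketch below keeps the cursor in the tracked object's coordinate frame and re-projects it every frame. The class name, precision gain and object-to-screen transform are assumptions for illustration, not the authors' implementation.

    from dataclasses import dataclass

    @dataclass
    class StabilizedCursor:
        """Cursor stored in the tracked object's 2D frame, so camera jitter
        (a changing object-to-screen transform) cannot move it on the object."""
        obj_x: float = 0.0
        obj_y: float = 0.0
        gain: float = 0.25  # precision gain: cursor moves slower than the finger

        def on_touch_drag(self, dx_screen, dy_screen, pixels_per_unit):
            # Relative touch input: convert the screen-space drag into
            # object-space units and apply it scaled down for precision.
            self.obj_x += self.gain * dx_screen / pixels_per_unit
            self.obj_y += self.gain * dy_screen / pixels_per_unit

        def screen_position(self, object_to_screen):
            # Re-project with the latest tracking pose every frame, so the
            # cursor stays glued to the physical object while the camera shakes.
            return object_to_screen(self.obj_x, self.obj_y)

    if __name__ == "__main__":
        to_screen = lambda x, y: (320 + 100 * x, 240 + 100 * y)  # toy transform
        cursor = StabilizedCursor()
        cursor.on_touch_drag(40, -20, pixels_per_unit=100)
        print(cursor.screen_position(to_screen))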
Who’s that girl? handheld augmented reality for printed photobooks
- In Proc. Interact, 2011
"... All in-text references underlined in blue are linked to publications on ResearchGate, letting you access and read them immediately. ..."
Abstract
-
Cited by 3 (2 self)
- Add to MetaCart
(Show Context)
All in-text references underlined in blue are linked to publications on ResearchGate, letting you access and read them immediately.
User experiences with augmented reality aided navigation on phones. Poster at ISMAR, 2011
"... We investigate user experiences when using augmented reality (AR) as a new aid to navigation. We integrate AR with other more common interfaces into a handheld navigation system, and we conduct an exploratory study to see where and how people exploit AR. Based on previous work on augmented photograp ..."
Abstract - Cited by 3 (0 self)
We investigate user experiences when using augmented reality (AR) as a new aid to navigation. We integrate AR with other, more common interfaces into a handheld navigation system, and we conduct an exploratory study to see where and how people exploit AR. Based on previous work on augmented photographs, we hypothesize that AR is used more to support wayfinding at static locations, when users approach a road intersection. In partial contrast to this hypothesis, our results from a user evaluation suggest that users expect to use the system while walking. Further, our results show that AR is usually exploited shortly before and after road intersections, suggesting that tracking support will be needed mostly in the proximity of road intersections. INDEX TERMS: H.5.1 Artificial, augmented and virtual realities
Integrating Spatial Sensing to an Interactive Mobile 3D Map
"... We present an interaction technique that integrates spatial sensing to an interactive 3D city model in order to support efficient localization of objects (esp. buildings) known or remembered in the real world. The technique offers a unified 3D interaction scheme for both visible and remote objects. ..."
Abstract - Cited by 2 (0 self)
We present an interaction technique that integrates spatial sensing into an interactive 3D city model in order to support efficient localization of objects (especially buildings) known or remembered in the real world. The technique offers a unified 3D interaction scheme for both visible and remote objects. In the egocentric view, sensor data from the mobile device (accelerometer, gyroscope, GPS) is used to couple the viewport to the user's movement, similarly to mobile AR. The technique allows easy shifting from the viewport-coupled mode to a top-down view in which movement is POI-based. A field experiment compared it to an exocentric technique resembling the traditional pan-and-zoom of 2D mobile maps. The two techniques showed differential benefits for target acquisition performance.
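A minimal sketch of the sensor-coupled egocentric viewport described above, assuming GPS gives the eye position and the fused orientation sensors give heading and pitch; the equirectangular projection, eye height and coordinates are illustrative choices, not taken from the paper.

    import math

    def gps_to_local_metres(lat, lon, lat0, lon0):
        """Rough equirectangular projection around a reference point."""
        R = 6371000.0
        x = math.radians(lon - lon0) * R * math.cos(math.radians(lat0))
        y = math.radians(lat - lat0) * R
        return x, y

    def egocentric_camera(gps, heading_deg, pitch_deg, ref):
        """Couple the 3D-map viewport to the device: GPS sets the eye position,
        the orientation sensors set where the viewport looks."""
        x, y = gps_to_local_metres(gps[0], gps[1], ref[0], ref[1])
        eye = (x, y, 1.6)  # assumed eye height in metres
        yaw, pitch = math.radians(heading_deg), math.radians(pitch_deg)
        look = (math.sin(yaw) * math.cos(pitch),
                math.cos(yaw) * math.cos(pitch),
                math.sin(pitch))
        return eye, look

    if __name__ == "__main__":
        ref = (60.1699, 24.9384)  # arbitrary reference coordinate
        print(egocentric_camera((60.1702, 24.9390), heading_deg=45,
                                pitch_deg=-10, ref=ref))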
Brush-and-Drag: A Multi-touch Interface For Photo Triaging
"... Due to the convenience of taking pictures with various digital cameras and mobile devices, people often end up with multiple shots of the same scene with only slight variations. To enhance photo triaging, which is a very common photowork activity, we propose an effective and easy-to-use brush-anddra ..."
Abstract - Cited by 1 (1 self)
Due to the convenience of taking pictures with various digital cameras and mobile devices, people often end up with multiple shots of the same scene with only slight variations. To support photo triaging, a very common photowork activity, we propose an effective and easy-to-use brush-and-drag interface that allows the user to interactively explore and compare photos within a broader scene context. First, we brush to mark an area of interest on a photo with our finger(s); our tailored segmentation engine automatically determines the corresponding image elements among the photos. Then, we can drag the segmented elements from different photos across the screen to explore them simultaneously, and further perform simple finger gestures to interactively rank photos, select favorites for sharing, or remove unwanted ones. This interaction method was implemented on a consumer-level tablet computer and shown to offer effective interactions in a user study. Author Keywords: Digital photo collections; User interaction
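The interaction pipeline (brush an area, look up the corresponding element in every photo, then rank, keep or remove with gestures) can be summarized as below. The class, gesture names and the stand-in segment_lookup callback are hypothetical, since the paper's segmentation engine is not described in the abstract.

    from dataclasses import dataclass, field

    @dataclass
    class Photo:
        name: str
        rank: int = 0
        kept: bool = True

    @dataclass
    class TriageSession:
        """Brush an area on one photo, find the corresponding element in
        every photo, then rank / keep / remove with simple finger gestures."""
        photos: list
        segments: dict = field(default_factory=dict)

        def brush(self, region, segment_lookup):
            # segment_lookup stands in for the segmentation engine; here it
            # simply returns a per-photo crop for the brushed region.
            self.segments = {p.name: segment_lookup(p, region) for p in self.photos}

        def gesture(self, photo, kind):
            if kind == "rank_up":
                photo.rank += 1
            elif kind == "favorite":
                photo.rank += 10
            elif kind == "remove":
                photo.kept = False

    if __name__ == "__main__":
        photos = [Photo("a"), Photo("b"), Photo("c")]
        session = TriageSession(photos)
        session.brush(region=(10, 10, 80, 80),
                      segment_lookup=lambda p, r: f"{p.name}-crop{r}")
        session.gesture(photos[1], "remove")
        session.gesture(photos[0], "favorite")
        print(session.segments, [(p.name, p.rank, p.kept) for p in photos])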
Enhancing Handheld Navigation Systems with Augmented Reality
- Proceedings of MobileHCI, Workshop on Mobile AR, 2011
"... We investigate the role of augmented reality (AR) as a new kind of handheld interface to enhance navigation. We integrate AR with other more common interfaces into a handheld navigation system, and we conduct an exploratory study to see where and how people exploit the AR interface. Based on previou ..."
Abstract - Cited by 1 (1 self)
We investigate the role of augmented reality (AR) as a new kind of handheld interface to enhance navigation. We integrate AR with other, more common interfaces into a handheld navigation system, and we conduct an exploratory study to see where and how people exploit the AR interface. Based on previous work on augmented photographs, we hypothesize that AR is most useful as a support for wayfinding at static locations just before road intersections. In partial contradiction of our hypothesis, our results show that AR is used mostly while walking, usually shortly before and after road intersections. Our results yield considerations that inform both the design of AR interfaces and the development of tracking technologies.
Comparative evaluation of interfaces for presenting location-based information on mobile devices
- In Digital Libraries: For Cultural Heritage, Knowledge Dissemination, and Future Creation, 2011
"... Comparative evaluation of interfaces for presenting location-based information on mobile devices. Lecture ..."
Abstract
-
Cited by 1 (0 self)
- Add to MetaCart
(Show Context)
Comparative evaluation of interfaces for presenting location-based information on mobile devices. Lecture
Transitional Augmented Reality Navigation for Live Captured Scenes
"... Figure 1: After placing physical objects on a table, our system offers the possibility to structurally navigate an unprepared scene with a set of new transitional navigation techniques in AR (Left) and VR (Right) modes. The techniques seamlessly switch from AR to VR modes. Augmented Reality (AR) app ..."
Abstract - Cited by 1 (0 self)
Figure 1: After placing physical objects on a table, our system makes it possible to structurally navigate an unprepared scene with a set of new transitional navigation techniques in AR (left) and VR (right) modes. The techniques switch seamlessly between the AR and VR modes. Augmented Reality (AR) applications require knowledge about the real-world environment in which they are used. This knowledge is often gathered while developing the AR application and stored for future uses of the application. Consequently, changes to the real world lead to a mismatch between the previously recorded data and the real world. New capturing techniques based on dense Simultaneous Localization and Mapping (SLAM) not only allow users to capture real-world scenes at run-time, but also enable them to capture changes of the world. However, instead of using previously recorded and prepared scenes, users must interact with an unprepared environment. In this paper, we present a set of new interaction techniques that support users in handling captured real-world environments. The techniques present virtual viewpoints of the scene based on a scene analysis and provide natural transitions between the AR view and the virtual viewpoints. We demonstrate our approach with a SLAM-based prototype that allows us to capture a real-world scene, and we describe example applications of our system.
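A minimal sketch of a transitional camera that blends the live AR pose into a computed virtual viewpoint, assuming a pose is reduced to a position and a yaw angle and using a cosine ease for smoothness; the pose representation, easing and example viewpoints are assumptions, not the authors' method.

    import math

    def lerp(a, b, t):
        return tuple(ai + t * (bi - ai) for ai, bi in zip(a, b))

    def lerp_yaw(a_deg, b_deg, t):
        """Interpolate heading along the shorter arc."""
        d = (b_deg - a_deg + 180.0) % 360.0 - 180.0
        return a_deg + t * d

    def transition(ar_pose, virtual_pose, t):
        """Blend the live AR camera pose into a computed virtual viewpoint.
        A pose is (position_xyz, yaw_deg); t runs from 0 (AR view) to
        1 (VR view). A cosine ease keeps the camera motion smooth."""
        s = 0.5 - 0.5 * math.cos(math.pi * t)
        return lerp(ar_pose[0], virtual_pose[0], s), lerp_yaw(ar_pose[1], virtual_pose[1], s)

    if __name__ == "__main__":
        ar = ((0.0, 0.0, 0.4), 10.0)   # handheld camera just above the table
        vr = ((0.0, -1.0, 1.5), 0.0)   # overview viewpoint from scene analysis
        for t in (0.0, 0.5, 1.0):
            print(t, transition(ar, vr, t))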
General Terms
"... The use of Augmented Reality for overlaying visual information on print media like street posters has become widespread over the last few years. While this user interface metaphor represents an instance of cross-media information spaces the specific context of its use has not yet been carefully stud ..."
Abstract
The use of Augmented Reality for overlaying visual information on print media such as street posters has become widespread over the last few years. While this user interface metaphor represents an instance of cross-media information spaces, the specific context of its use has not yet been carefully studied, so productions generally rely on trial-and-error approaches. In this paper, we explicitly consider mobile contexts in the consumption of augmented print media. We explore the design space of hybrid user interfaces for augmented posters and describe several case studies that validate our approach. The outcomes of this work inform the design of future interfaces for publicly accessible augmented print media in mobile contexts. Author Keywords: augmented reality; poster; print media; cross-media; hybrid user interface; mobile physical interaction; public displays