Results 1 - 10 of 162
Jogging the distance
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM, 2007
Cited by 97 (22 self)
Exertion games require investing physical effort. The fact that such games can support physical health is tempered by our limited understanding of how to design for engaging exertion experiences. This paper introduces the Exertion Framework as a way to think and talk about Exertion Games, both for their formative design and summative analysis. Our Exertion Framework is based on the ways in which we can conceive of the body investing in game-directed exertion, supported by four perspectives on the body (the Responding Body, Moving Body, Sensing Body and Relating Body) and three perspectives on gaming (rules, play and context). The paper illustrates how this framework was derived from prior systems and theory, and presents a case study of how it has been used to inspire novel exertion interactions.
Author Keywords: Exertion Interface, whole-body interaction, exergame,
Tangible Bits: Beyond Pixels
2008
Cited by 73 (4 self)
Tangible user interfaces (TUIs) provide physical form to digital information and computation, facilitating the direct manipulation of bits. Our goal in TUI development is to empower collaboration, learning, and design by using digital technology and at the same time taking advantage of human abilities to grasp and manipulate physical objects and materials. This paper discusses a model of TUI, key properties, genres, applications, and summarizes the contributions made by the Tangible Media Group and other researchers since the publication of the first Tangible Bits
Imaginary interfaces: spatial interaction with empty hands and without visual feedback
In Proc. ACM UIST '10
Cited by 48 (2 self)
Screen-less wearable devices allow for the smallest form factor and thus the maximum mobility. However, current screen-less devices only support buttons and gestures. Pointing is not supported because users have nothing to point at. We challenge the notion that spatial interaction requires a screen and propose a method for bringing spatial interaction to screen-less devices. We present Imaginary Interfaces, screen-less devices that allow users to perform spatial interaction with empty hands and without visual feedback. Unlike projection-based solutions, such as Sixth Sense, all visual "feedback" takes place in the user's imagination. Users define the origin of an imaginary space by forming an L-shaped coordinate cross with their non-dominant hand. Users then point and draw with their dominant hand in the resulting space. With three user studies we investigate the question: to what extent can users interact spatially with a user interface that exists only in their imagination? Participants created simple drawings, annotated existing drawings, and pointed at locations described in imaginary space. Our findings suggest that users' visual short-term memory can, in part, replace the feedback conventionally displayed on a screen.
The design and evaluation of multitouch marking menus
In Proc. CHI, 2010
Cited by 24 (0 self)
Despite the considerable quantity of research directed towards multitouch technologies, a set of standardized UI components has not been developed. Menu systems provide a particular challenge, as traditional GUI menus require a level of pointing precision inappropriate for direct finger input. Marking menus are a promising alternative, but have yet to be investigated or adapted for use within multitouch systems. In this paper, we first investigate the human capabilities for performing directional chording gestures, to assess the feasibility of multitouch marking menus. Based on the positive results collected from this study, and in particular the high angular accuracy, we discuss our new multitouch marking menu design, which can increase the number of items in a menu and eliminate a level of depth. A second experiment showed that multitouch marking menus perform significantly faster than traditional hierarchical marking menus, reducing acquisition times in both novice and expert usage modalities.
Author Keywords: Multi-finger input, multi-touch displays, marking menus.
Squidy: A Zoomable Design Environment for Natural User Interfaces
Designing Reality-Based Interfaces for Creative Group Work
Cited by 14 (7 self)
Using affinity diagramming as an example, we investigate reality-based interfaces for supporting creative group work. Based on an observational study grounded in the reality-based interaction framework, we identified power vs. reality tradeoffs that can be addressed to find a close fit to embodied practice. Using this knowledge, we designed and implemented a digital workspace for supporting affinity diagramming. Its hybrid interaction techniques combine digital pen & paper with an interactive table and tangible tokens. An additional vertical display is used to support reflection-in-action and to enhance discussion and coordination. A preliminary user study confirmed the applicability of our tradeoffs and the general acceptance of the tool design.
Author Keywords: Creative group work, affinity diagramming, reality-based
Comparing the use of tangible and graphical programming languages for informal science education
In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '09), 2009
Cited by 13 (0 self)
Much of the work done in the field of tangible interaction has focused on creating tools for learning; however, in many cases, little evidence has been provided that tangible interfaces offer educational benefits compared to more conventional interaction techniques. In this paper, we present a study comparing the use of a tangible and a graphical interface as part of an interactive computer programming and robotics exhibit that we designed for the Boston Museum of Science. In this study, we collected observations of 260 museum visitors and conducted interviews with 13 family groups. Our results show that visitors found the tangible and the graphical systems equally easy to understand. However, with the tangible interface, visitors were significantly more likely to try the exhibit and significantly more likely to actively participate in groups. In turn, we show that regardless of the condition, involving multiple active participants leads to significantly longer interaction times. Finally, we examine the role of children and adults in each condition and present evidence that children are more actively involved in the tangible condition, an effect that seems to be especially strong for girls.
A Specification Paradigm for the Design and Implementation of Tangible User Interfaces
2009
Cited by 12 (0 self)
Tangible interaction shows promise to significantly enhance computer-mediated support for activities such as learning, problem solving, and design. However, tangible user interfaces are currently considered challenging to design and build. Designers and developers of these interfaces encounter several conceptual, methodological, and technical difficulties. Among others, these challenges include the lack of appropriate interaction abstractions, the shortcomings of current user interface software tools in addressing continuous and parallel interactions, and the excessive effort required to integrate novel input and output technologies. To address these challenges, we propose a specification paradigm for designing and implementing Tangible User Interfaces (TUIs) that enables TUI developers to specify the structure and behavior of a tangible user interface using high-level constructs, which abstract away implementation details. An important benefit of this approach, which is based on User Interface Description Language (UIDL) research, is that these specifications could be automatically or semi-automatically converted into concrete TUI implementations. In addition, such specifications could serve as a common ground for investigating both design and implementation concerns by TUI developers from different disciplines. Thus, the primary contribution of this paper is a high-level UIDL that provides developers,
g-stalt: a chirocentric, spatiotemporal, and telekinetic gestural interface
Cited by 12 (1 self)
In this paper we present g-stalt, a gestural interface for interacting with video. g-stalt is built upon the g-speak spatial operating environment (SOE) from Oblong Industries. The version of g-stalt presented here is realized as a three-dimensional graphical space filled with over 60 cartoons. These cartoons can be viewed and rearranged, along with their metadata, using a specialized gesture set. g-stalt is designed to be chirocentric, spatiotemporal, and telekinetic.
Author Keywords: Gesture, gestural interface, chirocentric, spatiotemporal, telekinetic, video, 3D, pinch, g-speak.
ACM Classification Keywords: H5.2. User Interfaces: input devices and strategies; interaction styles.
iCon: Utilizing Everyday Objects as Additional, Auxiliary and Instant Tabletop Controllers
Cited by 10 (1 self)
This work describes a novel approach to utilizing users' everyday objects as additional, auxiliary, and instant tabletop controllers. Based on this approach, a prototype platform, called iCon, is developed to explore possible designs. Field studies and user studies reveal that utilizing everyday objects as auxiliary input devices might be appropriate under multi-task scenarios. User studies further demonstrate that everyday objects can generally be applied in circumstances of low precision, low engagement with the selected objects, and medium-to-high frequency of use. The proposed approach allows users to interact with computers without altering their original work environments.