Results 1 - 10 of 81
The LilyPad Arduino: using computational textiles to investigate engagement, aesthetics, and diversity in computer science education
- Proc. of the 26th International Conference on Human Factors in Computing Systems (CHI 2008)
, 2008
"... The advent of novel materials (such as conductive fibers) combined with accessible embedded computing platforms have made it possible to re-imagine the landscapes of fabric and electronic crafts—extending these landscapes with the creative range of electronic/computational textiles or e-textiles. Th ..."
Abstract
-
Cited by 47 (6 self)
The advent of novel materials (such as conductive fibers) combined with accessible embedded computing platforms has made it possible to re-imagine the landscapes of fabric and electronic crafts, extending these landscapes with the creative range of electronic/computational textiles or e-textiles. This paper describes the LilyPad Arduino, a fabric-based construction kit that enables novices to design and build their own soft wearables and other textile artifacts. The kit consists of a microcontroller and an assortment of sensors and actuators in stitch-able packages; these elements can be sewn to cloth substrates and to each other with conductive thread to build e-textiles. This paper will introduce the latest version of the kit; reflect on its affordances; present the results of our most recent user studies; and discuss possible directions for future work in the area of personalized e-textile design and its relation to technology education.
Author Keywords: LilyPad Arduino, computational textiles, electronic textiles, …
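As a concrete illustration (not taken from the paper), a LilyPad project is programmed with the standard Arduino C++ API; the sketch below assumes a light sensor sewn to analog pin A2 and an LED module on digital pin 9, both arbitrary choices made for the example.

    // Minimal LilyPad-style sketch using the standard Arduino API.
    // The pin assignments and threshold are illustrative assumptions.
    const int SENSOR_PIN = A2;  // e.g., a sewn-in light sensor
    const int LED_PIN = 9;      // e.g., a stitched LED module

    void setup() {
      pinMode(LED_PIN, OUTPUT);
    }

    void loop() {
      int level = analogRead(SENSOR_PIN);              // 0..1023 reading
      digitalWrite(LED_PIN, level < 300 ? HIGH : LOW); // light up when dark
      delay(50);                                       // simple pacing
    }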
The Proximity Toolkit: Prototyping Proxemic Interactions in Ubiquitous Computing Ecologies
"... orientation, distance, pointing rays; Right: visualizing these relationships in the Proximity Toolkit visual monitoring tool. People naturally understand and use proxemic relationships in everyday situations. However, only few ubiquitous computing (ubicomp) systems interpret such proxemic relationsh ..."
Abstract
-
Cited by 35 (10 self)
People naturally understand and use proxemic relationships in everyday situations. However, only a few ubiquitous computing (ubicomp) systems interpret such proxemic relationships to mediate interaction (proxemic interaction). A technical problem is that developers find it challenging and tedious to access proxemic information from sensors. Our Proximity Toolkit solves this problem. It simplifies the exploration of interaction techniques by supplying fine-grained proxemic information between people, portable devices, large interactive surfaces, and other non-digital objects in a room-sized environment. The toolkit offers three key features. 1) It facilitates rapid prototyping of proxemic-aware systems by supplying developers with the …
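The published toolkit exposes its data through a .NET event API; the fragment below is only a rough C++ sketch, with entirely hypothetical names, of the kind of relationship data (distance, orientation, pointing) that a proxemic-aware prototype would subscribe to.

    #include <string>

    // Hypothetical types; none of these names come from the real toolkit API.
    struct ProxemicRelation {
      std::string subject;    // e.g., "Person1"
      std::string target;     // e.g., "Tabletop"
      double distanceMm;      // distance between the two entities
      double orientationDeg;  // how directly the subject faces the target
      bool pointingAt;        // whether a pointing ray intersects the target
    };

    // A prototype registers a callback and reacts as relationships change.
    void onRelationChanged(const ProxemicRelation& r) {
      if (r.distanceMm < 1200.0 && r.orientationDeg < 30.0) {
        // e.g., reveal detail on a large display as the person approaches
      }
    }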
Momento: support for situated ubicomp experimentation
- Proc. CHI 2007
, 2007
"... We present the iterative design of Momento, a tool that provides integrated support for situated evaluation of ubiquitous computing applications. We derived requirements for Momento from a user-centered design process that included interviews, observations and field studies of early versions of the ..."
Abstract
-
Cited by 34 (2 self)
We present the iterative design of Momento, a tool that provides integrated support for situated evaluation of ubiquitous computing applications. We derived requirements for Momento from a user-centered design process that included interviews, observations, and field studies of early versions of the tool. Motivated by our findings, Momento supports remote testing of ubicomp applications, helps with participant adoption and retention by minimizing the need for new hardware, and supports mid- to long-term studies to capture infrequently occurring data. Momento can also gather log data, experience-sampling responses, diary entries, and other qualitative data.
Parallel prototyping leads to better design results, more divergence, and increased self-efficacy
- ACM Trans. Comput.-Hum. Interact.
, 2010
"... Iteration can help people improve ideas. It can also give rise to fixation, continuously refining one option without considering others. Does creating and receiving feedback on multiple prototypes in parallel, as opposed to serially, affect learning, self-efficacy, and design exploration? An experim ..."
Abstract
-
Cited by 29 (4 self)
Iteration can help people improve ideas. It can also give rise to fixation, continuously refining one option without considering others. Does creating and receiving feedback on multiple prototypes in parallel, as opposed to serially, affect learning, self-efficacy, and design exploration? An experiment manipulated whether independent novice designers created graphic Web advertisements in parallel or in series. Serial participants received descriptive critique directly after each prototype. Parallel participants created multiple prototypes before receiving feedback. As measured by clickthrough data and expert ratings, ads created in the Parallel condition significantly outperformed those from the Serial condition. Moreover, independent raters found Parallel prototypes to be more diverse. Parallel participants also reported a larger increase in task-specific self-confidence. This article outlines a theoretical foundation for why parallel prototyping produces better design results and discusses the implications for design education.
Rapidly Exploring Application Design Through Speed Dating
- In: Proceedings of the Conference on Ubiquitous Computing (2007)
"... Abstract. While the user-centered design methods we bring from humancomputer interaction to ubicomp help sketch ideas and refine prototypes, few tools or techniques help explore divergent design concepts, reflect on their merits, and come to a new understanding of design opportunities and ways to ad ..."
Abstract
-
Cited by 26 (8 self)
While the user-centered design methods we bring from human-computer interaction to ubicomp help sketch ideas and refine prototypes, few tools or techniques help explore divergent design concepts, reflect on their merits, and come to a new understanding of design opportunities and ways to address them. We present Speed Dating, a design method for rapidly exploring application concepts and their interactions and contextual dimensions without requiring any technology implementation. Situated between sketching and prototyping, Speed Dating structures comparison of concepts, helping identify and understand contextual risk factors and develop approaches to address them. We illustrate how to use Speed Dating by applying it to our research on the smart home and dual-income families, and highlight our findings from using this method.
Eyepatch: Prototyping Camera-Based Interaction Through Examples
- ACM Symposium on User Interface Software and Technology (UIST)
"... Cameras are a useful source of input for many interactive applications, but computer vision programming is difficult and requires specialized knowledge that is out of reach for many HCI practitioners. In an effort to learn what makes a useful computer vision design tool, we created Eyepatch, a tool ..."
Abstract
-
Cited by 22 (1 self)
Cameras are a useful source of input for many interactive applications, but computer vision programming is difficult and requires specialized knowledge that is out of reach for many HCI practitioners. In an effort to learn what makes a useful computer vision design tool, we created Eyepatch, a tool for designing camera-based interactions, and evaluated the Eyepatch prototype through deployment to students in an HCI course. This paper describes the lessons we learned about making computer vision more accessible, while retaining enough power and flexibility to be useful in a wide variety of interaction scenarios.
ACM Classification: H.1.2 [Information Systems]
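For contrast, the snippet below (not from the paper) shows the kind of hand-tuned vision code, here using OpenCV's C++ API, that Eyepatch's example-trained classifiers are intended to spare designers from writing; the color bounds are arbitrary.

    #include <opencv2/opencv.hpp>

    // Hand-coded color tracking: the sort of low-level vision code the
    // paper argues is out of reach for many HCI practitioners.
    int main() {
      cv::VideoCapture cam(0);
      cv::Mat frame, hsv, mask;
      while (cam.read(frame)) {
        cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);
        // Arbitrary hue/saturation/value bounds for a reddish object;
        // tuning these by hand is exactly the expertise barrier described.
        cv::inRange(hsv, cv::Scalar(0, 120, 80), cv::Scalar(10, 255, 255), mask);
        cv::Moments m = cv::moments(mask, true);
        if (m.m00 > 500) {
          double x = m.m10 / m.m00, y = m.m01 / m.m00;  // blob centroid
          // ...drive the interactive application from (x, y)...
        }
      }
      return 0;
    }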
iStuff Mobile: Rapidly Prototyping New Mobile Phone Interfaces for Ubiquitous Computing
"... iStuff Mobile is the first rapid prototyping framework that helps explore new sensor-based interfaces with existing mobile phones. It focuses on sensor-enhanced physical interfaces for ubiquitous computing scenarios. The framework includes sensor network platforms, mobile phone software, and a prove ..."
Abstract
-
Cited by 20 (1 self)
iStuff Mobile is the first rapid prototyping framework that helps explore new sensor-based interfaces with existing mobile phones. It focuses on sensor-enhanced physical interfaces for ubiquitous computing scenarios. The framework includes sensor network platforms, mobile phone software, and a proven rapid prototyping framework. Interaction designers can use iStuff Mobile to quickly create and test functional prototypes of novel interfaces without making internal hardware or software modifications to the handset. A visual programming paradigm provides a low threshold for prototyping activities: the system is not difficult to learn. At the same time, the range of examples built using the toolkit demonstrates a high ceiling for prototyping activities: the toolkit places few limits on prototype complexity. A user study shows that the visual programming metaphor enables prototypes to be built faster and encourages more iterations than a previous approach.
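As a purely hypothetical illustration of the kind of sensor-to-phone mapping such a visual program expresses, the sketch below turns a tilt reading from an external sensor node into a phone UI command; no real iStuff Mobile API is shown.

    // Hypothetical mapping of an accelerometer tilt to a phone command,
    // of the kind a visual patch in the framework would express.
    struct AccelSample { double x, y, z; };

    enum class PhoneCommand { None, NextPhoto, PrevPhoto };

    PhoneCommand mapTiltToCommand(const AccelSample& s) {
      if (s.x > 0.6) return PhoneCommand::NextPhoto;   // tilt right
      if (s.x < -0.6) return PhoneCommand::PrevPhoto;  // tilt left
      return PhoneCommand::None;                       // roughly level
    }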
Activity-Based Prototyping of Ubicomp Applications for Long-Lived, Everyday Human Activities
- CHI 2008
, 2008
"... We designed an activity-based prototyping process realized in the ActivityDesigner system that combines the theoretical framework of Activity-Centered Design with traditional iterative design. This process allows designers to leverage human activities as first class objects for design and is support ..."
Abstract
-
Cited by 20 (1 self)
We designed an activity-based prototyping process, realized in the ActivityDesigner system, that combines the theoretical framework of Activity-Centered Design with traditional iterative design. This process allows designers to leverage human activities as first-class objects for design and is supported in ActivityDesigner by three novel features. First, the tool allows designers to model activities based on concrete scenarios collected from everyday life. The models form both a context for design and computational constructs for creating functional prototypes. Second, it allows designers to prototype interaction behaviors based on activity streams spanning time. Third, it allows designers to easily and continuously test these prototypes with real users in situ. We have garnered positive feedback from a series of laboratory user studies and several case studies in which ActivityDesigner was used in realistic design situations. ActivityDesigner effectively streamlined a ubicomp design process, and it allowed designers to create realistic ubicomp application prototypes at low cost and to test them in everyday life over an extended period.
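A rough sketch, with entirely hypothetical structures, of what treating an activity as a first-class design object could look like: an activity aggregates concrete scenes observed from everyday life, and a prototype attaches behaviors triggered by the activity's unfolding stream of events.

    #include <string>
    #include <vector>

    // Hypothetical, illustrative structures; not ActivityDesigner's actual
    // constructs.
    struct Scene {               // a concrete observed situation
      std::string actor;         // e.g., "Grandma"
      std::string place;         // e.g., "kitchen"
      std::string action;        // e.g., "takes morning medication"
    };

    struct Activity {            // the first-class design object
      std::string name;          // e.g., "Managing medication"
      std::vector<Scene> scenes; // scenarios collected from everyday life
    };

    struct Behavior {            // prototype behavior keyed to the activity
      std::string trigger;       // e.g., "no medication scene by 10 am"
      std::string response;      // e.g., "send a gentle reminder to the tablet"
    };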
Midas: Fabricating Custom Capacitive Touch Sensors to Prototype Interactive Objects
"... An increasing number of consumer products include user interfaces that rely on touch input. While digital fabrication techniques such as 3D printing make it easier to prototype the shape of custom devices, adding interactivity to such prototypes remains a challenge for many designers. We introduce M ..."
Abstract
-
Cited by 17 (3 self)
An increasing number of consumer products include user interfaces that rely on touch input. While digital fabrication techniques such as 3D printing make it easier to prototype the shape of custom devices, adding interactivity to such prototypes remains a challenge for many designers. We introduce Midas, a software and hardware toolkit to support the design, fabrication, and programming of flexible capacitive touch sensors for interactive objects. With Midas, designers first define the desired shape, layout, and type of touch-sensitive areas, as well as routing obstacles, in a sensor editor. From this high-level specification, Midas automatically generates layout files with appropriate sensor pads and routed connections. These files are then used to fabricate sensors using digital fabrication processes, e.g., vinyl cutters and conductive ink printers. Using step-by-step assembly instructions generated by Midas, designers connect these sensors to the Midas microcontroller, which detects touch events. Once the prototype is assembled, designers can define interactivity for their sensors: Midas supports both record-and-replay actions for controlling existing local applications and WebSocket-based event output for controlling novel or remote applications. In a first-use study with three participants, users successfully prototyped media players. We also demonstrate how Midas can be used to create a number of touch-sensitive interfaces.
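The event-output side lends itself to a small sketch: assuming the controller reports touches on named pads, a prototype might dispatch each pad to either a replayed local action or a message forwarded to remote clients (the paper mentions WebSocket-based output; the dispatch below is hypothetical and omits the transport).

    #include <functional>
    #include <map>
    #include <string>

    // Hypothetical dispatch for touch events reported by the sensor
    // controller; pad names and handler wiring are illustrative only.
    using Handler = std::function<void()>;

    int main() {
      std::map<std::string, Handler> onTouch;
      onTouch["play_pause"] = [] { /* replay a recorded local keystroke */ };
      onTouch["volume_up"]  = [] { /* or forward the event to remote clients */ };

      std::string pad = "play_pause";  // would arrive from the microcontroller
      auto it = onTouch.find(pad);
      if (it != onTouch.end()) it->second();
      return 0;
    }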