CiteSeerX

Results 1 - 10 of 1,687

Developing Gestural Input

by Matthias Kranz, Stefan Freund, Paul Holleis, Albrecht Schmidt - 26th IEEE International Conference on Distributed Computing Systems Workshops (ICDCS Workshops 2006), 2006
"... In this paper, we present the Gesture Cube, a digitally augmented cube for human-computer interaction. The Gesture Cube is designed to be an unobtrusive and playful interaction device for controlling media appliances. Additionally, we discuss more generally the development process for gesture input ..."
Abstract - Cited by 2 (1 self) - Add to MetaCart
input. In an explorative user study, we analyzed user requirements, and in particular we were interested in the set of meaningful gestures that users think of. Based on this, a digitally augmented cube for controlling home appliances and computer applications was designed. Gestures are used as the main

Gesture Input using Neural Networks

by Philip A. Harling , 1993
"... This project explores the uses of neural networks to recognise manual postures (static hand positions) and gestures (dynamic hand positions). The postures and gestures are recorded using a hand-gesture input device, the Mattel Power Glove. Two experiments were performed---one to recognise postures, ..."
Abstract - Cited by 4 (1 self) - Add to MetaCart

Supporting Gestural Input For Users On The Move

by R. Headon, G. Coulouris - Proc. IEE Eurowearable '03, 2003
"... A wearable system is described that enables users to perform simple command selection and input operations. The system is suitable for use while moving and in almost any posture. The implementation uses RFID and is low in cost and power requirements. There is provision for changing between applicati ..."
Abstract - Cited by 9 (0 self) - Add to MetaCart

Tactual Articulatory Feedback and Gestural Input

by Bert Bongers, Gerrit Van Der Veer
"... Abstract The role of the sense of touch in Human-Computer Interaction as a channel for feedback in manipulative processes is investigated. The paper describes information as presented by the computer, and focuses on the feedback that supports the articulation of human gesture. Several experiments ar ..."
Abstract - Cited by 2 (1 self) - Add to MetaCart

Ubi-Finger: Gesture Input Device for Mobile Use

by Koji Tsukada, Michiaki Yasumura
"... We propose a novel interface for mobile environments called "Ubi-Finger" that enables intuitive operation of PDAs and information appliances through finger gestures. Since gestures are a common method of non-verbal communication and enable intuitive operation for users, much research on them has been carried ..."
Abstract - Add to MetaCart
carried out, especially in the field of Virtual Reality. However, most existing gesture-input systems are either very expensive or large, and have not been used in mobile environments. In contrast, Ubi-Finger is a gesture-input device that is simple, compact, and optimized for mobile use. We

An Assessment of Single-Channel EMG Sensing for Gestural Input

by Travis Peters
"... Wearable devices of all kinds are becoming increasingly popular. One problem that plagues wearable devices, however, is how to interact with them. In this paper we construct a prototype electromyography (EMG) sensing device that captures a single channel of EMG sensor data corresponding to user gest ..."
Abstract - Add to MetaCart
gestures. We also implement a machine learning pipeline to recognize gestural input received via our prototype sensing device. Our goal is to assess the feasibility of using a BITalino EMG sensor to recognize gestural input on a mobile health (mHealth) wearable device known as Amulet. We conduct three
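
A single-channel EMG gesture pipeline of the kind this abstract describes can be sketched in a few lines. The windowed-RMS feature and the nearest-centroid classifier below are illustrative assumptions for the sketch, not the paper's actual feature set or learning method:

```python
import math

def rms(window):
    """Root-mean-square amplitude of one window of EMG samples --
    a common single-channel EMG feature (assumed here, not taken from the paper)."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def features(signal, window=4):
    """Slice the signal into fixed-size windows and take the RMS of each."""
    return [rms(signal[i:i + window]) for i in range(0, len(signal) - window + 1, window)]

class NearestCentroid:
    """Tiny nearest-centroid classifier standing in for the pipeline's
    (unspecified) learning stage."""
    def fit(self, X, y):
        by_label = {}
        for xs, label in zip(X, y):
            by_label.setdefault(label, []).append(xs)
        # Average each feature column per label to get one centroid per gesture.
        self.centroids = {
            label: [sum(col) / len(col) for col in zip(*rows)]
            for label, rows in by_label.items()
        }
        return self

    def predict(self, xs):
        # Assign the label whose centroid is nearest in feature space.
        return min(self.centroids,
                   key=lambda label: math.dist(xs, self.centroids[label]))
```

For example, training on one low-amplitude "rest" recording and one high-amplitude "clench" recording is enough for the centroid classifier to separate new windows of either kind.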

An Interaction Model Designed For Hand Gesture Input

by Thomas Baudel, Michel Beaudouin-Lafon, Annelies Braffort, Daniel Teil , 1992
"... This paper describes an interaction model that allows a user to interact with a computer system through gestures while using gestures at the same time for interaction in the real world. This model uses simple rules for detecting the user's intention to address a gestural command to the system. ..."
Abstract - Cited by 2 (0 self) - Add to MetaCart
testing of this application, we present other potential applications and directions for future work. This research aims at providing usable applications of hand gesture input in the real world. The results presented in this paper are of interest to researchers working on new interaction techniques based

Integration of Speech and Gesture Inputs during Multimodal Interaction

by Sharon Oviatt, Fang Chen
"... Speech and gesture are two types of multimodal inputs that can be used to facilitate more natural human-machine interaction in applications for which the traditional keyboard and mouse input mechanisms are inappropriate; however, the possibility of their concurrent use raises the issue of how best to ..."
Abstract - Add to MetaCart

Squiggle - A Glyph Recognizer for Gesture Input

by Jeremy Lee
"... Abstract. Squiggle is a template-based glyph recognizer in the lineage of “$1 Recognizer”[1] and “Protractor”[2]. It seeks a good fit linear affine mapping between the input and template glyphs which are represented as a list of milestone points along the glyph path. The algorithm can recognize inpu ..."
Abstract - Add to MetaCart
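
A template-based recognizer in the $1/Protractor family, as this abstract describes, can be sketched as follows. The resampling count, the normalization (translation plus uniform scale rather than the full affine fit Squiggle seeks), and the point-wise path distance are simplifying assumptions, not Squiggle's actual algorithm:

```python
import math

def resample(points, n=16):
    """Resample a stroke to n equally spaced 'milestone' points along its path."""
    dists = [math.dist(points[i - 1], points[i]) for i in range(1, len(points))]
    total = sum(dists)
    if total == 0:
        return [points[0]] * n
    step = total / (n - 1)
    resampled = [points[0]]
    pts = list(points)
    acc = 0.0
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if d > 0 and acc + d >= step:
            # Interpolate a milestone point at the step boundary.
            t = (step - acc) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            resampled.append(q)
            pts.insert(i, q)  # continue measuring from the new point
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(resampled) < n:  # pad against floating-point shortfall
        resampled.append(pts[-1])
    return resampled[:n]

def normalize(points):
    """Translate to the centroid and scale to unit size -- a simple stand-in
    for the affine fit described in the abstract."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    pts = [(x - cx, y - cy) for x, y in points]
    size = max(max(abs(x) for x, _ in pts), max(abs(y) for _, y in pts)) or 1.0
    return [(x / size, y / size) for x, y in pts]

def path_distance(a, b):
    """Mean distance between corresponding milestone points."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def recognize(stroke, templates):
    """Return the name of the template whose milestone points best match the stroke."""
    probe = normalize(resample(stroke))
    return min(templates,
               key=lambda name: path_distance(probe, normalize(resample(templates[name]))))
```

With a hypothetical template set such as a flat "line" and an up-down "caret", translated and scaled input strokes match the template of the same shape.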

FlowMouse: A computer vision-based pointing and gesture input device

by Andrew D. Wilson, Edward Cutrell - In Interact '05, 2005
"... Abstract. We introduce FlowMouse, a computer vision-based pointing device and gesture input system. FlowMouse uses optical flow techniques to model the motion of the hand and a capacitive touch sensor to enable and disable interaction. By using optical flow rather than a more traditional tracking ba ..."
Abstract - Cited by 11 (0 self) - Add to MetaCart

Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University