Results 1 - 10 of 369

Visual interpretation of hand gestures for human-computer interaction: A review

by Vladimir I. Pavlovic, Rajeev Sharma, Thomas S. Huang - IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 1997
"... The use of hand gestures provides an attractive alternative to cumbersome interface devices for human-computer interaction (HCI). In particular, visual interpretation of hand gestures can help in achieving the ease and naturalness desired for HCI. This has motivated a very active research area conc ..."
Abstract - Cited by 489 (17 self)
…-based gesture recognition. Although the current progress is encouraging, further theoretical as well as computational advances are needed before gestures can be widely used for HCI. We discuss directions of future research in gesture recognition, including its integration with other natural modes of human …

A Survey of Affect Recognition Methods: Audio, Visual, and Spontaneous Expressions

by Zhihong Zeng, Maja Pantic, Glenn I. Roisman, Thomas S. Huang, 2009
"... Automated analysis of human affective behavior has attracted increasing attention from researchers in psychology, computer science, linguistics, neuroscience, and related disciplines. However, the existing methods typically handle only deliberately displayed and exaggerated expressions of prototypi ..."
Abstract - Cited by 374 (51 self)
…an increasing number of efforts are reported toward multimodal fusion for human affect analysis, including audiovisual fusion, linguistic and paralinguistic fusion, and multicue visual fusion based on facial expressions, head movements, and body gestures. This paper introduces and surveys these recent advances …

Multi-finger and whole hand gestural interaction techniques for multi-user tabletop displays

by Mike Wu, Ravin Balakrishnan - UIST, 2003
"... Recent advances in sensing technology have enabled a new generation of tabletop displays that can sense multiple points of input from several users simultaneously. However, apart from a few demonstration techniques [17], current user interfaces do not take advantage of this incre ..."
Abstract - Cited by 203 (11 self)

Gestural Cues of Discourse Segmentation

by Jane Ch, Nanette Veilleux
"... Research on discourse segmentation frequently involves the identification of certain cues in the various dimensions of text, speech, and gesture. Advances in automated segmentation models and algorithms have been achieved when these cues are taken into consideration. For gestures in particular, it m ..."
Abstract - Cited by 1 (0 self)

Learning through gesture (Advanced Review)

by Susan Goldin-Meadow
"... When people talk, they move their hands—they gesture. Although these movements might appear to be meaningless hand waving, in fact they convey substantive information that is not always found in the accompanying speech. As a result, gesture can provide insight into thoughts that speakers have but do ..."
Abstract

Gesture + Play

by Konrad Tollmar, David Demirdjian, Trevor Darrell
"... Physical and perceptual interfaces to games and virtual environments are an exciting new interface paradigm. Recent advances in computer vision make real-time sensing of users' position and pose possible using relatively low-cost sensors. However, little thought has been given to the interface abstr ..."
Abstract

Advancing Human Pose and Gesture Recognition

by Tomas Pfister, 2015
"... This thesis presents new methods in two closely related areas of computer vision: human pose estimation and gesture recognition in videos. In human pose estimation, we show that random forests can be used to estimate human pose in monocular videos. To this end, we propose a co-segmentation algorit ..."
Abstract - Cited by 1 (1 self)

The Case Study of Application of Advanced Gesture Interface and Mapping

by Suguru Goto - Proceedings of the 2004 Conference on New Interfaces for Musical Expression, 2004
"... We discuss a case study applying the Virtual Musical Instrument and Sound Synthesis. In this application, the main subject is an advanced Mapping Interface to connect the two. For this experiment, our discussion also refers to Neural Networks, as well as a brief introduction of ..."
Abstract

The Gesture Recognition Toolkit

by Nicholas Gillian, Joseph A. Paradiso, Isabelle Guyon, Vassilis Athitsos, Sergio Escalera
"... The Gesture Recognition Toolkit is a cross-platform open-source C++ library designed to make real-time machine learning and gesture recognition more accessible for non-specialists. Emphasis is placed on ease of use, with a consistent, minimalist design that promotes accessibility while supporting fl ..."
Abstract - Cited by 3 (0 self)
…flexibility and customization for advanced users. The toolkit features a broad range of classification and regression algorithms and has extensive support for building real-time systems. This includes algorithms for signal processing, feature extraction, and automatic gesture spotting.

Gesture and aphasia: Helping hands?

by Victoria L Scharp , Connie A Tompkins , Jana M Iverson
"... Background: The study of communicative gestures is one of considerable interest for aphasia, in relation to theory, diagnosis, and treatment. Significant limitations currently permeate the general (psycho)linguistic literature on gesture production, and attention to these limitations is essential f ..."
Abstract
…for both continued investigation and clinical application of gesture for people with aphasia. Aims: The aims of this paper are to discuss issues imperative to advancing the gesture production literature and to provide specific suggestions for applying the material herein to studies in gesture production …

Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University