3D motion and structure estimation using inertial sensors and computer vision for augmented reality. Presence: Teleoperators and Virtual Environments, 11(5): 474–492 (2002)

by L Chai, W A Hoff, T Vincent
Results 1 - 10 of 23

An Introduction to inertial and visual sensing

by Peter Corke, Jorge Lobo, Jorge Dias - The International Journal of Robotics Research, 2007
Abstract - Cited by 32 (3 self)
Abstract not found

Citation Context

...u et al. 1999; Lang et al. 2002; Neumann et al. 2003; Jiang et al. 2004). Many other hybrid self-trackers based on inertial and vision sensors have been proposed (Hoff et al. 1996; Azuma et al. 1999; Chai et al. 2002; Naimark and Foxlin 2002; Foxlin and Naimark 2003b; Ribo et al. 2004; Hogue et al. 2004; Alenya et al. 2004; Klein and Drummond 2004). The visual tracking relies on either specific targets, line cont...

Integration of vision and inertial sensors for 3D arm motion tracking in home-based rehabilitation

by Yaqin Tao, Huosheng Hu, Huiyu Zhou - The International Journal of Robotics Research, 2007
Abstract - Cited by 19 (1 self)
The integration of visual and inertial sensors for human motion tracking has attracted significant attention recently, due to its robust performance and wide potential application. This paper introduces a real-time hybrid solution to articulated 3D arm motion tracking for home-based rehabilitation by combining visual and inertial sensors. Data fusion is a key issue in this hybrid system and two different data fusion methods are proposed. The first is a deterministic method based on arm structure and geometry information, which is suitable for simple rehabilitation motions. The second is a probabilistic method based on an Extended Kalman Filter (EKF) in which data from two sensors is fused in a predict-correct manner in order to deal with sensor noise and model inaccuracy. Experimental results are presented and compared with commercial marker-based systems, CODA and Qualisys. They show good performance for the proposed solution. KEY WORDS—sensor fusion, extended Kalman filter, inertial sensor, human motion tracking, home-based rehabilitation
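The predict-correct fusion the abstract describes can be sketched for a single joint angle. This is a deliberately simplified, hypothetical state: the paper's actual filter tracks full articulated 3D arm pose, and the noise values below are invented for illustration.

```python
import numpy as np

def predict(x, P, gyro_rate, dt, q):
    """Propagate the joint angle using the inertial (gyro) rate."""
    x_pred = x + gyro_rate * dt   # state transition (linear in this toy case)
    P_pred = P + q                # process noise inflates uncertainty
    return x_pred, P_pred

def correct(x_pred, P_pred, vision_angle, r):
    """Fuse the visual measurement of the same angle."""
    K = P_pred / (P_pred + r)     # Kalman gain (scalar case)
    x = x_pred + K * (vision_angle - x_pred)
    P = (1.0 - K) * P_pred
    return x, P

# Synthetic gyro-rate / camera-angle sample pairs (made up for the demo).
x, P = 0.0, 1.0
for gyro, cam in [(0.5, 0.06), (0.4, 0.10)]:
    x, P = predict(x, P, gyro_rate=gyro, dt=0.1, q=0.01)
    x, P = correct(x, P, vision_angle=cam, r=0.05)
```

Each cycle the inertial rate drives the prediction (fast, but drift-prone), and the visual angle pulls the estimate back in the correction step, shrinking the covariance `P`.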

Virtual Reality System with Integrated Sound Field Simulation and Reproduction

by Tobias Lentz, Dirk Schröder, Michael Vorländer, Ingo Assenmacher , 2007
Abstract - Cited by 17 (1 self)
A real-time audio rendering system is introduced which combines a full room-specific simulation, dynamic crosstalk cancellation, and multitrack binaural synthesis for virtual acoustical imaging. The system is applicable for any room shape (normal, long, flat, coupled), independent of the a priori assumption of a diffuse sound field. This provides the possibility of simulating indoor or outdoor spatially distributed, freely movable sources and a moving listener in virtual environments. In addition to that, near-to-head sources can be simulated by using measured near-field HRTFs. The reproduction component consists of a headphone-free reproduction by dynamic crosstalk cancellation. The focus of the project is mainly on the integration and interaction of all involved subsystems. It is demonstrated that the system is capable of real-time room simulation and reproduction and, thus, can be used as a reliable platform for further research on VR applications.

Efficient camera motion and 3D recovery using an inertial sensor

by Martin Labrie - In Computer and Robot Vision, 2007. CRV ’07. Fourth Canadian Conference on , 2007
Abstract - Cited by 5 (0 self)
This paper presents a system for 3D reconstruction using a camera combined with an inertial sensor. The system mainly exploits the orientation obtained from the inertial sensor in order to accelerate and improve the matching process between wide baseline images. The orientation further contributes to incremental 3D reconstruction of a set of feature points from linear equation systems. The processing can be performed online while using consecutive groups of three images overlapping each other. Classic or incremental bundle adjustment is applied to improve the quality of the model. Test validation has been performed on object and camera centric sequences.
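The idea of recovering structure from linear equation systems once each camera's orientation is known can be illustrated with a standard DLT triangulation sketch. The poses, the 3D point, and the identity intrinsics below are synthetic assumptions, not the paper's implementation.

```python
import numpy as np

def triangulate(Rs, ts, uvs):
    """DLT triangulation from known poses (R, t) and normalized pixel coords."""
    A = []
    for R, t, (u, v) in zip(Rs, ts, uvs):
        P = np.hstack([R, t.reshape(3, 1)])  # 3x4 projection matrix (K = I)
        A.append(u * P[2] - P[0])            # each view contributes two
        A.append(v * P[2] - P[1])            # linear constraints on X
    _, _, Vt = np.linalg.svd(np.array(A))
    X = Vt[-1]                               # null-space vector of A
    return X[:3] / X[3]                      # dehomogenize

# Two views with identical (IMU-supplied) orientation, 1 m baseline.
R = np.eye(3)
X_true = np.array([0.2, -0.1, 4.0])
ts = [np.zeros(3), np.array([-1.0, 0.0, 0.0])]
uvs = []
for t in ts:
    p = R @ X_true + t                       # project the synthetic point
    uvs.append((p[0] / p[2], p[1] / p[2]))
X_est = triangulate([R, R], ts, uvs)
```

With rotation supplied by the inertial sensor, only translation and structure remain unknown, which is what keeps the system of equations linear.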

Citation Context

... of images. It is assumed that the intrinsic parameters of the camera are known and fixed. A few studies have been conducted on this combination in the fields of mobile robotics and augmented reality [6, 7, 8, 9]. Our work is mostly inspired by the more recent research of Okatani and Deguchi who demonstrated in [10] that the orientation provided by an inertial sensor could be efficiently used to calculate the...

A Novel Sensing and Data Fusion System for 3-D Arm Motion Tracking in Telerehabilitation

by Yaqin Tao, Huosheng Hu
Abstract - Cited by 5 (1 self)
Abstract—In this paper, we present a novel sensing and data fusion system to track 3-D arm motion in a telerehabilitation program. A particle filter (PF) algorithm is adopted in the system to fuse data from inertial and visual sensors in a probabilistic manner. It is able to propagate multimodal distributions of system states based on an “importance sampling” technique by using sets of weighted particles. To avoid the problem of conventional PF algorithms that suffer from particle degeneracy and perform poorly in a narrow distribution situation, we adopt two strategies in our system, namely state space pruning and an arm physical geometry constraint. Experimental results show that the proposed PF framework outperforms other fusion methods and provides accurate results in comparison to the ground truth. Index Terms—Biomedical measurements, particle filter (PF), sensor fusion, telerehabilitation, upper limb pose estimation, 3-D arm motion tracking.
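A toy version of the weighted-particle idea, with a crude stand-in for the state-space pruning step, might look as follows. The scalar "joint angle" state, noise levels, and joint limits are all invented for illustration; nothing here reproduces the paper's actual arm model.

```python
import numpy as np

rng = np.random.default_rng(0)

def pf_step(particles, weights, meas, motion_std=0.05, meas_std=0.1,
            limits=(-np.pi / 2, np.pi / 2)):
    # propagate particles with a diffusion motion model
    particles = particles + rng.normal(0.0, motion_std, particles.size)
    # state-space pruning: zero-weight particles violating joint limits
    valid = (particles >= limits[0]) & (particles <= limits[1])
    weights = weights * valid
    # importance weights from the visual measurement likelihood
    weights = weights * np.exp(-0.5 * ((meas - particles) / meas_std) ** 2)
    weights /= weights.sum()
    # systematic resampling to fight particle degeneracy
    positions = (np.arange(weights.size) + rng.random()) / weights.size
    idx = np.searchsorted(np.cumsum(weights), positions)
    idx = np.minimum(idx, weights.size - 1)
    particles = particles[idx]
    weights = np.full(weights.size, 1.0 / weights.size)
    return particles, weights

particles = rng.uniform(-1.0, 1.0, 500)
weights = np.full(500, 1.0 / 500)
for meas in [0.3, 0.32, 0.35]:      # synthetic vision measurements
    particles, weights = pf_step(particles, weights, meas)
estimate = particles.mean()
```

The pruning step discards physically impossible hypotheses before the likelihood is applied, so the particle budget is spent only on states the arm geometry actually allows.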

Object-centered Feature Selection for Weakly-Unsupervised Object Categorization

by Christoph Stock, Markus Lambrecht, Andreas Opelt, Axel Pinz
Abstract
We describe a novel approach of spatio-temporal mapping of local image features, to reduce the number of input data for further object categorization. The main focus of our work is the selection of good features to learn, by achieving a precise mapping of image features either related to static objects or to background. This can be done by initial camera motion estimation, subsequent structure estimation and final clustering of the 3D points. Experimental results show that our method achieves a significant reduction of processed image features, which yields a better performance in subsequent learning modules.

Citation Context

...ion, we yield higher computational speed and additionally force the reliability and robustness of the estimated camera pose (wrong local minima will be suppressed). 3 Structure Estimation Chai et al. [1] present a structure and motion estimation scheme, which works offline with manually selected point correspondences. Other works, like [8] are able to compute the structure information up to a scale fa...

Autonomous Vehicle Video Aided Navigation – Coupling INS and Video Approaches

by Chris Baker, Chris Debrunner, Sean Gooding, William Hoff, William Severson
Abstract
Abstract. As autonomous vehicle systems become more prevalent, their navigation capabilities become increasingly critical. Currently most systems rely on a combined GPS/INS solution for vehicle pose computation, while some systems use a video-based approach. One problem with a GPS/INS approach is the possible loss of GPS data, especially in urban environments. Using only INS in this case causes significant drift in the computed pose. The video-based approach is not always reliable due to its heavy dependence on image texture. Our approach to autonomous vehicle navigation exploits the best of both of these by coupling an outlier-robust video-based solution with INS when GPS is unavailable. This allows accurate computation of the system’s current pose in these situations. In this paper we describe our system design and provide an analysis of its performance, using simulated data with a range of different noise levels.
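The drift problem the abstract refers to is easy to demonstrate numerically: dead reckoning with a small uncorrected bias diverges, while periodic absolute fixes (standing in here for the video-based pose; all magnitudes are invented for the demo) keep the error bounded.

```python
import numpy as np

rng = np.random.default_rng(1)

true_pos = 0.0
ins_pos = 0.0      # pure INS dead reckoning
aided_pos = 0.0    # INS corrected by periodic absolute fixes
bias = 0.02        # small uncorrected accelerometer-style bias

for step in range(1, 201):
    true_pos += 0.1
    # each integrated increment carries the bias plus white noise
    noisy_increment = 0.1 + bias + rng.normal(0.0, 0.005)
    ins_pos += noisy_increment           # drift accumulates without bound
    aided_pos += noisy_increment
    if step % 20 == 0:                   # a "video pose fix" every 20 steps
        aided_pos = true_pos + rng.normal(0.0, 0.01)

ins_error = abs(ins_pos - true_pos)      # grows roughly linearly with time
aided_error = abs(aided_pos - true_pos)  # stays near the fix accuracy
```

After 200 steps the bias alone contributes about 4 units of INS-only error, while the aided track never drifts further than the interval between fixes allows, which is the qualitative behavior the coupled INS/video design exploits.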

Inertial sensors for motion detection of human upper limbs

by Huiyu Zhou, Huosheng Hu

Abstract

Motion estimation from image and inertial measurements

by Dennis W. Strelow, 2004

Abstract

Citation Context

...ential Monte Carlo framework. In this case, the authors showed that the inclusion of gyro measurements significantly reduced the number of samples required for accurate motion estimation. Chai, et al. [9] describe a system for simultaneously estimating the motion of a sensor rig and the sparse structure of the environment in which the rig moves, from gyro, accelerometer, and image measurements. This s...

