Data-driven biped control
- In SIGGRAPH ’10: ACM SIGGRAPH 2010 papers (2010), ACM
"... Figure 1: Our data-driven controller allows the physically-simulated biped character to reproduce challenging motor skills captured in motion data. We present a dynamic controller to physically simulate under-actuated three-dimensional full-body biped locomotion. Our data-driven controller takes mot ..."
Abstract
-
Cited by 30 (0 self)
- Add to MetaCart
Figure 1: Our data-driven controller allows the physically simulated biped character to reproduce challenging motor skills captured in motion data.
We present a dynamic controller to physically simulate under-actuated three-dimensional full-body biped locomotion. Our data-driven controller takes motion capture reference data to reproduce realistic human locomotion through real-time physically based simulation. The key idea is modulating the reference trajectory continuously and seamlessly such that even a simple dynamic tracking controller can follow the reference trajectory while maintaining its balance. In our framework, biped control can be facilitated by a large array of existing data-driven animation techniques because our controller can take a stream of reference data generated on the fly at runtime. We demonstrate the effectiveness of our approach through examples that allow bipeds to turn, spin, and walk while steering their direction interactively.
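The tracking idea above lends itself to a compact sketch. Below is a minimal Python illustration, assuming joint-space PD tracking and a hypothetical `modulate_reference` stand-in for the paper's continuous reference modulation; none of these names come from the authors' code.

```python
import numpy as np

def pd_tracking_torques(q, qdot, q_ref, qdot_ref, kp, kd):
    """Simple joint-space PD tracker: pull the simulated pose toward the
    (possibly modulated) reference pose and velocity."""
    return kp * (q_ref - q) + kd * (qdot_ref - qdot)

def modulate_reference(q_ref, balance_feedback, gain=0.1):
    """Hypothetical stand-in for the paper's key idea: continuously nudge the
    reference toward a balance-correcting pose so that even the simple PD
    tracker above can follow it without falling."""
    return q_ref + gain * balance_feedback
```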
Sampling-based Contact-rich Motion Control
"... (a) A forward roll transformed to a dive roll. (b) A cartwheel retargeted to an Asimo-like robot. (c) A walk transformed onto a balance beam. Figure 1: Physically based motion transformation and retargeting. Human motions are the product of internal and external forces, but these forces are very dif ..."
Abstract
-
Cited by 23 (7 self)
- Add to MetaCart
Figure 1: Physically based motion transformation and retargeting. (a) A forward roll transformed to a dive roll. (b) A cartwheel retargeted to an Asimo-like robot. (c) A walk transformed onto a balance beam.
Human motions are the product of internal and external forces, but these forces are very difficult to measure in a general setting. Given a motion capture trajectory, we propose a method to reconstruct its open-loop control and the implicit contact forces. The method employs a strategy based on randomized sampling of the control within user-specified bounds, coupled with forward dynamics simulation. Sampling-based techniques are well suited to this task because of their lack of dependence on derivatives, which are difficult to estimate in contact-rich scenarios. They are also easy to parallelize, which we exploit in our implementation on a compute cluster. We demonstrate reconstruction of a diverse set of captured motions, including walking, running, and contact-rich tasks such as rolls and kip-up jumps. We further show how the method can be applied to physically based motion transformation and retargeting, physically plausible motion variations, and reference-trajectory-free idling motions. Alongside the successes, we point out a number of limitations and directions for future work.
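As a rough illustration of the sampling strategy, here is a greedy per-frame simplification in Python: sample control offsets within user bounds, run the forward dynamics one frame per sample, and keep the best match to the capture. The paper's actual sampler is more elaborate; `sim_step` and all names here are assumptions.

```python
import numpy as np

def reconstruct_control(sim_step, mocap, n_samples=200, bound=0.2, rng=None):
    """Greedy per-frame simplification of derivative-free control reconstruction.

    sim_step(state, u) -> next state is a user-supplied forward dynamics step
    (contacts included); mocap is a list of target pose vectors.
    """
    rng = rng or np.random.default_rng(0)
    state = mocap[0].copy()
    controls = []
    for target in mocap[1:]:
        # Sample candidate controls uniformly within user-specified bounds.
        candidates = rng.uniform(-bound, bound, size=(n_samples, len(target)))
        outcomes = [sim_step(state, u) for u in candidates]
        errors = [np.linalg.norm(s - target) for s in outcomes]
        best = int(np.argmin(errors))          # keep the best-matching sample
        controls.append(candidates[best])
        state = outcomes[best]
    return controls
```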
Velocity-based modeling of physical interactions in multi-agent simulations in dense crowds
, 2013
"... We present an interactive algorithm to model physics-based interac-tions in multi-agent simulations. Our approach is capable of model-ing both physical forces and interactions between agents and obsta-cles, while allowing the agents to anticipate and avoid collisions for local navigation. We combine ..."
Abstract
-
Cited by 5 (1 self)
- Add to MetaCart
We present an interactive algorithm to model physics-based interactions in multi-agent simulations. Our approach is capable of modeling both physical forces and interactions between agents and obstacles, while allowing the agents to anticipate and avoid collisions for local navigation. We combine velocity-based collision-avoidance algorithms with external physical forces. The overall formulation can approximately simulate various physical effects, including collisions, pushing, deceleration, and resistive forces. We have integrated our approach with an open-source physics engine and use the resulting system to model plausible behaviors of and interactions among large numbers of agents in dense environments. Our algorithm can simulate a few thousand agents at interactive rates and can generate many emergent behaviors. The overall approach is useful for interactive applications that require plausible physical behavior, including games and virtual worlds.
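A minimal sketch of the combination described above, with a crude repulsive steering term standing in for a real velocity-obstacle solver; the function names and the 70 kg default mass are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

def step_agent(pos, vel, v_pref, force_ext, neighbors, dt, mass=70.0):
    """One integration step blending velocity-based avoidance with physics:
    compute a collision-avoiding velocity, then superimpose the acceleration
    from external physical forces."""
    v_avoid = v_pref.copy()
    for n_pos in neighbors:
        offset = pos - n_pos
        d = np.linalg.norm(offset)
        if 1e-6 < d < 2.0:                      # only react to nearby agents
            v_avoid += offset / (d * d)         # steer away, stronger when close
    vel = v_avoid + (force_ext / mass) * dt     # add the physical response
    return pos + vel * dt, vel
```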
An Analysis of Motion Blending Techniques
"... Abstract. Motion blending is a widely used technique for character animation. The main idea is to blend similar motion examples according to blending weights, in order to synthesize new motions parameterizing high level characteristics of interest. We present in this paper an in-depth analysis and c ..."
Abstract
-
Cited by 3 (1 self)
- Add to MetaCart
(Show Context)
Motion blending is a widely used technique for character animation. The main idea is to blend similar motion examples according to blending weights, in order to synthesize new motions parameterizing high-level characteristics of interest. We present in this paper an in-depth analysis and comparison of four motion blending techniques: barycentric interpolation, radial basis functions, k-nearest neighbors, and inverse blending optimization. Comparison metrics were designed to measure the performance across different motion categories on criteria including smoothness, parametric error, and computation time. We have implemented each method in our character animation platform SmartBody, and we present several visualization renderings that provide a window for gleaning insights into the underlying pros and cons of each method in an intuitive way.
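Two of the compared weighting schemes are easy to sketch. The fragment below assumes examples are points in a blend-parameter space and motions are time-aligned (frames x DoFs) arrays; it is an illustration, not the SmartBody implementation.

```python
import numpy as np

def rbf_weights(query, examples, sigma=1.0):
    """Radial-basis blending weights from distances in parameter space."""
    d = np.linalg.norm(examples - query, axis=1)
    w = np.exp(-(d / sigma) ** 2)
    return w / w.sum()

def knn_weights(query, examples, k=4):
    """K-nearest-neighbor weights: inverse distance over the k closest examples."""
    d = np.linalg.norm(examples - query, axis=1)
    idx = np.argsort(d)[:k]
    w = np.zeros(len(examples))
    w[idx] = 1.0 / (d[idx] + 1e-9)
    return w / w.sum()

def blend(motions, weights):
    """Weighted sum of time-aligned example motions (frames x DoFs arrays)."""
    return sum(w * m for w, m in zip(weights, motions))
```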
Push it real: Perceiving causality in virtual interactions
- ACM Transactions on Graphics
, 2012
"... Copyright Notice Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profi t or direct commercial advantage and that copies show this notice on the fi rst page or initial scree ..."
Abstract
-
Cited by 3 (0 self)
- Add to MetaCart
Copyright Notice Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profi t or direct commercial advantage and that copies show this notice on the fi rst page or initial screen of a display along with the full citation. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, to redistribute to lists, or to use any component of this work in other works requires prior specifi c permission and/or a fee. Permissions may be
Motion adaptation for humanoid robots in constrained environments
- In ICRA
, 2013
"... Abstract—This paper presents a new method to synthesize full body motion for controlling humanoid robots in highly constrained environments. Given a reference motion of the robot and the corresponding environment configuration, the spatial relationships between the robot body parts and the environme ..."
Abstract
-
Cited by 3 (2 self)
- Add to MetaCart
(Show Context)
This paper presents a new method to synthesize full-body motion for controlling humanoid robots in highly constrained environments. Given a reference motion of the robot and the corresponding environment configuration, the spatial relationships between the robot's body parts and the environment objects are extracted as a representation called the Interaction Mesh. This representation is then used to adapt the reference motion to an altered environment. By preserving the spatial relationships while satisfying physical constraints, collision-free and well-balanced motions can be generated automatically and efficiently. Experimental results show that the proposed method can adapt different full-body motions to significantly modified environments. Our method can be applied to precise robotic control in complicated environments, such as rescue robots in accident scenes and search robots in highly constrained spaces.
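The relationship-preservation idea can be sketched with Laplacian coordinates over the interaction mesh: each vertex is encoded relative to its neighbors, and adaptation minimizes drift from the reference encoding. The code below is a simplified illustration (collision and balance constraints omitted), not the authors' implementation.

```python
import numpy as np

def laplacian_coords(points, neighbors):
    """Laplacian coordinate of each interaction-mesh vertex: its offset from
    the centroid of its connected vertices."""
    return np.array([points[i] - points[neighbors[i]].mean(axis=0)
                     for i in range(len(points))])

def adaptation_energy(new_points, ref_lap, neighbors):
    """Deformation energy: how far the adapted mesh's Laplacian coordinates
    drift from the reference motion's. Minimizing this, subject to the
    constraints omitted here, preserves the spatial relationships."""
    new_lap = laplacian_coords(new_points, neighbors)
    return float(np.sum((new_lap - ref_lap) ** 2))
```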
Social-Event-Driven Camera Control for Multicharacter Animations
"... Abstract—In a virtual world, a group of virtual characters can interact with each other, and these characters may leave a group to join another. The interaction among individuals and groups often produces interesting events in a sequence of animation. The goal of this paper is to discover social eve ..."
Abstract
-
Cited by 3 (0 self)
- Add to MetaCart
(Show Context)
In a virtual world, a group of virtual characters can interact with each other, and these characters may leave a group to join another. The interaction among individuals and groups often produces interesting events in a sequence of animation. The goal of this paper is to discover social events involving mutual interactions or group activities in multicharacter animations and to automatically plan a smooth camera motion to view interesting events suggested by our system or relevant events specified by a user. Inspired by sociology studies, we borrow knowledge from proxemics, social force, and social network analysis to model the dynamic relation among social events and the relation among the participants within each event. By analyzing the variation of relation strength among participants and the spatiotemporal correlation among events, we discover salient social events in a motion clip and generate an overview video of these events with smooth camera motion using a simulated annealing optimization method. We tested our approach on different motions performed by multiple characters. Our user study shows that our results are preferred in 66.19 percent of the comparisons with those by the camera control approach without event analysis and are comparable (51.79 percent) to professional results by an artist. Index Terms: MOCAP, multicharacter animation, event analysis, social network analysis.
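The camera-planning step uses simulated annealing; a generic annealing skeleton over camera keyframes might look like the following, with `score` standing in for the paper's visibility-and-smoothness objective. All names and the cooling schedule are assumptions.

```python
import numpy as np

def anneal_camera(score, init_path, iters=5000, t0=1.0, rng=None):
    """Simulated-annealing skeleton for camera path optimization.

    score(path) -> float rewards the desired camera behavior; init_path is a
    (keyframes x params) array. Perturbations jitter one keyframe at a time.
    """
    rng = rng or np.random.default_rng(0)
    path = init_path.copy()
    s = score(path)
    best, best_s = path.copy(), s
    for i in range(iters):
        t = t0 * (1.0 - i / iters)              # linear cooling schedule
        cand = path.copy()
        k = rng.integers(len(cand))
        cand[k] += rng.normal(scale=0.05, size=cand[k].shape)
        cs = score(cand)
        # Metropolis rule: always accept improvements, sometimes accept worse.
        if cs > s or rng.random() < np.exp((cs - s) / max(t, 1e-6)):
            path, s = cand, cs
            if s > best_s:
                best, best_s = path.copy(), s
    return best
```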
Feature-Based Locomotion with Inverse Branch Kinematics
"... Abstract. We propose a novel Inverse Kinematics based deformation method that introduces flexibility and parameterization to motion graphs without degrading the quality of the synthesized motions. Our method deforms the transitions of a motion graph-like structure by first assigning to each transiti ..."
Abstract
-
Cited by 2 (2 self)
- Add to MetaCart
(Show Context)
We propose a novel inverse kinematics based deformation method that introduces flexibility and parameterization to motion graphs without degrading the quality of the synthesized motions. Our method deforms the transitions of a motion graph-like structure by first assigning to each transition a continuous rotational range that is guaranteed not to exceed the predefined global transition cost threshold. The deformation procedure improves the reachability of motion graphs to precise locations and consequently reduces the time spent during search. Furthermore, our method includes a new motion graph construction method based on geometrical segmentation features, and employs a fast triangulation-based search pruning technique that confines the search to a free channel and avoids expensive collision checking. The results obtained by the proposed methods were evaluated and quantified, and they demonstrate significant improvements in comparison with traditional motion graph approaches.
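To make the transition-deformation idea concrete, the sketch below assigns a rotational range from a hypothetical linear cost model and blends a rotation into a transition's root trajectory; the paper's actual cost metric and blending are more involved.

```python
import numpy as np

def rotation_range(transition_cost, cost_threshold, cost_per_radian):
    """Largest symmetric rotational deformation a transition can absorb before
    its total cost would exceed the global threshold. A simplified linear cost
    model stands in for the paper's cost metric."""
    slack = cost_threshold - transition_cost
    return max(0.0, slack / cost_per_radian)

def deform_transition(root_headings, max_angle, alpha):
    """Rotate a transition's root heading trajectory by alpha, clamped to
    [-max_angle, max_angle], ramping the rotation in gradually so the
    deformation stays continuous across the transition."""
    alpha = np.clip(alpha, -max_angle, max_angle)
    ramp = np.linspace(0.0, 1.0, len(root_headings))
    return root_headings + ramp * alpha
```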
Component-based Locomotion Composition
- In Eurographics / ACM SIGGRAPH Symposium on Computer Animation (2012)
"... Figure 1: Diverse locomotion outputs (in a happy style) synthesized by a linear mixture of combinatorial components decomposed from the 10 example set. When generating locomotion, it is particularly challenging to adjust the motion’s style. This paper introduces a component-based system for human lo ..."
Abstract
- Add to MetaCart
(Show Context)
Figure 1: Diverse locomotion outputs (in a happy style) synthesized by a linear mixture of combinatorial components decomposed from the 10-example set.
When generating locomotion, it is particularly challenging to adjust the motion's style. This paper introduces a component-based system for human locomotion composition that is driven by a set of example locomotion clips. The distinctive style of each example is analyzed in the form of sub-motion components decomposed from separate body parts via independent component analysis (ICA). During the synthesis process, we use these components as combinatorial ingredients to generate new locomotion sequences that are stylistically different from the example set. Our system is designed for novice users who do not have much knowledge of important locomotion properties, such as the correlations throughout the body. Thus, the proposed system analyzes the examples in an unsupervised manner and synthesizes an output locomotion from a small number of control parameters. Our experimental results show that the system can generate physically plausible locomotion in a desired style at interactive speed.
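The decompose-and-recombine pipeline can be approximated in a few lines with scikit-learn's FastICA standing in for the paper's ICA step; the data here is a random placeholder and the component weighting is purely illustrative.

```python
import numpy as np
from sklearn.decomposition import FastICA

# X: example locomotion clips stacked as (frames x DoFs) after time alignment.
rng = np.random.default_rng(0)
X = rng.standard_normal((600, 30))          # placeholder for real mocap data

ica = FastICA(n_components=8, random_state=0)
S = ica.fit_transform(X)                    # sub-motion components per frame

# Recombine: scale the components (the "combinatorial ingredients") and map
# the mixture back to joint space to get a stylistically altered output.
style_weights = np.ones(8)
style_weights[2] = 1.5                      # exaggerate one component
X_new = ica.inverse_transform(S * style_weights)
```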