Computación y Sistemas

On-line version ISSN 2007-9737 · Print version ISSN 1405-5546

Abstract

ARENAS-MENA, Juan Carlos; HAYET, Jean-Bernard and ESTEVES, Claudia. A Motion Capture based Planner for Virtual Characters Navigating in 3D Environment. Comp. y Sist. [online]. 2012, vol.16, n.4, pp.391-407. ISSN 2007-9737.

This work presents a strategy to automatically generate visually believable motions for a virtual character navigating a 3D environment. The approach consists of four components: (1) a state-of-the-art path planner that computes a collision-free reference path for the character's center of mass (COM), using a proposed simplified model that bounds the character's geometry; (2) a segmentation algorithm that divides the path into behaviors; (3) a classifier that matches each behavior against motion-capture segments previously analyzed and stored in a database; and (4) a whole-body motion generator that synthesizes the behavior selected by the classifier. The main contribution of this work is a sampling-based global motion planner that generates different behaviors (in addition to locomotion) arising from environmental constraints. Several results of our algorithm in different environments are shown and its current limitations are discussed.
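
The following Python sketch illustrates how the four-stage pipeline described in the abstract might be organized. All names (plan_com_path, segment_path, classify_behavior, generate_motion), the clearance-based segmentation rule, and the nearest-neighbour classifier are illustrative assumptions, not the authors' implementation.

# Minimal, hypothetical sketch of the four-stage pipeline from the abstract.
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

Point = Tuple[float, float, float]

@dataclass
class Segment:
    points: List[Point]     # COM positions along this part of the path
    clearance: float        # minimum obstacle clearance over the segment

def plan_com_path(start: Point, goal: Point, steps: int = 10) -> List[Point]:
    """(1) Stand-in for a sampling-based planner that would return a
    collision-free path for the character's COM / bounding volume."""
    return [tuple(s + (g - s) * t / steps for s, g in zip(start, goal))
            for t in range(steps + 1)]

def segment_path(path: List[Point],
                 clearance_of: Callable[[Point], float],
                 threshold: float = 0.5) -> List[Segment]:
    """(2) Split the path wherever the obstacle clearance crosses a threshold,
    so each segment maps to a single behavior (e.g. walk vs. crouch)."""
    segments, current = [], [path[0]]
    low = clearance_of(path[0]) < threshold
    for p in path[1:]:
        if (clearance_of(p) < threshold) != low:   # constraint regime changed
            segments.append(Segment(current, min(map(clearance_of, current))))
            current, low = [p], not low
        else:
            current.append(p)
    segments.append(Segment(current, min(map(clearance_of, current))))
    return segments

def classify_behavior(segment: Segment, database: Dict[str, float]) -> str:
    """(3) Pick the stored motion-capture behavior whose typical clearance
    is closest to this segment's clearance (nearest neighbour on one feature)."""
    return min(database, key=lambda name: abs(database[name] - segment.clearance))

def generate_motion(behavior: str, segment: Segment) -> List[str]:
    """(4) Stand-in for whole-body synthesis: attach the chosen behavior
    to every COM sample of the segment."""
    return [f"{behavior} at {p}" for p in segment.points]

if __name__ == "__main__":
    mocap_db = {"walk": 1.0, "crouch": 0.3}        # behavior -> typical clearance
    path = plan_com_path((0.0, 0.0, 0.0), (5.0, 0.0, 0.0))
    clearance = lambda p: 1.0 if p[0] < 2.5 else 0.2   # toy environment
    frames = []
    for seg in segment_path(path, clearance):
        frames += generate_motion(classify_behavior(seg, mocap_db), seg)
    print(frames[:3])

In the paper itself the classifier compares richer features of each behavior against analyzed motion-capture segments; the single clearance feature above is only meant to show where such a comparison would sit in the pipeline.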

Keywords: I.3.7 computing methodologies; computer graphics; three-dimensional graphics and realism; motion planning; character animation; motion-capture classification.


 

All the contents of this journal, except where otherwise noted, are licensed under a Creative Commons Attribution License.