OSTI.GOV | U.S. Department of Energy
Office of Scientific and Technical Information

Title: Example-Based Automatic Music-Driven Conventional Dance Motion Synthesis

Abstract

We introduce a novel method for synthesizing dance motions that follow the emotions and contents of a piece of music. Our method employs a learning-based approach to model the music to motion mapping relationship embodied in example dance motions along with those motions' accompanying background music. A key step in our method is to train a music to motion matching quality rating function through learning the music to motion mapping relationship exhibited in synchronized music and dance motion data, which were captured from professional human dance performance. To generate an optimal sequence of dance motion segments to match with a piece of music, we introduce a constraint-based dynamic programming procedure. This procedure considers both music to motion matching quality and visual smoothness of a resultant dance motion sequence. We also introduce a two-way evaluation strategy, coupled with a GPU-based implementation, through which we can execute the dynamic programming process in parallel, resulting in significant speedup. To evaluate the effectiveness of our method, we quantitatively compare the dance motions synthesized by our method with motion synthesis results by several peer methods using the motions captured from professional human dancers' performance as the gold standard. We also conducted several medium-scale user studies to explore how perceptually our dance motion synthesis method can outperform existing methods in synthesizing dance motions to match with a piece of music. These user studies produced very positive results on our music-driven dance motion synthesis experiments for several Asian dance genres, confirming the advantages of our method.
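
The constraint-based dynamic programming step described in the abstract lends itself to a short illustration. Below is a minimal sketch of that idea only, not the authors' implementation: match_quality stands in for the learned music-to-motion rating function, transition_cost for the visual-smoothness term, and smoothness_weight is an assumed trade-off parameter; all three names are hypothetical.

    # Hypothetical sketch of the dynamic programming idea: choose one motion
    # segment per music segment so that total matching quality is high while
    # transitions between consecutive motion segments stay smooth.
    # match_quality, transition_cost, and smoothness_weight are placeholder
    # stand-ins, not the paper's actual functions or parameters.

    def synthesize_dance(music_segments, motion_segments,
                         match_quality, transition_cost, smoothness_weight=1.0):
        """Return one motion-segment index per music segment, maximizing
        sum(match_quality) - smoothness_weight * sum(transition_cost)."""
        n, m = len(music_segments), len(motion_segments)
        NEG_INF = float("-inf")
        # best[t][j]: best score for music segment t ending with motion segment j
        best = [[NEG_INF] * m for _ in range(n)]
        back = [[-1] * m for _ in range(n)]

        for j in range(m):
            best[0][j] = match_quality(music_segments[0], motion_segments[j])

        for t in range(1, n):
            for j in range(m):
                local = match_quality(music_segments[t], motion_segments[j])
                for i in range(m):
                    score = (best[t - 1][i] + local
                             - smoothness_weight
                             * transition_cost(motion_segments[i], motion_segments[j]))
                    if score > best[t][j]:
                        best[t][j] = score
                        back[t][j] = i

        # Trace back the optimal sequence of motion-segment indices.
        j = max(range(m), key=lambda k: best[n - 1][k])
        path = [j]
        for t in range(n - 1, 0, -1):
            j = back[t][j]
            path.append(j)
        return list(reversed(path))

The inner maximization over candidate transitions is independent for each (t, j) pair, which is what makes the parallel, GPU-based evaluation mentioned in the abstract possible; the sketch above evaluates it serially for clarity.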

Authors:
 Xu, Songhua [1];  Fan, Rukun [2];  Geng, Weidong [3]
  1. ORNL
  2. University of North Carolina, Chapel Hill
  3. Zhejiang University
Publication Date:
2011-04-21
Research Org.:
Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
Sponsoring Org.:
USDOE Laboratory Directed Research and Development (LDRD) Program
OSTI Identifier:
1036181
DOE Contract Number:  
AC05-00OR22725
Resource Type:
Journal Article
Journal Name:
IEEE Transactions on Visualization and Computer Graphics
Additional Journal Information:
Journal Volume: 18; Journal Issue: 3; Journal ID: ISSN 1077-2626
Publisher:
IEEE
Country of Publication:
United States
Language:
English
Subject:
99 GENERAL AND MISCELLANEOUS//MATHEMATICS, COMPUTING, AND INFORMATION SCIENCE; DYNAMIC PROGRAMMING; EVALUATION; GOLD; IMPLEMENTATION; LEARNING; PERFORMANCE; ROUGHNESS; SYNTHESIS

Citation Formats

Xu, Songhua, Fan, Rukun, and Geng, Weidong. Example-Based Automatic Music-Driven Conventional Dance Motion Synthesis. United States: N. p., 2011. Web.
Xu, Songhua, Fan, Rukun, & Geng, Weidong. Example-Based Automatic Music-Driven Conventional Dance Motion Synthesis. United States.
Xu, Songhua, Fan, Rukun, and Geng, Weidong. 2011. "Example-Based Automatic Music-Driven Conventional Dance Motion Synthesis". United States.
@article{osti_1036181,
title = {Example-Based Automatic Music-Driven Conventional Dance Motion Synthesis},
author = {Xu, Songhua and Fan, Rukun and Geng, Weidong},
abstractNote = {We introduce a novel method for synthesizing dance motions that follow the emotions and contents of a piece of music. Our method employs a learning-based approach to model the music to motion mapping relationship embodied in example dance motions along with those motions' accompanying background music. A key step in our method is to train a music to motion matching quality rating function through learning the music to motion mapping relationship exhibited in synchronized music and dance motion data, which were captured from professional human dance performance. To generate an optimal sequence of dance motion segments to match with a piece of music, we introduce a constraint-based dynamic programming procedure. This procedure considers both music to motion matching quality and visual smoothness of a resultant dance motion sequence. We also introduce a two-way evaluation strategy, coupled with a GPU-based implementation, through which we can execute the dynamic programming process in parallel, resulting in significant speedup. To evaluate the effectiveness of our method, we quantitatively compare the dance motions synthesized by our method with motion synthesis results by several peer methods using the motions captured from professional human dancers' performance as the gold standard. We also conducted several medium-scale user studies to explore how perceptually our dance motion synthesis method can outperform existing methods in synthesizing dance motions to match with a piece of music. These user studies produced very positive results on our music-driven dance motion synthesis experiments for several Asian dance genres, confirming the advantages of our method.},
doi = {},
url = {https://www.osti.gov/biblio/1036181},
journal = {IEEE Transactions on Visualization and Computer Graphics},
issn = {1077-2626},
number = 3,
volume = 18,
place = {United States},
year = {2011},
month = {apr}
}