Example-based automatic music-driven conventional dance motion synthesis

We introduce a novel method for synthesizing dance motions that follow the emotions and contents of a piece of music. Our method employs a learning-based approach to model the music to motion mapping relationship embodied in example dance motions along with those motions' accompanying background music. A key step in our method is to train a music to motion matching quality rating function through learning the music to motion mapping relationship exhibited in synchronized music and dance motion data, which were captured from professional human dance performance. To generate an optimal sequence of dance motion segments to match with a piece of music, we introduce a constraint-based dynamic programming procedure. This procedure considers both music to motion matching quality and visual smoothness of a resultant dance motion sequence. We also introduce a two-way evaluation strategy, coupled with a GPU-based implementation, through which we can execute the dynamic programming process in parallel, resulting in significant speedup. To evaluate the effectiveness of our method, we quantitatively compare the dance motions synthesized by our method with motion synthesis results by several peer methods using the motions captured from professional human dancers' performance as the gold standard. We also conducted several medium-scale user studies to explore how perceptually our dance motion synthesis method can outperform existing methods in synthesizing dance motions to match with a piece of music. These user studies produced very positive results on our music-driven dance motion synthesis experiments for several Asian dance genres, confirming the advantages of our method.
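The selection step described in the abstract, choosing one motion segment per music segment by dynamic programming over a matching score and a transition-smoothness penalty, can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: match_score and transition_cost are hypothetical stand-ins for the paper's learned music-to-motion rating function and its visual-smoothness term, and the paper's additional constraints and GPU parallelization are omitted.

# Minimal sketch (not the authors' code) of segment selection by dynamic
# programming: pick one motion segment per music segment so that the summed
# music-to-motion match score is high while transitions between consecutive
# motion segments stay smooth. match_score() and transition_cost() are
# hypothetical stand-ins supplied by the caller.
def synthesize(music_segments, motion_segments, match_score, transition_cost):
    n, m = len(music_segments), len(motion_segments)
    # best[i][j]: best total score when music segment i is danced with motion segment j
    best = [[float("-inf")] * m for _ in range(n)]
    back = [[None] * m for _ in range(n)]

    for j in range(m):
        best[0][j] = match_score(music_segments[0], motion_segments[j])

    for i in range(1, n):
        for j in range(m):
            local = match_score(music_segments[i], motion_segments[j])
            for k in range(m):  # candidate previous motion segment
                cand = best[i - 1][k] + local - transition_cost(
                    motion_segments[k], motion_segments[j])
                if cand > best[i][j]:
                    best[i][j], back[i][j] = cand, k

    # Trace back the highest-scoring motion segment sequence.
    j = max(range(m), key=lambda j: best[n - 1][j])
    path = [j]
    for i in range(n - 1, 0, -1):
        j = back[i][j]
        path.append(j)
    return [motion_segments[j] for j in reversed(path)]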


Bibliographic details
Published in: IEEE transactions on visualization and computer graphics. - 1996. - 18(2012), 3, 01 March, pages 501-15
Main author: Fan, Rukun (Author)
Other authors: Xu, Songhua, Geng, Weidong
Format: Online article
Language: English
Published: 2012
Parent work: IEEE transactions on visualization and computer graphics
Subjects: Journal Article; Research Support, Non-U.S. Gov't; Research Support, U.S. Gov't, Non-P.H.S.
LEADER 01000naa a22002652 4500
001 NLM207722757
003 DE-627
005 20231224002704.0
007 cr uuu---uuuuu
008 231224s2012 xx |||||o 00| ||eng c
024 7 |a 10.1109/TVCG.2011.73  |2 doi 
028 5 2 |a pubmed24n0692.xml 
035 |a (DE-627)NLM207722757 
035 |a (NLM)21519104 
040 |a DE-627  |b ger  |c DE-627  |e rakwb 
041 |a eng 
100 1 |a Fan, Rukun  |e verfasserin  |4 aut 
245 1 0 |a Example-based automatic music-driven conventional dance motion synthesis 
264 1 |c 2012 
336 |a Text  |b txt  |2 rdacontent 
337 |a Computermedien  |b c  |2 rdamedia 
338 |a Online-Ressource  |b cr  |2 rdacarrier 
500 |a Date Completed 28.06.2012 
500 |a Date Revised 07.12.2022 
500 |a published: Print 
500 |a Citation Status MEDLINE 
520 |a We introduce a novel method for synthesizing dance motions that follow the emotions and contents of a piece of music. Our method employs a learning-based approach to model the music to motion mapping relationship embodied in example dance motions along with those motions' accompanying background music. A key step in our method is to train a music to motion matching quality rating function through learning the music to motion mapping relationship exhibited in synchronized music and dance motion data, which were captured from professional human dance performance. To generate an optimal sequence of dance motion segments to match with a piece of music, we introduce a constraint-based dynamic programming procedure. This procedure considers both music to motion matching quality and visual smoothness of a resultant dance motion sequence. We also introduce a two-way evaluation strategy, coupled with a GPU-based implementation, through which we can execute the dynamic programming process in parallel, resulting in significant speedup. To evaluate the effectiveness of our method, we quantitatively compare the dance motions synthesized by our method with motion synthesis results by several peer methods using the motions captured from professional human dancers' performance as the gold standard. We also conducted several medium-scale user studies to explore how perceptually our dance motion synthesis method can outperform existing methods in synthesizing dance motions to match with a piece of music. These user studies produced very positive results on our music-driven dance motion synthesis experiments for several Asian dance genres, confirming the advantages of our method 
650 4 |a Journal Article 
650 4 |a Research Support, Non-U.S. Gov't 
650 4 |a Research Support, U.S. Gov't, Non-P.H.S. 
700 1 |a Xu, Songhua  |e verfasserin  |4 aut 
700 1 |a Geng, Weidong  |e verfasserin  |4 aut 
773 0 8 |i Enthalten in  |t IEEE transactions on visualization and computer graphics  |d 1996  |g 18(2012), 3 vom: 01. März, Seite 501-15  |w (DE-627)NLM098269445  |x 1941-0506  |7 nnns 
773 1 8 |g volume:18  |g year:2012  |g number:3  |g day:01  |g month:03  |g pages:501-15 
856 4 0 |u http://dx.doi.org/10.1109/TVCG.2011.73  |3 Volltext 
912 |a GBV_USEFLAG_A 
912 |a SYSFLAG_A 
912 |a GBV_NLM 
912 |a GBV_ILN_350 
951 |a AR 
952 |d 18  |j 2012  |e 3  |b 01  |c 03  |h 501-15