In this paper, we present a method for transforming motion style. We extract style features from the differences between two example motions and apply them to an input motion to produce a styled motion. For example, we extract style features representing "tired" from a walking motion and a tired walking motion; applying these features to a running motion transforms it into a tired running motion. We represent the style features as the difference between two motions in an abstract form, so they can be applied to a wide range of input motions whose motion directions or global translations differ from those of the example motions. The style features consist of the postural and temporal differences between the two example motions. To represent the postural differences, we use the relative positions of primary body parts instead of joint angles. The movements of the pelvis are divided into relative and absolute positions and orientations, and the movements of the end effectors are represented in local coordinates computed from the motion directions. We present experimental results and discuss the effectiveness and limitations of our proposed method.
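The core idea, extracting style as a per-frame difference between two example motions and adding that difference to a new input motion, can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the two example motions are already time-aligned frame by frame, represents each frame as positions of primary body parts relative to the pelvis, and omits the paper's temporal differences and motion-direction-local coordinates. All function and variable names are hypothetical.

```python
# Minimal sketch of difference-based style transfer (illustrative only).
# A motion is a list of frames; each frame maps a primary body part name
# to its (x, y, z) position relative to the pelvis.

def extract_style(neutral, styled):
    """Style features: per-frame, per-part positional offsets
    between a neutral example motion and its styled counterpart.
    Assumes the two motions are time-aligned and equally long."""
    style = []
    for nf, sf in zip(neutral, styled):
        style.append({part: tuple(s - n for n, s in zip(nf[part], sf[part]))
                      for part in nf})
    return style

def apply_style(motion, style):
    """Add the extracted offsets to a new input motion,
    producing a styled version of that motion."""
    out = []
    for frame, offsets in zip(motion, style):
        out.append({part: tuple(p + d for p, d in zip(frame[part], offsets[part]))
                    for part in frame})
    return out

# Example: a "tired" offset learned from walking, applied to running.
walk       = [{"hand": (0.0, 1.0, 0.0)}]
tired_walk = [{"hand": (0.0, 0.8, 0.1)}]   # hand droops and drifts forward
run        = [{"hand": (0.5, 1.2, 0.0)}]

tired_style = extract_style(walk, tired_walk)
tired_run = apply_style(run, tired_style)
```

Because the offsets are relative to the pelvis rather than absolute joint angles, the same style features transfer to an input motion with a different global path, which is the property the paper's relative-position representation is designed to provide.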
- Takuya Terasaki, Masaki Oshita, "Motion Style Transformation by Extracting and Applying Motion Features", International Conference on Computer Animation and Social Agents 2006 (CASA 06), Geneva, Switzerland, July 2006. [PDF]