纳金网
Title: Data-driven Texturing of Human Motions
Author: 彬彬
Posted: 2011-12-28 09:11
Abstract
Creating natural-looking human animations is a challenging and time-consuming task, even for skilled animators. As manually generating such motions is very costly, tools for accelerating this process are highly desirable, in particular for pre-visualization or animations involving many characters. In this work, a novel method for fully automated data-driven texturing of motion data is presented. Based on a database containing a large unorganized collection of motion samples (mocap database), we are able to either transform a given "raw" motion according to the characteristic features of the motion clips included in the database (style transfer) or complete a partial animation, e.g. by adding the motion of the upper body if only the legs have been animated (motion completion). By choosing an appropriate database, different artistic goals can be achieved, such as making a motion more natural or more stylized. In contrast to existing approaches like the seminal work by Pullen and Bregler [2002], our method is capable of dealing with arbitrary motion clips without manual steps, i.e. steps involving annotation, segmentation, or classification. As indicated by the examples, our technique is able to synthesize smooth transitions between different motion classes if a large mocap database is available. The results are plausible even in the case of a very coarse input animation missing root translation.
1 Overview
The basic idea of our method is to take advantage of motion samples from large databases to improve a given motion. To this end, for each frame pose of the input motion, matching motion segments a few frames in length are retrieved from the mocap database. For efficient retrieval, a technique called the Online Lazy Neighborhood Graph (OLNG) is employed [Tautges et al. 2011]. In essence, this method is able to identify global temporal similarities based on local neighborhoods in pose space. In a second step, using multigrid optimization techniques, a new motion is synthesized based on the input and the prior information from the database. For our implementation, a skeleton-based pose representation with joints and bones is assumed. However, since the method is directly applicable to other motion data (e.g. positional marker data), this constitutes no general limitation of our approach. In the following, the individual steps of our pipeline are discussed in more detail; a simplified retrieval sketch is given first.
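The retrieval idea can be illustrated with a minimal Python sketch. This is an illustration only: the actual OLNG of [Tautges et al. 2011] incrementally maintains a graph whose edges chain database neighbors across frames to score temporal consistency, whereas the sketch below merely performs the frame-wise k-nearest-neighbor queries and detects such chains naively. The arrays db_poses and input_poses (flattened, normalized joint positions) and the neighborhood size k are assumptions.

import numpy as np
from scipy.spatial import cKDTree

def retrieve_neighborhoods(db_poses, input_poses, k=64):
    """For every input frame, find its k nearest database poses."""
    tree = cKDTree(db_poses)                 # spatial index over all database frames
    dists, idx = tree.query(input_poses, k)  # (T, k) neighbor indices per input frame
    return dists, idx

def temporal_chains(idx):
    """Link a neighbor of frame t to a neighbor of frame t+1 when the two are
    consecutive frames in the database; long chains of such links indicate
    database segments that are globally similar to the input motion."""
    links = []
    for t in range(len(idx) - 1):
        successors = set(idx[t + 1])
        for j in idx[t]:
            if j + 1 in successors:
                links.append((t, j, j + 1))
    return links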
Preprocessing. In a preprocessing step, all mocap data from the prior database is first normalized with respect to global position and orientation [Krüger et al. 2010]. Based on the normalized positional data of all available joints, we then build an efficient spatial indexing structure (a kd-tree) that is required for the OLNG. In addition, linear marker velocities as well as accelerations are stored; these quantities are needed for the subsequent prior-based motion synthesis. A corresponding sketch follows below.
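A minimal sketch of this preprocessing step, assuming each frame is a (J, 3) array of joint positions with the y-axis up and a known frame rate fps. Root-centering plus removal of the hips' heading is one common normalization and not necessarily the exact transform of [Krüger et al. 2010]; the joint indices are placeholders.

import numpy as np
from scipy.spatial import cKDTree

def normalize_pose(joints, root=0, hip_l=1, hip_r=2):
    """Center a (J, 3) pose at the root and rotate it about the y-axis so the
    hip axis aligns with x (removes global position and heading)."""
    p = joints - joints[root]                    # remove global translation
    d = p[hip_r] - p[hip_l]
    yaw = np.arctan2(d[2], d[0])                 # heading of the hip axis
    c, s = np.cos(-yaw), np.sin(-yaw)
    R = np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])
    return p @ R.T

def preprocess(frames, fps=120.0):
    """frames: (N, J, 3) joint positions. Returns the kd-tree plus the
    velocities and accelerations used by the motion priors."""
    poses = np.stack([normalize_pose(f) for f in frames])
    tree = cKDTree(poses.reshape(len(poses), -1))  # index for OLNG queries
    vel = np.gradient(poses, 1.0 / fps, axis=0)    # finite-difference velocities
    acc = np.gradient(vel, 1.0 / fps, axis=0)      # ... and accelerations
    return tree, poses, vel, acc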
Motion synthesis. We use an energy-minimization formulation of the kind frequently used in data-driven computer animation. Our specific choice of energy terms most closely resembles the one used in [Tautges et al. 2011]. Here, the objective function consists of three kinds of terms: a control term E_control that measures the distance between the synthesized joint positions and the given joint positions included in the feature set, as well as a pose prior E_pose and motion priors E_smooth and E_motion enforcing the positions, velocities, and accelerations of the joints to be comparable to the examples retrieved from the database. The objective function is minimized using gradient descent. In addition, to avoid skating artifacts, footprint constraints are enforced by an inverse kinematics approach.
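Written out, such an objective could take the form of a weighted sum; the weights $w_\ast$ and the concrete per-term distance below are illustrative assumptions, as they are not specified in this abstract:

\[
E(X) = w_c\,E_{\text{control}}(X) + w_p\,E_{\text{pose}}(X) + w_s\,E_{\text{smooth}}(X) + w_m\,E_{\text{motion}}(X),
\]

where, for instance, the control term penalizes squared deviations of the synthesized joint positions $x_{t,j}$ from the given positions $c_{t,j}$ over all frames $t$ and all joints $j$ in the feature set $F$:

\[
E_{\text{control}}(X) = \sum_{t} \sum_{j \in F} \lVert x_{t,j} - c_{t,j} \rVert^2 .
\]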
To improve the robustness of our method and to speed up the optimization, we employ a multi-scale approach, as sketched below.
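One common way to organize such a coarse-to-fine scheme is sketched here; temporal subsampling as the coarsening operator, plain gradient descent as the per-level solver, and the function grad_E (the gradient of the objective, assumed to accept motions of any length) are assumptions for illustration.

import numpy as np

def gradient_descent(x0, grad_E, steps=200, lr=1e-2):
    """Minimize the objective E by plain gradient descent (per-level solver)."""
    x = x0.copy()
    for _ in range(steps):
        x -= lr * grad_E(x)
    return x

def multiscale_solve(x_init, grad_E, levels=3):
    """Solve on temporally subsampled copies of the motion first; each coarse
    solution, linearly upsampled, initializes the next finer level."""
    x = x_init.copy()                      # (T, D) motion parameters over time
    t_full = np.arange(len(x))
    for lvl in reversed(range(levels)):    # coarsest level first
        stride = 2 ** lvl
        coarse = gradient_descent(x[::stride], grad_E)
        for d in range(x.shape[1]):        # upsample back to full resolution
            x[:, d] = np.interp(t_full, t_full[::stride], coarse[:, d])
    return x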
2 Results
To test the effectiveness of our approach, we ran several tests for three different scenarios that may occur in practice:
Motion completion: For a given motion, missing joints are synthesized. In our case, an animation of the lower body was used as input to our method, and a plausible upper-body motion was created.
Motion texturing: In this case, a rough, low-quality motion (e.g. one obtained by interpolating a few keyframes) is transformed into a detailed full-body animation. We transformed a rough walking motion and a jumping-jack motion, both with stiff limbs and no root movement, into realistic full-body animations.
Style transfer: Here, characteristic features of one individual are transferred to another within the same motion class. More precisely, we took a complex walking sequence and adapted this motion to match the style of a different subject. This was achieved by using a database containing only motion samples from the respective subject.
3 Conclusion and Future Work
In this work, a general framework for automated data-driven motion texturing, completion, and style transfer for human motions was sketched. Our approach works reasonably well across different motion classes that previously could only be handled with massive user interaction.
Our method requires a mocap database containing motions that are suitable for processing a given clip. The results therefore strongly depend on the prior information stored in the database. Investigating the impact of using different databases is of fundamental importance and requires further work.
References
KRÜGER, B., TAUTGES, J., WEBER, A., AND ZINKE, A. 2010. Fast local and global similarity searches in large motion capture databases. In Proceedings of the 2010 ACM SIGGRAPH/Eurographics Symposium on Computer Animation, Eurographics Association, Aire-la-Ville, Switzerland, SCA '10, 1–10.
PULLEN, K., AND BREGLER, C. 2002. Motion capture assisted animation: texturing and synthesis. ACM Trans. Graph. 21 (July), 501–508.
TAUTGES, J., ZINKE, A., KRÜGER, B., BAUMANN, J., WEBER, A., HELTEN, T., MÜLLER, M., SEIDEL, H.-P., AND EBERHARDT, B. 2011. Motion reconstruction using sparse accelerometer data. ACM Trans. Graph. 30 (May), 18:1–18:12.