纳金网

Title: Kinect-based Facial Animation

Author: 彬彬    Time: 2011-12-19 16:03
Title: Kinect-based Facial Animation

Copyright is held by the author / owner(s).
SIGGRAPH Asia 2011, Hong Kong, China, December 12 – 15, 2011.
ISBN 978-1-4503-0807-6/11/0012
Kinect-based Facial Animation
Thibaut Weise
EPFL
Sofien Bouaziz
EPFL
Hao Li
Columbia University
Mark Pauly
EPFL
Abstract
In this demo we present our system for performance-based character animation that enables any user to control the facial expressions of a digital avatar in real time. Compared to existing technologies, our system is easy to deploy and does not require face markers, intrusive lighting, or complex scanning hardware. Instead, the user is recorded in a natural environment using the non-intrusive, commercially available Microsoft Kinect 3D sensor. Since high noise levels in the acquired data prevent conventional tracking methods from working well, we developed a method that combines a database of existing animations with facial tracking to generate compelling animations. Realistic facial tracking facilitates a range of new applications, e.g. in digital gameplay, telepresence, or social interactions.


1 Real-time Facial Animation
The technology behind the demo was first presented at SIGGRAPH 2011 [Weise et al. 2011]. Our current prototype system uses a simple two-step process for facial tracking: first, the user performs a set of calibration expressions. A generic facial blendshape rig is then modified to best reconstruct these training expressions while keeping the semantics of the blendshapes intact [Li et al. 2010]. Typically, only five training expressions are sufficient to enable compelling animations. The resulting personalized facial rig is then used for real-time facial tracking. Due to the high noise levels of the input data, a database of animations is incorporated into the tracking framework, resulting in stable and accurate facial animations.
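The core idea above — fitting personalized blendshape weights to a noisy scan while regularizing toward weights suggested by an animation database — can be sketched as a regularized linear least-squares solve. This is only an illustrative sketch, not the authors' actual formulation: the function name `solve_weights`, the flattened mesh layout, and the single scalar regularizer `lam` are all assumptions.

```python
import numpy as np

def solve_weights(neutral, deltas, observed, w_prior, lam):
    """Fit blendshape weights to a noisy observed mesh (illustrative sketch).

    neutral  : (3V,)  flattened neutral face
    deltas   : (K, 3V) per-blendshape vertex offsets from the neutral face
    observed : (3V,)  flattened noisy scan (e.g. from a depth sensor)
    w_prior  : (K,)   weights suggested by an animation prior
    lam      : strength of the pull toward the prior

    Minimizes ||deltas.T @ w + neutral - observed||^2 + lam * ||w - w_prior||^2,
    whose normal equations give a small K-by-K linear system.
    """
    A = deltas @ deltas.T + lam * np.eye(len(w_prior))
    b = deltas @ (observed - neutral) + lam * w_prior
    return np.linalg.solve(A, b)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    neutral = rng.normal(size=30)            # 10 vertices, flattened
    deltas = rng.normal(size=(4, 30))        # 4 hypothetical blendshapes
    w_true = np.array([0.2, 0.5, 0.1, 0.7])
    observed = neutral + deltas.T @ w_true + 0.01 * rng.normal(size=30)
    print(np.round(solve_weights(neutral, deltas, observed, np.zeros(4), 1e-3), 2))
```

A small `lam` trusts the noisy data; a large `lam` snaps the result to the database prior, which is the trade-off the paragraph describes for high sensor noise.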




2 Applications
In the professional domain our technology can be used as a virtual
mirror: animators drive their CG characters in 3D animation
software packages using their own facial expressions. Similarly, it
has never been easier to include compelling facial animations in the
previs pipeline for movies, or to create facial animations at almost
no cost for secondary characters and crowds in games. For consumers
our technology facilitates a range of new applications, e.g.
in digital gameplay or social interactions.
Acknowledgements. This research is supported by Swiss National
Science Foundation grant 20PA21L 129607.




References
LI, H., WEISE, T., AND PAULY, M. 2010. Example-based facial rigging. ACM Trans. Graph. 29, 32:1–32:6.
WEISE, T., BOUAZIZ, S., LI, H., AND PAULY, M. 2011. Realtime performance-based facial animation. ACM Trans. Graph. 30, 77:1–77:10.
e-mail: thibaut.weise@epfl.ch
e-mail: sofien.bouaziz@epfl.ch
e-mail: hao@hao-li.com
e-mail: mark.pauly@epfl.ch
Figure 1: Real-time facial animation using the Kinect as an input
device: the facial expressions of the user are recognized and
tracked in real-time and transferred onto an arbitrary virtual character.


Author: 彬彬    Time: 2012-1-13 15:03



Author: 奇    Time: 2012-4-11 23:20
Everyone step aside, I'll do the bumping

Author: tc    Time: 2012-4-11 23:28
Reading a post is fate; bumping a post is friendship

Author: 菜刀吻电线    Time: 2012-5-9 23:27
Always listen to what the mods say; always bump a friend's post!

Author: 奇    Time: 2012-7-9 23:25
Just looking, not saying anything

Author: 铁锹    Time: 2012-7-10 09:02
Rumor: Intel proposes equipping Ultrabooks with 3D displays
Intel pushes Ultrabook makers to adopt 3D screen technology
The "clone" era arrives: 3D-printed bones, muscles, and blood vessels
Sony debuts its first 3D science show, 《PICO冒险奇兵》
3D Tile Format technology shines at the BroadcastAsia show
World's fastest 3D-printed running shoe, just 96 grams

Author: 菜刀吻电线    Time: 2012-7-23 23:24
Always listen to what the mods say; always bump a friend's post

Author: C.R.CAN    Time: 2012-7-27 23:20
Just looking, not saying anything

Author: 菜刀吻电线    Time: 2012-9-23 23:31
Running over to bump a friend's post

Author: 奇    Time: 2012-9-30 23:26
With love in my heart, anything goes

Author: 晃晃    Time: 2013-1-27 23:21
Honestly, I rarely use the things the OP is talking about!

Author: 奇    Time: 2013-2-21 23:32
Heh, this is really nice!!

Author: C.R.CAN    Time: 2013-3-18 23:19
Heh, this is really nice!!

Author: C.R.CAN    Time: 2013-3-21 23:22
Very classic, very practical; learned a lot!





Welcome to 纳金网 (http://rs.narkii.com/club/) Powered by Discuz! X2.5