Face Animation Parameter
A Face Animation Parameter (FAP) is a component of the MPEG-4 Face and Body Animation (FBA) International Standard (ISO/IEC 14496-1 & -2) developed by the Moving Picture Experts Group.[1] The standard describes how to virtually represent humans and humanoids in a way that achieves visual speech intelligibility and conveys the mood and gestures of the speaker, while allowing very low-bitrate compression and transmission of animation parameters.[2] FAPs control key feature points on a face model mesh and are used to produce animated visemes and facial expressions, as well as head and eye movement.[1] These feature points are part of the Face Definition Parameters (FDPs), also defined in the MPEG-4 standard.[3]
FAPs represent 66 displacements and rotations of the feature points from the neutral face position, which is defined as: mouth closed, eyelids tangent to the iris, gaze and head orientation straight ahead, teeth touching, and tongue touching teeth.[4] The FAPs were designed to correspond closely to human facial muscle movements. In addition to animation, FAPs are used in automatic speech recognition[5] and biometrics.[6]
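As a rough illustration of the mechanism described above, the sketch below applies a low-level FAP value to a single feature point. In the standard, FAP amplitudes are expressed in Facial Animation Parameter Units (FAPUs), which normalize displacements to the proportions of a particular face model; the function, field names, and numeric values here are illustrative assumptions, not taken from the standard's text.

```python
from dataclasses import dataclass

@dataclass
class FeaturePoint:
    """A hypothetical 3-D feature point on a face model mesh."""
    x: float
    y: float
    z: float

def apply_fap(point: FeaturePoint, fap_value: int, fapu: float,
              axis: str, direction: int = 1) -> FeaturePoint:
    """Displace a feature point by fap_value * fapu along one axis.

    fap_value: integer FAP amplitude from the bitstream
    fapu:      size of one FAPU for this FAP (model-dependent)
    direction: +1 or -1, the positive motion direction for this FAP
    """
    delta = fap_value * fapu * direction
    moved = FeaturePoint(point.x, point.y, point.z)
    setattr(moved, axis, getattr(moved, axis) + delta)
    return moved

# Example: lower a chin feature point to open the jaw (assumed values).
chin = FeaturePoint(0.0, -60.0, 10.0)
mns = 0.5  # assumed FAPU value for this model, in model units
opened = apply_fap(chin, fap_value=100, fapu=mns, axis="y", direction=-1)
print(opened.y)  # -60.0 + 100 * 0.5 * -1 = -110.0
```

Because FAP values are transmitted as small integers in model-relative units rather than absolute coordinates, the same animation stream can drive face models of very different shapes and sizes, which is what enables the low-bitrate transmission noted above.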
References
- ^ a b Ostermann, Jörn (August 2002). "Chapter 2: Face Animation in MPEG-4". In Pandzic, Igor; Forchheimer, Robert (eds.). MPEG-4 Facial Animation: The Standard, Implementation and Applications. Wiley. pp. 17–55. ISBN 978-0-470-84465-6.
- ^ Tao, Hai; Chen, H.H.; Wu, Wei; Huang, T.S. (1999). "Compression of MPEG-4 facial animation parameters for transmission of talking heads". IEEE Transactions on Circuits and Systems for Video Technology. 9 (3). IEEE Press: 264–276. doi:10.1109/76.752094.
- ^ Kshirsagar, Sumedha; Garchery, Stephane; Magnenat-Thalmann, Nadia (2001). "Feature Point Based Mesh Deformation Applied to MPEG-4 Facial Animation". Deformable Avatars: 24–34.
- ^ Petajan, Eric (September 2005). "MPEG-4 Face and Body Animation Coding Applied to HCI". In Kisačanin, B.; Pavlović, V.; Wang, T.S. (eds.). Real-time vision for human-computer interaction. Springer. pp. 249–268. ISBN 0387276971.
- ^ Petajan, Eric (January 1, 2009). "Chapter 4: Visual Speech and Gesture Coding Using the MPEG-4 Face and Body Animation Standard". In Wee-Chung Liew, Alan (ed.). Visual Speech Recognition: Lip Segmentation and Mapping. IGI Global. pp. 128–148. ISBN 9781605661872.
- ^ Aleksic, P.S.; Katsaggelos, A.K. (November 2006). "Audio-Visual Biometrics". Proceedings of the IEEE. 94 (11). IEEE Press: 2025–2044. doi:10.1109/JPROC.2006.886017.