
Asymmetrical Facial Expressions based on an Advanced Interpretation of Two-dimensional Russell's Emotional Model

Junghyun Ahn, Stéphane Gobron, Quentin Silvestre, and Daniel Thalmann

École Polytechnique Fédérale de Lausanne, 1015 Lausanne, Switzerland
[email protected], [email protected], [email protected], [email protected]

Abstract. This paper proposes a novel method to compute emotional parameters based on Russell's 2D circumplex diagram for simulating and visualizing asymmetric facial expressions. From a large set of emotion coordinates allocated by psychologists, we selected a uniformly well-distributed subset and associated with each a corresponding facial expression, using 16 facial joints with three degrees of freedom (rotations) each. From the advanced 2D (valence, arousal) emotional space and the facial expression samples, we demonstrate smooth and asymmetric facial interpolation over the entire 2D circumplex space. The proposed method can be applied to various entertainment-related environments such as movies, games, and virtual reality.

Keywords: Facial emotion, Virtual Reality, Virtual human, Emotional model, Agent profile

1 Introduction

The diversity of human physical shapes and psychological states makes it very difficult to derive facial expressions from emotion. Muscle composition, personality, social relationships, and the external environment all affect how a state of emotion is expressed. Nevertheless, researchers have tried to define theoretic emotional spaces and to find relationships between parts of the face and emotion. In computer graphics, these previous results have been applied to derive complex facial expressions. However, they do not make it easy to derive emotional expressions for different personalities, nor to show asymmetric facial expressions. In this paper, we propose an advanced interpretation of facial emotion, based on the Circumplex Model [16] for the emotional space and the Facial Action Coding System [7] for facial expression. The main contributions of this paper are as follows:

– it builds on an emotional model established by psychologists;
– it makes diverse asymmetric facial expressions easy to define;
– it uses the 2D emotional space as a 2.5D-like space;
– it can derive emotion according to a virtual human's personality.


1.1 Background

In 1980, Russell proposed a circumplex model of emotion [16]. This model defines an emotional space along an unpleasant-pleasant axis (valence) and a deactivation-activation axis (arousal). Russell's model has been referred to by further research in fields such as psychology and computer graphics. Scherer [18] proposed a multicomponent process to overcome the limits of a two-dimensional emotion space. Barrett and Russell [1] proposed a semantic structure of affect to show its bipolarity. Scherer [19] proposed an alternative dimensional structure of the semantic space for emotions, which improved the emotional space of Russell's circumplex model. From these previous works, we designed an advanced two-dimensional emotion space for generating asymmetric facial expressions. There have also been approaches to find a third dimension for emotional modeling [9]; our proposed model is extendable to any kind of third- or multi-dimensional model.

In 1978, Ekman and Friesen [7] proposed the Facial Action Coding System (FACS), and in 2002, Ekman et al. [8] presented an improved version of the FACS coding manual. This system describes how facial parts relate to emotions. Cohn et al. [5] proposed an automatic facial analysis, based on FACS coding, that tracks image features. Sato and Yoshikawa [17] proposed a method that blends facial emotions over a timeline. Krumhuber et al. [12] presented research on temporal aspects of smiles related to head tilt and gender. These studies on facial expression helped us find connections between emotional words and the facial expressions themselves.

Research on emotional modeling and studies of facial expression have also influenced the computer graphics field. Cassell et al. [3] proposed an entire framework for embedded conversational VHs, addressing many issues such as emotional modeling, gesture, facial expression, and spoken intonation. Spencer-Smith et al. [20] proposed a parameterized model for expressing facial emotion in 3D graphics; however, it shows only a discrete number of emotions created by their own parameterized model. Hess et al. [11] presented a method to distinguish facial emotion between women and men. Pelachaud [14] defined a representation scheme that describes the temporal evolution of the expression of an emotion. Gobron et al. [10] proposed a framework for a conversational environment based on textual inputs; they show a parameterized emotional space based on Russell's circumplex model, but it cannot express the asymmetric facial expressions that enrich diversity among VHs. Our approach proposes an advanced interpretation of the emotional space and its link to the facial expression of a virtual human (VH). For generating the final emotion from a 2D coordinate, we considered an interpolation technique similar to those presented in [15] and [4].

2 Method

Our main goal is to produce a convenient method for generating facial expressions, especially suitable for asymmetrical facial emotion, as illustrated in


Fig. 1. Asymmetric facial expressions taken from a real environment (Unhappy, Suspicious, Sad, Surprised, Happy, Delighted): first row, source facial expressions; second row, source images after a Boolean filter; last row, hand-extracted asymmetric smileys (see Figure 3 to compare with symmetrical smileys)

Figure 1. We propose to develop a model based on Russell's circumplex model of emotion (see Figure 2). This multi-dimensional approach is commonly used at second order (i.e., two dimensions), where the axes are the emotional "valence" and "arousal". Valence represents the positive or negative feeling a subject perceives, whereas arousal symbolizes the energy the affective state evokes in the individual. In Figure 3, we demonstrate how only a few simple geometric rules can result in explicit facial expressions, even with simple smileys. Notice that in this example, the third dimension of Russell's emotion model is also presented (potency), representing a defensive (low) or an offensive (high) emotional attitude or state of mind. The geometric rules applied in this figure are:

– Valence: mouth corners up or down (i.e., smile or frown), transmitting a representation of a positive or negative feeling;

– Arousal: mouth and eyes widely open or almost closed, representing respectively a high or low energetic emotional state;

– Potency: head position and central position of the eyebrows up or down, representing respectively an offensive or defensive attitude.
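As a minimal sketch, the three geometric rules above can be expressed as a small lookup. This is an illustrative example only, with assumed feature names, not the paper's implementation:

```python
# Illustrative sketch of the three geometric rules above; the feature
# names are assumptions, not taken from the paper's software.

def smiley_features(valence: float, arousal: float, potency: float) -> dict:
    """Map an emotional state in [-1, 1]^3 to qualitative smiley features."""
    return {
        # Valence: mouth corners up (smile) or down, a positive or negative feeling
        "mouth_corners": "up" if valence >= 0 else "down",
        # Arousal: mouth and eyes widely open (high energy) or almost closed (low)
        "mouth_opening": "wide" if arousal >= 0 else "narrow",
        "eye_opening": "wide" if arousal >= 0 else "narrow",
        # Potency: head and eyebrow centers up (offensive) or down (defensive)
        "head_position": "up" if potency >= 0 else "down",
        "eyebrow_center": "up" if potency >= 0 else "down",
    }

# A pleased but defensive, fairly energetic state
print(smiley_features(0.8, 0.6, -0.4))
```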


Fig. 2. Two-dimensional circumplex space model and its emotional samples plotted on the coordinates [16] [1] [19]. Emotions in orange squares are the emotional samples we took for our experiment. Emotions with green lines correspond to Figure 3.

2.1 From square to circle

In Russell's two-dimensional circumplex model, emotions are included on a disk. As the (v, a) values refined from source classifiers both range from −1.0 to +1.0, we must scale them so that (v′, a′) satisfies the disk radius r, implying √(v′² + a′²) = r. Then, considering z to be the smaller of the two coordinates in absolute value, we derive:

    v′ = v / √(1 + z²)    (1)

    a′ = a / √(1 + z²)    (2)
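Equations (1) and (2) can be sketched directly in code; a minimal version, assuming z is the smaller magnitude of the two coordinates:

```python
import math

def square_to_disk(v: float, a: float) -> tuple:
    """Scale (v, a) from the [-1, 1] square onto the unit disk (Eqs. 1-2).

    With z the smaller of |v| and |a|, a corner such as (1, 1) maps to
    (1/sqrt(2), 1/sqrt(2)), and edge points such as (1, 0.5) land
    exactly on the unit circle.
    """
    z = min(abs(v), abs(a))
    s = math.sqrt(1.0 + z * z)
    return v / s, a / s

vp, ap = square_to_disk(1.0, 1.0)
print(vp, ap, math.hypot(vp, ap))  # the square's corner maps onto the circle
```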


Fig. 3. Illustration of the third order of Russell's circumplex model of emotion, the three orders being respectively valence, arousal, and potency. Using the dimensional model only, smileys can only be symmetric and lack emotion (see Figure 1).

The first row of Figure 4 illustrates this concept: 4(a) the square grid, 4(b) the mapped disk grid, and 4(c) four examples of the mapping, the dark green cross being the input and the light green one the output.


Fig. 4. Finding a specific facial expression, including asymmetries, from a set of positioned emotions defined by psychologists: Row 1, from square to disk grid; Row 2, initial set of 26 emotions; Row 3, guaranteeing continuity at any (v, a) position

2.2 Finding average emotion

From a set of emotions given by psychologists, each associated with precise 2D coordinates, we want to find an average emotional state for any couple of emotional parameters (v, a) resulting from classifiers. The second row of Figure 4 presents the input (the initial set of 26 emotions) we used: 4(d) the 2D positions of the initial set of referred emotions; 4(e) the input emotion set inside the disk grid, with an input (v, a) emotion (dark green) and its corresponding mapped position (light green) close to a specific input-set emotion (light red); 4(f) the same as before, with the computed proportionality illustrated by cross sizes (a dot representing a minimum of 1 percent).

We found it difficult to find a good solution, having tried the following techniques: averaging by distance to the closest set of included triangles; closest three


emotions, taking into account collinear exceptions; and pre-computation of Voronoi diagrams associated with inner polygon edge lengths.

To avoid any discontinuity and to match the emotional input set (E1, E2, ..., En) precisely, we finally assumed that the computation of E(v,a) should take all source emotions into account, proportionally to the inverse of their squared distances.

First, the emotion E has to take into account all the input emotions:

    E(v,a) = k1·E1 + k2·E2 + ... + kn·En = Σ(i=1..n) ki·Ei    (3)

with the following constraint:

    Σ(i=1..n) ki = 1    (4)

Consider the set of distances D = {d1, d2, ..., dn} with

    di = √((E.v − Ei.v)² + (E.a − Ei.a)²)    (5)

If there exists a j such that dj = 0, the emotion is directly found: E(v,a) = kj·Ej = Ej. Otherwise, we can determine a proportionality parameter p such that

    p = Σ(i=1..n) 1/di²    (6)

Choosing ki = 1/(p·di²) satisfies the constraint (4), since

    Σ(i=1..n) 1/(p·di²) = 1    (7)

from which, with (3) and (6), we finally deduce

    E(v,a) = Σ(i=1..n) (1/(p·di²))·Ei    (8)

Results are presented in row 3 of Figure 4, where from the uniformly well-distributed subset we guarantee a coherent facial emotion at any (v, a) position, e.g., 4(g) far from all input references, 4(h) getting close to a specific emotion, and 4(i) being almost exactly at the coordinates of a referred emotion.
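A minimal sketch of this inverse-squared-distance interpolation, following Equations (3) to (8); the function name and zero-distance threshold are assumptions:

```python
def emotion_weights(v, a, samples, eps=1e-12):
    """Weights k_i over emotion samples for an input (v, a), per Eqs. (3)-(8).

    `samples` is a list of (v_i, a_i) coordinates. The returned weights
    sum to 1; if (v, a) coincides with a sample, that sample gets weight 1.
    """
    d2 = [(v - vi) ** 2 + (a - ai) ** 2 for vi, ai in samples]  # d_i^2, Eq. (5)
    for i, di2 in enumerate(d2):
        if di2 < eps:                       # d_j = 0: E(v,a) = E_j directly
            return [1.0 if j == i else 0.0 for j in range(len(samples))]
    p = sum(1.0 / di2 for di2 in d2)        # Eq. (6)
    return [1.0 / (p * di2) for di2 in d2]  # k_i = 1/(p d_i^2), Eq. (8)

samples = [(0.8, 0.5), (-0.6, 0.4), (0.0, -0.9)]
w = emotion_weights(0.2, 0.1, samples)
print(w, sum(w))  # weights favor nearby samples and sum to 1
```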

In the next section, we first propose the structure of the VH face we used for testing our approach, then we briefly discuss our choice of subset (i.e., 26 well-distributed emotions), and finally we present a set of resulting facial expressions defined at source coordinates and then at any intermediate point.


Fig. 5. Positions of the 16 facial joints shown on the wireframe mesh (front and side views). Facial joints whose names end with 's' contain left and right joints.

3 Facial Expressions

3.1 Structure of virtual human’s face

As depicted in Figure 5, the emotion samples of the circumplex space are controlled by 16 joints with 48 DoFs in total. The facial joints consist of the head, left/right eyeballs, left/right eyebrows, left/right eyelids, mouth, lower lip, left/right cheeks, left/right lip corners, left/right lower eyelids, and upper lip. Each emotion was created manually, based on knowledge of FACS coding [7] [8] and Figure 3. Each facial joint is rotated around three axes and can be controlled and adjusted with our 3D visual software tool. A facial joint's attributes are relative angle values from the neutral emotion; therefore, personalized emotional samples can be generated by modifying only the neutral emotion.
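The delta-angle representation can be sketched as follows; the joint names and data layout are assumptions for illustration, not the paper's actual tool:

```python
# Illustrative sketch: joints store delta rotations (rx, ry, rz) from the
# neutral pose, so one set of emotion deltas can be retargeted onto any
# neutral pose. Joint names and layout are assumptions, not the paper's.

def apply_to_neutral(neutral, deltas):
    """Add per-joint delta angles to a neutral pose (joint -> angle triple)."""
    return {
        joint: tuple(n + d for n, d in zip(angles, deltas.get(joint, (0.0, 0.0, 0.0))))
        for joint, angles in neutral.items()
    }

neutral = {"mouth": (0.0, 0.0, 0.0), "l_lip_corner": (5.0, 0.0, 0.0)}
smile = {"l_lip_corner": (10.0, 0.0, 0.0)}   # delta relative to neutral
print(apply_to_neutral(neutral, smile))
```

Because only deltas are stored, swapping in a different neutral pose biases every sample at once, which is how a personalized emotional space could be obtained.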

3.2 Emotional samples on the circumplex space

To derive facial expressions, we selected 26 emotional samples from among the pre-defined emotional words of the circumplex model [16] [1] and the semantic space of emotion [19]. The emotional keywords were selected uniformly from the advanced circumplex space. As depicted in Figure 6, the 26 facial expressions along with their 2D coordinates were generated with our graphics tool. Emotional words written in upper case are from Russell's circumplex model [16], whereas emotional words written in lower case are from the advanced circumplex model proposed by Scherer [19].


Fig. 6. The 26 emotion samples from the advanced 2D circumplex space. The horizontal and vertical axes show valence and arousal, respectively: from left to right the valence increases, and from bottom to top the arousal increases.

3.3 Facial expression results

As depicted in Figure 7, we tested facial expressions at random (v, a) coordinates. To show the robustness of our system, we used a different VH model for each random (v, a) coordinate. In this figure, the first column shows the facial expression driven by our method. The second column shows the circumplex space coordinate (light green '+') of the current facial expression. The third,


Fig. 7. Three different facial expressions driven by our method. From left to right: facial expression driven by our method; the advanced circumplex space with the current valence and arousal values; the top-weighted emotional samples

fourth, and fifth columns show the three top-weighted emotional samples related to the driven emotion in the first column. The total weighting over all emotional samples is 100 percent; the remaining weight is distributed among the other emotional samples not shown in the figure. A demonstration of our approach can be found at: http://asymmetricalexpressions.serveftp.org/.
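Given the interpolation weights of Section 2.2, selecting the displayed top-three samples amounts to a sort over the weights; a sketch, with assumed names:

```python
def top_weighted(weights, labels, k=3):
    """Return the k emotion labels with the largest interpolation weights,
    with each weight shown as a percentage of the 100-percent total."""
    ranked = sorted(zip(weights, labels), reverse=True)[:k]
    return [(label, round(100.0 * w, 1)) for w, label in ranked]

# Hypothetical weights over four of the 26 samples
labels = ["HAPPY", "GLAD", "serene", "attentive"]
weights = [0.45, 0.30, 0.15, 0.10]
print(top_weighted(weights, labels))
```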

4 Conclusions

In this paper, we addressed the problem of generating a VH's asymmetrical facial expressions from the advanced emotional space. We also proposed a method that facilitates deriving diverse emotional expressions in computer graphics applications. For the emotional space, we referred to Russell's circumplex model and to the further improvements on this model described by psychologists. As a result,


we obtained smoothly interpolated facial expressions along both axes, valence and arousal. Our software tools can edit customized emotional samples that lie in the emotional space.

To improve the proposed method, we plan to explore a multi-dimensional emotional space with more emotional samples. In this case, calculation time will increase, but so will the expressivity of the emotions. Our model basically controls facial expression by rotating joint axes; it can be extended to other facial structures, such as high-resolution meshes, by interpolating the positions and rotations of mesh vertices instead of joint angles. Since our angle attributes are defined as delta angles from the neutral emotion, we can also modify the neutral emotion itself to show the biased emotional space of a VH with a different personality.

In terms of realistic visualization, we hope to exploit captured facial expressions further [21]. Moreover, body movements such as breathing or idle motion [6] should be implemented in our system. Emotional body movement [13] and its control [2] are also challenging issues to be solved.

4.1 Acknowledgements

This work was supported by a European Union grant under the 7th Framework Programme, Theme 3: Science of complex systems for socially intelligent ICT, as part of the CYBEREMOTIONS Project (Contract 231323). The authors would like to thank Mireille Clavien for her help with VH modeling, and Mathieu Hopmann, Dr. Arvid Kappas, and Dr. Janusz Holyst for their respective contributions on facial expressions (i.e., pictures and avatars).

References

1. Barrett, L.F., Russell, J.A.: Independence and bipolarity in the structure of current affect. Journal of Personality and Social Psychology 74(4), 967–984 (April 1998)

2. Boulic, R., Maupu, D., Thalmann, D.: On scaling strategies for the full-body pos-tural control of virtual mannequins. Interacting with Computers 21(1-2), 11–25(2009)

3. Cassell, J., Pelachaud, C., Badler, N., Steedman, M., Achorn, B., Becket, T., Douville, B., Prevost, S., Stone, M.: Animated conversation: rule-based generation of facial expression, gesture & spoken intonation for multiple conversational agents. In: SIGGRAPH '94: Proceedings of the 21st Annual Conference on Computer Graphics and Interactive Techniques. pp. 413–420. ACM, New York, NY, USA (1994)

4. Cohen, M.F., Rose, C., Sloan, P.P., Liu, Z., Zhang, Z.: Example-based everything. In: Proc. International Workshop on Human Modeling and Animation. pp. 87–91 (2000)

5. Cohn, J.F., Zlochower, A.J., Lien, J., Kanade, T.: Automated face analysis by feature point tracking has high concurrent validity with manual FACS coding. Psychophysiology 36(1), 35–43 (January 1999)

6. Egges, A., Molet, T., Magnenat-Thalmann, N.: Personalised real-time idle motion synthesis. In: Pacific Graphics 2004. IEEE (2004)


7. Ekman, P., Friesen, W.V.: Facial Action Coding System. Consulting Psychologists Press, Palo Alto, CA (1978)

8. Ekman, P., Friesen, W.V., Hager, J.C.: Facial Action Coding System - The Manual.A Human Face (2002)

9. Fontaine, J.R.J., Scherer, K.R., Roesch, E.B., Ellsworth, P.C.: The world of emotions is not two-dimensional. Psychological Science 18(12), 1050–1057 (November 2007)

10. Gobron, S., Ahn, J., Paltoglou, G., Thelwall, M., Thalmann, D.: From sentence to emotion: a real-time three-dimensional graphics metaphor of emotions extracted from text. The Visual Computer 26(6-8), 505–519 (June 2010)

11. Hess, U., Adams, B., Grammer, K., Kleck, R.: Face gender and emotion expression: Are angry women more like men? Journal of Vision 9(12), 1–8 (2009)

12. Krumhuber, E., Manstead, A., Kappas, A.: Temporal aspects of facial displays in person and expression perception. Journal of Nonverbal Behavior 31, 39–56 (2007)

13. McDonnell, R., Jörg, S., McHugh, J., Newell, F., O'Sullivan, C.: Evaluating the emotional content of human motions on real and virtual characters. In: Proceedings of the 5th Symposium on Applied Perception in Graphics and Visualization, APGV 2008. pp. 67–74. ACM (2008)

14. Pelachaud, C.: Modelling multimodal expression of emotion in a virtual agent. Philosophical Transactions of the Royal Society B: Biological Sciences 364(1535), 3539–3548 (2009)

15. Rose, C., Bodenheimer, B., Cohen, M.: Verbs and adverbs: Multidimensional motion interpolation. IEEE Computer Graphics and Applications 18(5), 32–40 (1998)

16. Russell, J.A.: A circumplex model of affect. Journal of Personality and Social Psychology 39, 1161–1178 (1980)
17. Sato, W., Yoshikawa, S.: The dynamic aspects of emotional facial expressions. Cognition and Emotion 18(5), 701–710 (2004)
18. Scherer, K.R.: Emotion as a multicomponent process. Review of Personality and Social Psychology 5, 37–63 (1984)
19. Scherer, K.R.: What are emotions? And how can they be measured? Social Science Information 44(4), 695–729 (2005)
20. Spencer-Smith, J., Wild, H., Innes-Ker, A.H., Townsend, J., Duffy, C., Edwards, C., Ervin, K., Merritt, N., Paik, J.W.: Making faces: Creating three-dimensional parameterized models of facial expression. Behavior Research Methods, Instruments and Computers 33, 115–123 (2001)

21. Weise, T., Li, H., Van Gool, L., Pauly, M.: Face/Off: live facial puppetry. In: SCA '09: Proceedings of the 2009 ACM SIGGRAPH/Eurographics Symposium on Computer Animation. pp. 7–16. ACM, New York, NY, USA (2009)