Facial animation in SWTOR

Star Wars: The Old Republic features facial animation that looks handcrafted, yet no animator is required; everything is generated automatically. We've spent the last few weeks figuring out the FaceFX files and are here to share our findings.

One key part of facial animation is lip sync: detecting phonemes in the audio and moving the mouth into the matching phoneme shapes. This problem is already solved via natural language processing, and various off-the-shelf products are available. The harder part is moving the eyebrows and lips to convey emotion, and adding subtle head movements so the character feels lifelike.
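
To make the lip-sync half of the problem concrete, here is a minimal sketch of mapping a phoneme timeline (as produced by an off-the-shelf analyzer) to mouth shapes, or "visemes". The phoneme set, viseme names and function are our own illustration, not SWTOR's actual data.

```ts
// Illustrative only: map detected phonemes to mouth shapes ("visemes") over time.
const phonemeToViseme: Record<string, string> = {
  P: 'lips_closed', B: 'lips_closed', M: 'lips_closed',
  F: 'lip_bite',    V: 'lip_bite',
  AA: 'jaw_open',   AE: 'jaw_open',
  IY: 'wide',       UW: 'rounded',
};

interface PhonemeEvent { phoneme: string; start: number; end: number } // seconds

// Turn the analyzer's phoneme timeline into viseme keyframes for the mouth.
function toVisemeTrack(events: PhonemeEvent[]) {
  return events.map(e => ({
    viseme: phonemeToViseme[e.phoneme] ?? 'rest', // unknown phonemes fall back to a rest pose
    time: e.start,
    duration: e.end - e.start,
  }));
}
```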

We'd be remiss to talk about facial animation without looking at previous BioWare games. In Knights of the Old Republic, characters already had lip sync, but the emotions were limited to a few static animations.

Starting with Mass Effect, BioWare licensed the middleware FaceFX by OC3 Entertainment. Its key innovation is the face graph, which receives some input values, adds them up (within min/max bounds to maintain realism), and drives the bone positions. This has the advantage that designers can dial up an emotion with a slider without having to worry about bone poses, and create a unique animation for each line.
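
As a rough sketch of how we picture such a face-graph node working (names and structure are our own simplification, not FaceFX's actual API): each node sums its weighted inputs, clamps the result to its min/max bounds, and passes the value on until it reaches a node that drives a bone.

```ts
// Our simplified reconstruction of a face-graph node; the real runtime is more elaborate.
interface FaceGraphNode {
  name: string;                                     // e.g. a high-level emotion or a bone pose
  min: number;                                      // lower clamp, keeps the result realistic
  max: number;                                      // upper clamp
  inputs: { source: FaceGraphNode; weight: number }[];
}

function evaluate(node: FaceGraphNode, sliders: Map<string, number>): number {
  // Nodes the designer drives directly take their value from the emotion sliders.
  if (sliders.has(node.name)) return sliders.get(node.name)!;
  // Everything else sums its weighted inputs and clamps to the node's bounds.
  const sum = node.inputs.reduce(
    (acc, link) => acc + link.weight * evaluate(link.source, sliders),
    0
  );
  return Math.min(node.max, Math.max(node.min, sum));
}
```

A bone-pose node at the bottom of the graph then turns its clamped value into an actual bone offset, which is why designers only ever touch the high-level sliders.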

But the developers did not stop there: they created a tool called RoboBrad to automatically generate a baseline facial animation, so any remaining issues were a matter of manual cleanup rather than starting from scratch. The same tool was used for Dragon Age: Origins.

For SWTOR, the developers advanced RoboBrad further, so that all emotions and gestures can be generated fully automatically, without any manual touch-up. In his GDC 2010 talk "Automated Emotion", Ben Cloward explained the system in depth. The gist is that the system looks at whether a sentence ends with a period, exclamation mark or question mark, as well as the character's personality and emotional state as set by the designers. It then narrows down the choice of poses and randomly picks one that fits.
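
In pseudocode, that selection step might look something like the sketch below. The pose attributes, personality values and filtering rules are our assumptions based on the talk, not the actual RoboBrad implementation.

```ts
// Hypothetical sketch of the pose selection described in the GDC talk.
type Punctuation = '.' | '!' | '?';

interface LineContext {
  punctuation: Punctuation;    // how the sentence ends
  personality: string;         // set by the designers, e.g. "stoic"
  emotionalState: string;      // e.g. "angry", "neutral"
}

interface Pose {
  name: string;
  punctuation: Punctuation[];  // sentence endings this pose fits
  personalities: string[];
  emotions: string[];
}

// Narrow the pose library down to what fits the line, then pick one at random.
function pickPose(poses: Pose[], ctx: LineContext): Pose | undefined {
  const candidates = poses.filter(p =>
    p.punctuation.includes(ctx.punctuation) &&
    p.personalities.includes(ctx.personality) &&
    p.emotions.includes(ctx.emotionalState)
  );
  if (candidates.length === 0) return undefined;
  return candidates[Math.floor(Math.random() * candidates.length)];
}
```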

The game files contain the final result: the Actor files (.fxa) hold the face graph for each of the 22 body types, while 12,586 external AnimSet files (.fxe) store the animation curves for each cutscene.
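
In our own simplified terms, the split between the two file types looks roughly like this; the field names are ours, and the actual binary layout is considerably more involved.

```ts
// One .fxa per body type (22 in total): the face graph and the bones it drives.
interface FxaActor {
  bodyType: string;                // e.g. "dwarf"
  faceGraphNodes: string[];        // node names; links and min/max bounds omitted here
  bones: string[];                 // bones the graph can drive
}

// One .fxe per cutscene (12,586 files): the curves that feed the face graph over time.
interface FxeAnimSet {
  actorName: string;               // which body type's .fxa this targets
  animations: {
    lineId: string;                // the dialogue line this animation belongs to
    curves: { nodeName: string; keys: { time: number; value: number }[] }[];
  }[];
}
```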

During our analysis, we discovered that Ortolans were originally created as a separate body type with animated ears. This was cut before release; they now use the "dwarf" body type (together with Drall, Ewoks, Jawas, Ugnaughts & Yoda's species), and their ears are driven by cloth simulation instead.

After release, more body types were added for the Terror from Beyond operation, specifically "gree" with Game Update 1.3, and "wampa" & "wampakephess" with 1.4.

Since then, the animations have remained mostly unchanged, with only one addition in 4.7.2 for Knights of the Eternal Throne: Three new bones are fed into the shaders to produce animated wrinkles.
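
We can only guess at how those bones are consumed, but the idea is presumably along these lines: the extra bones carry no geometry, and their animated values are read back and handed to the face material as blend weights for a wrinkle map. The naming scheme and the 0..1 range below are our assumptions.

```ts
// Speculative sketch: turn the animated values of the extra bones into shader parameters.
function wrinkleWeights(bones: { name: string; value: number }[]) {
  return bones
    .filter(b => b.name.startsWith('wrinkle_'))      // hypothetical naming scheme
    .map(b => ({
      shaderParam: b.name,
      weight: Math.min(1, Math.max(0, b.value)),     // clamp to the assumed 0..1 range
    }));
}
```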

Looking back, it seems that going with fully automated facial animation has worked pretty well for SWTOR. Even though the development team changed and the budget was reduced, the facial animations have stayed consistent and timeless.

Other games, most notably Mass Effect: Andromeda, show what happens without a fully automated approach: there's not enough time to tweak every single line, so you end up with high-priority dialogue that is manually polished and lower-priority dialogue that is automatically generated (far less convincingly than in SWTOR) and looks bland and lifeless. Meanwhile, newer AAA games have started to mocap the facial performance during recording, which increases realism but limits those games to static cutscenes.

To see the animations yourself, open any .fxa or .fxe file in our file reader: