THANOS Artist Interview
“Avengers: Infinity War” changes all that. The character of Thanos, played by Josh Brolin, is front-and-center for 40 minutes of the film’s 2 ½ hour running time, and if audiences couldn’t relate to him as a real, flesh-and-blood human character, the movie would flatline – game over. As it turns out, the combination of a restrained, tragic performance by Brolin and the most fine-grained and responsive CG face in movie history created an all-digital leading man who could, and did, carry the film.
Figure 2. Digital Domain’s Jan Philip Cramer can claim animation and VFX credits on "Avengers: Infinity War" (2018); "Beauty and the Beast" (2017); "Spider-Man: Homecoming" (2017); "Deadpool" (2016); "X-Men: Days of Future Past" (2014); "Jack the Giant Slayer" (2013) and more.
How did the team behind the movie pull this off, and what might it mean for the future? I recently caught up with the very busy Jan Philip Cramer, Head of Animation at Digital Domain (DD) in Vancouver, for a highly informative discussion about the process that brought the very evil, but very empathetic, Thanos to life. A keen observer of the face in action, Phil was kind enough to mention that reading my book, “The Artist’s Complete Guide to Facial Expression,” several times was an important part of his early training.
In press coverage on the making of the movie, much attention has been paid to new software that allowed for an unprecedented level of efficiency and accuracy in transferring the details of Brolin’s facial performance to Thanos. For years there has been a push in the industry to make this pipeline – actor to character – more and more automated, and it sounded like the version employed in “Infinity War” was more efficient than ever. So much, then, for the artists?
And one critical and subtle detail stands out as central to the success of Thanos, the very detail that proved the greatest failing of previous attempts to create a convincing CG main character: the eyes.
Not quite: the animators remained essential, studying and refining Brolin’s performance in action in order to build a more detailed and complex Thanos on screen.
Figure 4. Digital Domain used new machine-learning algorithms to advance the facial performance capture process.
It’s one thing to put 250 dots on a face and track the precise movements of large, less mission-critical areas like the forehead, jaw, or cheeks. It’s quite another to expect a few data points to catch the amount of exposure of the iris, the exact course of the eyelids, or the direction of the gaze. And within the jewel-like precision of that geography rests the essence of what makes sadness look sad, or anger look angry.
In the case of Thanos, the Digital Domain team employed several first-time software tools to automate the bulk of the human-to-model transfer. One piece of software – Masquerade – used machine learning to match super-high-resolution Brolin poses stored in a digital library with the much lower-resolution footage filmed by a MoCap helmet camera while he was performing. Another piece of software – Direct Drive – then mapped the resulting hi-res Brolin mask onto DD’s Thanos rig, giving Phil’s animation team a huge head start on their work. With so much machine-refined imagery now flowing into their animation systems, Phil estimated that a shot (several seconds of screen time) that might previously have taken three to four weeks to animate could now be done in three to four days.
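The Masquerade step can be pictured as a learned lookup: each low-resolution helmet-camera frame is used to retrieve and blend the best-matching high-resolution library poses. The sketch below is a deliberately simplified stand-in – a k-nearest-neighbor blend over random placeholder data – not Digital Domain’s actual model or data formats; all names, counts, and the matching method are illustrative assumptions.

```python
import numpy as np

# Hypothetical stand-in for a learned low-res-to-hi-res mapping: a k-nearest-
# neighbor blend from sparse marker data to dense face shapes. The real system
# (Masquerade) uses a trained machine-learning model; this only shows the idea.

rng = np.random.default_rng(0)

N_POSES = 500      # hi-res library poses from a scanning session (illustrative)
N_MARKERS = 150    # low-res on-set marker count (illustrative, not DD's number)
N_VERTS = 40_000   # hi-res mesh vertex count (illustrative)

# Library: each pose pairs a low-res 2D marker layout with a hi-res 3D mesh.
lib_markers = rng.normal(size=(N_POSES, N_MARKERS * 2))
lib_meshes = rng.normal(size=(N_POSES, N_VERTS * 3))

def upres_frame(frame_markers: np.ndarray, k: int = 4) -> np.ndarray:
    """Blend the k library poses whose marker layout best matches this frame."""
    dists = np.linalg.norm(lib_markers - frame_markers, axis=1)
    nearest = np.argsort(dists)[:k]
    # Inverse-distance weights so closer library poses dominate the blend.
    w = 1.0 / (dists[nearest] + 1e-8)
    w /= w.sum()
    return w @ lib_meshes[nearest]

frame = rng.normal(size=N_MARKERS * 2)   # one helmet-camera frame
hires = upres_frame(frame)               # one dense hi-res mesh per frame
print(hires.shape)
```

In this toy framing, the Direct Drive step would then be a second mapping from the recovered hi-res actor mesh onto the character rig’s controls – the part that hands animators a near-finished starting point rather than a blank shot.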
From a perceptual point of view, this matches exactly the expectations that audiences unconsciously bring to their experience of a human face, using the eyes above all to gauge the degree of human presence; if any of these elements – and there are lots! – aren’t right, the Uncanny Valley rears its ugly and disconcerting head, and the audience’s perception of Thanos shifts from human to robotic.
Ultimately, Thanos works as well as he does because of the hyper-realism of his eyes more than any other single detail. Phil tells the story of the test that originally helped reassure the filmmakers of the process. The animators were particularly aware of two things: the directors saw Thanos as a dialed-back villain (think anti-Joker), scarier for his restraint, and they knew that getting his eyes “right” would be critical to their success. “I got the most positive feedback I’ve ever received in my career,” Phil told me, “in response to that short.” Ironically, the content of the reel is simply Brolin/Thanos sitting in the dark on a throne, talking quietly, using random dialogue having nothing to do with the movie. No fist-pumps, no explosions, but great eyes.
Thanos manages to vault across the Uncanny Valley.
Phil and I ended our conversation by discussing the current state of the art in robotics, where efforts to create empathetic faces for mechanical creations remind Phil of the animation industry in the early days of the CG revolution. Though there are promising signs, today’s robots have a long way to go to reach the level of realism now possible in CG. But the first steps are being taken!
Unlike robot designers, filmmakers now have the electronic tools to conjure convincing humans out of thin air. Fortunately, actors and artists are still 100% required. Tools are one thing; the vision and heart to exploit these resources in ground-breaking ways will be revealed in years to come. Thanos will surely not be the last all-digital leading man in a Hollywood movie.