The story behind The Last of Us Part II’s staggeringly realistic in-game character facial animation
It’s my first day in Seattle. Ellie and Dina are exploring a side room of a ruined synagogue. As has happened frequently since the game’s start, I open up Photo Mode. Naughty Dog’s latest is quickly proving to be a treasure trove for the photographic eye.
I pan round to compose my shot, looking to capture expressions and body language, and to work out the best position for lighting. In doing so I spot Ellie’s gaze seemingly drawn to a painting in the room. I can’t interact with it, and there’s no dialogue attached to its presence. Yet her facial expression gives every indication that she’s mulling the artwork over. Could Naughty Dog really have created subtle reactions for everything in the game?
Remarkably, the answer is, in a way, yes. The illusion is carefully crafted from multiple game systems working simultaneously, all built to render in-game character models as realistically as possible.
These models sit separate from the motion capture tech utilized to render real-world actors in-game for cinematics. When those end and gameplay begins, that’s when the careful handiwork coordinated between multiple teams in the studio kicks in.
Taking me on a tour of this is Keith Paciello, the studio animator who masterminded the in-game facial tech after his internal pitch was greenlit and who collaborated on numerous other animation processes.
“In that instance,” he explains during our video call after I describe the scene to him, “you as a player are aiming with the controller for Ellie to look at the painting, which is triggering a ‘look at target’ placed by a designer. On top of that, I animated small eye darts (saccades) within the character’s facial idles to try and indicate an overall thought process. So animated eye saccades, sitting on top of the eye-aim, work together to create what looks like focus and thought process.”
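Read as a layered animation problem, the effect Paciello describes could be sketched like this. This is a purely illustrative Python toy, assuming a yaw/pitch eye-aim; the class, timings and angles are invented here, not taken from Naughty Dog’s code:

```python
import random

class EyeController:
    """Layers small saccadic eye darts on top of a designer-placed
    'look at' target. A hypothetical sketch for illustration only."""

    def __init__(self, interval_range=(0.2, 0.9), max_dart_deg=2.0):
        self.interval_range = interval_range  # seconds between darts
        self.max_dart_deg = max_dart_deg      # saccades are tiny offsets
        self.offset = (0.0, 0.0)              # current (yaw, pitch) dart
        self.next_dart = 0.0                  # time of the next dart

    def update(self, time_s, aim_yaw, aim_pitch):
        """aim_yaw/aim_pitch come from the eye-aim at the look-at target."""
        if time_s >= self.next_dart:
            # Pick a new tiny dart and schedule the next one
            d = self.max_dart_deg
            self.offset = (random.uniform(-d, d), random.uniform(-d, d))
            self.next_dart = time_s + random.uniform(*self.interval_range)
        # Final gaze = aim toward the target + the saccadic overlay
        return aim_yaw + self.offset[0], aim_pitch + self.offset[1]
```

The key design point is the layering: the look-at system alone would produce a locked, glassy stare, while the small randomized darts on top are what read as thought.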
It’s an example of that close collaboration between the different teams during development, but also helps highlight the work being performed by Paciello’s Emotional Systematic Facial Animation system. In simplest terms, it picks a facial expression from a range of nearly 20 different emotional states for any of the 25 key characters that are on-screen. That covers leads, co-op partners, enemies, even Infected to an extent.
Facial animation working in unison with eye movements, body language, breathing… all interlinked and triggered by script beats, dialogue, encounters or ambient moments, like Ellie seemingly being absorbed by a painting. The illusion of emotion is constructed with mathematical precision. “It brings a depth to the characters that we’ve never seen before,” says Paciello.
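As a rough mental model, the trigger-and-blend behaviour described above might look something like this sketch. The seven emotion names are the ones the article lists; the blend logic, timings and API are assumptions for illustration:

```python
class EmotionBlender:
    """Blends a character's facial rig between tagged emotional states.
    A minimal sketch -- blend mechanics are assumed, not Naughty Dog's."""

    EMOTIONS = {"neutral", "joy", "sadness", "fear", "anger",
                "surprise", "disgust", "contempt"}

    def __init__(self, blend_time=0.5):
        self.blend_time = blend_time
        self.current = "neutral"
        self.target = "neutral"
        self.t = 1.0  # 0..1 blend progress toward the target state

    def trigger(self, emotion):
        # e.g. fired by a tagged dialogue line, encounter or ambient moment
        if emotion not in self.EMOTIONS:
            raise ValueError(f"unknown emotion: {emotion}")
        self.current, self.target, self.t = self.target, emotion, 0.0

    def update(self, dt):
        self.t = min(1.0, self.t + dt / self.blend_time)
        # Weights that would feed the facial rig's blend shapes
        return {self.current: 1.0 - self.t, self.target: self.t}
```

Because triggers can come from dialogue tags, combat or ambient moments alike, the same blender produces the smooth transitions the article describes across all of them.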
After 17 years working as an animator, nine of them at Naughty Dog, most recently as Lead Cinematic Animator on Uncharted: The Lost Legacy, Keith was already invested in finding the best way to visually capture a character’s feelings.
Yet the epiphany to create something groundbreaking for The Last of Us Part II came from, of all things, a single blade of grass.
“Everyone was stepping up their game for TLOUII,” recalls Paciello. “We were looking at and talking about how to make a blade of grass even better [in game]. In doing so, we panned up, and there was this blank face on the character. I was like, ‘Oh.’ It was then I wondered how we could simply, across the entire game, add these emotional beats to the characters, so at any point, you can tell what that character is feeling.”
Most of his development time would be spent solving that riddle. Using the Ellie facial model as a base, he started by sculpting expressions based on seven universally recognizable emotions: joy, sadness, fear, anger, surprise, disgust and contempt. Liaising with the dialogue team, they tagged emotions to be triggered at specific lines, building a solid foundation and allowing characters to smoothly transition from one to another. “We could emotionally pace out our characters from the absolute beginning of the game to the absolute end of the game and blend in and out of cinematics seamlessly.”
Paciello was then asked to support melee, fashioning realistic reactions for combatants. He also aided in animating the breathing system (“six animations, from small breathing all the way up to exhausted”), which even sees characters switch from open- to closed-mouth breathing depending on their proximity to an enemy. “That was one of my favourite collaborations on the project,” reminisces Paciello.
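The breathing behaviour described above boils down to a two-axis selection: exertion picks one of the six intensity levels, and enemy proximity flips the open/closed-mouth variant. A toy sketch, where the level names, threshold and function are all invented for illustration:

```python
def select_breathing(exertion, enemy_distance, stealth_radius=10.0):
    """Pick a breathing animation variant. Hypothetical sketch: the
    article mentions six animations from small breathing up to
    exhausted, plus an open/closed-mouth switch near enemies; the
    names and thresholds here are assumptions."""
    levels = ["small", "light", "medium", "heavy", "winded", "exhausted"]
    # Map exertion (0..1) onto the six intensity levels
    idx = min(int(exertion * len(levels)), len(levels) - 1)
    # Characters close their mouths to breathe quietly near enemies
    mouth = "closed" if enemy_distance < stealth_radius else "open"
    return f"breath_{levels[idx]}_{mouth}"
```

For example, a sprint-weary character far from danger would land on a heavy open-mouthed loop, while the same character crouched near a Clicker would get the quieter closed-mouth variant.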
Outside of script-specific triggers, characters would also have a number of ‘neutral’ idle states with emotional overlays. He also brought his expertise to hand-sculpting poses for each character to match their emotional states. Some 40 poses for each of the 15-20 emotion sets. By the end of production, he’d totted up an impressive 15,000 individual, hand-sculpted poses.
That hard work was paying off. When the system was fully activated, positive feedback was universal: staff, playtesters, even Keith himself were finding a deeper emotional connection with the game’s cast during gameplay.
Paciello points to a pivotal moment when Ellie is at Joel’s grave as an example of the seamless transitions between cinematic and gameplay. “Flip the camera [after the cinematic ends] and you see that she’s still devastated,” he explains. “I talked to the character team to see if we could get the textures and make sure it looks like she was just crying, so everything blends together nicely.”
Yet even with the system in place and working as promised, the teams continued to fine-tune every scene, further polishing each character’s emotional state. Ellie’s obvious exhaustion during the Santa Barbara sequence was the subject of additional focus. “We didn’t know if we wanted to go with pain, because she’s just gotten punctured. Ultimately Neil was happy with the exhausted look.”
And certain scenes demanded additional, unique flourishes to reinforce key character beats. Ellie’s reaction to seeing the Tyrannosaurus Rex statue during a flashback “was specially made for that moment,” while for the climactic beach fight, Paciello went in and redid all of Abby’s emotional sets to give the scene the gravitas it deserved.
“We all learnt a lot from that,” says Paciello. “When I first got to it, Abby, being so emaciated, still looked savage. I talked to Christian and said ‘I’m going to duplicate all her emotional sets for melee and I’m going to make them all exhausted’. That way you choose between fierce and exhausted, so maybe she musters up enough strength but after she throws [a punch], goes into her exhausted state.”
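The fierce-versus-exhausted choice Paciello describes amounts to a simple per-beat selection over the duplicated emotional sets. A toy sketch; the stamina threshold, names and function signature are invented here:

```python
def pick_melee_set(emotion_sets, stamina, just_attacked, threshold=0.3):
    """Select a combatant's facial-animation set during melee.
    Sketch of the idea in the interview: every emotional set gets a
    duplicated 'exhausted' variant, and the game chooses between
    fierce and exhausted per beat. All specifics are assumptions."""
    # A drained fighter, or one who just threw a punch, falls back
    # into the exhausted variant of the same emotional set
    variant = "exhausted" if just_attacked or stamina < threshold else "fierce"
    return emotion_sets[variant]
```

The point of duplicating the sets rather than authoring a single "tired" face is that exhaustion can overlay any emotion, so Abby can still read as furious and spent at once.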
I ask Paciello if he has any personal favourite moments from the game that show the system at work. Understandably, he quickly rattles off several (“Dina and Ellie’s conversation back and forth on horseback, or Lev and Abby. Yara, in the aquarium talking about Lev.”), but ultimately he comes back to that climactic beach scene. He sees it as the culmination of everything the teams had worked on and proof that the system had achieved its goal. As Ellie and Abby come to the end of their respective arcs in an explosive, exhausting clash, the emotional scars are clearly etched on their faces.
“It really made me feel like, ‘Okay, we’ve really pushed it.’ It’s what I wanted, it’s what I dreamt of when I pitched the idea of the system.”