Weta, the famed visual effects studio, has spent two decades turning impossible fantasies into cinematic reality, transporting audiences to the far corners of imagined worlds and bringing the unimaginable into our mundane plane of existence. The one constant? Giant apes.
Visual effects supervisor Erik Winquist has worked on three distinct franchises featuring massive computer-generated simian creatures over the last 13 years, a progression of films that chart the course of the company's development over that time. First came Weta founder Peter Jackson's King Kong reboot in 2005, then the three groundbreaking Planet of the Apes prequel films, and on to last month's rock-'em sock-'em blockbuster Rampage. To look closely at the apes in each film is to see how much has changed in the technology used to create them.
As with Darwin's enshrined theory, the evolution of primate rendering occurred through many small but crucial steps, imperceptible to the untrained eye on their own but adding up to undeniable, remarkable advancement. The starting point was no primordial ooze, either; in 2005, Kong was a massive achievement, a product of a beefed-up staff, further application of the motion capture used in the Lord of the Rings movies, and brand-new rendering systems for fur and details.
"Moore's law basically dictates that the graphics cards will get more powerful," Winquist tells SYFY WIRE, by way of explaining the basis of at least some of the technological leaps over the last decade and a half. "That helps us speed up the processing of what we can do now, which would have been completely impossible from a cost and time perspective to do back in 2005. The desire was always there, but the hardware finally caught up to what we had been wanting to do with it."
As Winquist explains, Kong was an important landmark in facial motion capture, as data captured from Andy Serkis' mug was used to inform the animation of the giant ape's own face. As can be glimpsed in the production diaries that accompanied that film, Serkis' face was covered in dozens of tiny marker dots that sent data back to the computer. That influenced the way the character was animated, and was revolutionary for the time, but looks a bit archaic in hindsight.
In working on 2009's Avatar, Weta introduced a virtual camera for motion capture that allowed filmmakers to watch a digitally rendered version of an actor's performance in real time. Over the years, advances in that technology have been paired with medical imaging and years of research into biology and primatology. By the time the first Planet of the Apes prequel began production, Weta was able to incorporate Andy Serkis' own features into Caesar, the leader of the ape uprising, and over the course of three movies included more and more of his distinct facial structure and expressions as the apes rapidly evolved in the story. They were truly hybrids, given their origin, which gave the company a certain amount of animation breathing room.
In the case of the Dwayne Johnson-starring Rampage, a more traditional gorilla was needed, even if the movie was based on an arcade game that took liberties with the species. And so actor Jason Liles provided the motion capture for George the ever-metastasizing ape, after which his facial features were combined with a gorilla's, a process that brought animation artistry into the fold.
"So much of that has come from a continuing push to be as physically plausible as possible every step of the way," Winquist says. "It's heavily grounded in the actual skeletal system of a gorilla. In the case of George, the muscles are entirely based on MRI scans of apes and medical reference that we've been studying for three years now. The entire way that the musculature and tissue system works in an actual ape, we're basically simulating a lot of those muscle interactions, the way that the fat and the skin slides over the top of all that."
All of those details are crucial, but audiences aren't exactly watching the musculature with any advanced knowledge of how apes move. Their eyes are on the surface — that's what people can immediately identify, and what helps distinguish good CGI from the mediocre and cheap. And while effects houses still haven't quite cracked the uncanny valley of entirely CGI humans, over the years Weta has figured out how to make increasingly realistic hair and fur. The company's artists use a fittingly named tool called Wig to edit fur down to the individual hair fiber.
Also crucial: rendering the environment in a realistic way. When you insert computer-generated creatures into a real-world scene — or generate an entirely CGI environment — you don't automatically benefit from seeing how the environment reacts to the character, or how the character reacts to the environment. That problem was solved by proprietary software known as Manuka.
"The tools that we've written since King Kong have made that process so much more intuitive for the artists and so much more real-time in terms of the way they can interact," Winquist said. "They can effectively brush fur and set the way that it clumps. There's been a number of papers published along the way from people internally about the way that we actually render and shade, based on the way that light actually interacts with actual hairs, based on studying microscopic imagery of actual hair fibers."
Manuka renders environments in real time during post-production, allowing effects pros to see how every element of the film — whether it's being filmed on location or created in the computer — will work together.
"It provides a literal simulation of the way that light bounces around the scene in the real world," he explained, "and the way that the light scatters through skin, fur, and everything else."
And yet despite the enormous advances in technology, sometimes old-school techniques were still required. When Winquist and his team wanted a physical stand-in on set to see how light might reflect off an albino gorilla during production, he went to the team at Weta Workshop, which makes physical models and props, and asked for a little bit of help.
"They dug back into their archives and they found the original mold foundation for the bust that they had made for King Kong in 2004," he said. "We made some changes to that, casting it in pink, fleshy silicone and hand-punching all the hair. So some of the original DNA from King Kong somehow works its way back into this movie, as well."