Spoilers ahead for this season of Westworld.
Creating a world that takes place in the future, employs virtual reality and simulations, and features many kinds of robots walking around, some real enough to pass for human, requires a lot of visual effects work. For Westworld, it's up to VFX supervisor Jay Worth to face those challenges. Worth chatted with SYFY WIRE prior to Sunday’s finale about the new technology they used on Season 3, designing new robots, and (accidentally) creating possible clues for fan theorists in the audience.
What was new for you this season?
As we leave the park, we were able to really develop and figure out what this new world looks like. We wanted a futuristic vibe, but not in the somewhat expected post-apocalyptic way, and we were trying to figure out what it would be. We had dinner one night with Danish architect Bjarke Ingels, because he happened to be in Spain. His buildings inspired a lot of what we captured, especially from Singapore, so that was fun. That was the baseline for our new reality.
You consulted with Jon Favreau regarding some new technology for The Mandalorian. How did that help you solve a particularly challenging optical phenomenon known as the parallax effect, which makes it difficult to create an illusion of depth in some VFX images?
Almost 5 years ago, [showrunner Jonathan Nolan] came to me and said, “I want to take a real-time game engine and be able to create a background that is computer-generated. But then you put a sensor on the camera, and sensors in the room, and you move the camera, and the background can, in real-time, translate to the camera, like a real three-dimensional background.”
We had this idea, and we built our own system, but the video cards couldn’t do it, and we couldn’t figure out the best methodology to do it. Fast forward a bit. We were able to visit Jon Favreau on the set of The Mandalorian, and we got to see, OK, the technology is now where we want it to be. How can we use it for Westworld, which is a very different challenge than some of the other shows out there using projection and LED technology?
We figured out how to do a massive LED-screen wall that would utilize real-time gaming technology and project the image, and that’s why in Episode 3, at Hale’s office at Delos, everything you see out the window was captured in-camera. We didn’t touch a single thing of it. As the camera moves, everything translates in real-time. We can really get the vibe from where the natural light rolls off, the tone in the room, all the reflections. We used a real angle from the City of Arts and Sciences in Spain to build out that asset, which in some ways makes it more challenging, because it has to hold up to the look and feel of being real, in terms of the geography and the lenses.
Then [Nolan] wanted to do these flying personal craft to go around the city. We knew we could use the LED technology for that, so we would not have to use a blue screen. We could have real reflections, real lighting on our actors the whole time. The number of times we scheduled and tried to go up in helicopters around L.A. to shoot our plates, through all the clouds of June gloom and the different restrictions based on sporting events! We had to fast-track the plates to get our future city placed into them so we could then project those plates for our actors while they were sitting in what we called our airpod on our stages. It was really amazing what our team built: kind of an amusement park ride with a gimbal that could bank, so when you see Dolores take off, she’s really leaning. It’s not Star Trek. The ship is actually moving, which creates a realism that is tangible.
Let’s talk about the new robots…
The interesting thing about creating robots is you want them to be unique, right? It’s hard to have them not feel like Star Wars robots or other kinds of robots. I think one of the challenges that we have is in building these more basic, utilitarian robots right next to the most high-tech, human-feeling robot you’ve ever seen in your life. That’s the odd thing, because they’re both robots, but one is just very basic and used at construction sites, like George, or built to withstand riots, or just a technical robot, like Harriet at Serac’s lab…
Wait, that’s what that robot is called?
Yes. We call her Harriet. She’s just doing normal operational stuff, but she still has to feel real, and in some ways, less human than the other robots.
And what about the riot robot? What’s his name?
Mech. Just Mech.
How did you do Mech? I’m guessing that we haven’t seen the last of his type, since it looks like a riot is happening in the finale, based on the teaser.
For Mech, it was a lot of work trying to figure out walk cycles, to make it feel like it’s really there, really large, and really destructive. [Laughs.] We had these boxes, and we didn’t want it to necessarily look like Transformers, if that makes any sense. We wanted it to be more mechanical, and not quite as sleek. If you were a riot robot and transported in large boxes and could assemble yourself, what would that look like? That was a fun challenge, because it all still makes sense design-wise. We didn’t see too many ways of how those pieces could go together. But even though we take some creative license with how they form together, we really wanted that to feel like that could really happen or could really work.
We didn’t do a lot of new things for the finale, other than what we’ve done before — crowd extensions, airpods, and some Mechs. It’s very large in scope. We had more shot volume than last year’s finale — close to 700 shots — but the majority of it was clean-up and continuities. And we just couldn’t, with the volume that we had and the timetable that we had, do all of it. We couldn’t remove everything.
Sometimes fans find clues, or what they think are clues, to propose theories from what could be continuity errors, or intentional continuity errors. For instance, there's a theory that this world is a simulation.
It’s not a hard-and-fast rule that there are zero traffic lights in our world. It’s just that they are much reduced, due to all automated traffic in our version of cities in Westworld. But there are still traffic lights and street signs in some pockets.
I found it interesting when Hale pricked the Man in Black with something, and the screen said, “protein,” how some people were like, “He must be a host!” or “He’s totally a human!” It’s interesting to see people battling back and forth over his identity, but we were clear in the story that what she pricked him with was tracking his location. We shot at an old detention facility, but we changed some of the signs to Spanish, because it’s supposed to be Mexico, and some we left in English. That’s not a continuity error. Or when the Man in Black has his full hand during the virtual reality therapy session. We had a lot of conversations about whether he should have his missing fingers or not. It makes more sense from a storytelling perspective that in his reality in that moment, he would have his hand. That’s how he would recognize himself.
I’ve had this experience my whole career, from Alias to Lost and Fringe. The amount of scrutiny for clues, and sometimes people just reading into a shadow on a wall, it’s amazing to me. And it’s fun!