Transcript for:
Cinematic Innovations in Middle-earth Disaster

When we read that Mount Doom was going to explode and create an environmental disaster for the land of Middle-earth, we really started to investigate what that would look like. Alex Disenhof, who is actually the youngest active member of the ASC, was our director of photography on episodes 6 and 7. It was a very exciting moment for me because the Peter Jackson films came out when I was in high school. Something about the magic that he brought to Middle-earth made me fall in love with the idea of making it a career.

So to get the call to be back in Middle-earth was a real treat. We knew we had to start in the village, picking up where we left off in Episode 6 with our main character, Galadriel.

Alex is a classically trained cinematographer, so he really comes at it from a photographic storytelling kind of way. So, two things. It's got a little TV.

When you get back to your sight-seeing bush. He's also uniquely... talented in technology. That's the first one.

So we didn't quite get to the edge there. No, we need to get to the edge. So then we need to do one more.

When I hit the ground in New Zealand, one of the first things we had to tackle was how to shoot this post-apocalyptic landscape. The preparation for this scene began with Ramsey Avery, our production designer, coming up with a few different looks.

The plan that he had gave us a look at the very beginning of the scene, when we're in the direct aftermath of the event. And in that moment, we're basically in a version of hell, right? We want the audience and our characters to feel that we've descended as far as possible before you die. I had firsthand experience with some wildfires in Portland that we looked at. Also, there was some imagery from wildfires in California in 2018. Once we settled on what the general look was, we were able to then home in on, okay, how do we achieve that?

We wanted to commit to that look. If you shoot it all in camera with that red look, you can't back out of it later. And we saw that limitation as a plus.

We knew that we needed control, so we determined that it needed to be on stage. For this sequence, the post-volcanic eruption, my main job was to make sure that Ron, Jason, and Alex all had what they needed on set and in preparation to achieve the look and feel and the vision that the team had. We came up with a color palette that was very specific.

We tested four rounds of different colors of reds, oranges, yellows. We didn't just want the world to be red. We wanted the world to be deep red in some of the darker tones.

And then kind of slowly backs out to an orange red as things get brighter. And then as you get into the fires, it kind of backs out even further into yellows. One of the key challenges was we didn't want the fires to go past yellow to green.

And you might think, well, fire doesn't do that. But we aren't just dealing with wavelengths of light that a computer is reading, right? We are dealing with human viewers.

And when you have a human viewer, and they're looking at a frame that's all red, then your brain starts to compensate, and your brain starts to tell you, that orange fire right there is actually green. And so that was something that Alex and I talked about a lot. We tried to see, like, what can we do to take care of that in camera?

We did try different filters, and I think we ended up using a filter to help us to get what we needed in terms of the rich color anyway. But there was really nothing that we could do because the problem was perceptual. The problem was all in our own brains.
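
That perceptual drift can be sketched with a toy model. Below is a rough illustration, assuming a simple per-channel, von Kries-style adaptation (nothing the production actually measured or used): once the eye adapts to a red-dominated surround, the same orange patch reads as if it has drifted toward green.

```python
# Rough illustration of why orange fire can read as green in an all-red frame.
# This uses a crude per-channel von Kries / gray-world adaptation model purely
# as a sketch; it is not how the show measured or handled the effect.
import colorsys

def adapt(rgb, surround):
    """Scale each channel by the surround it is adapted to, then renormalize."""
    scaled = [c / s for c, s in zip(rgb, surround)]
    peak = max(scaled)
    return tuple(c / peak for c in scaled)

orange_fire = (1.00, 0.55, 0.10)   # a typical orange flame color (invented values)
red_surround = (0.90, 0.25, 0.12)  # the red-dominated frame the eye adapts to

before_hue = colorsys.rgb_to_hsv(*orange_fire)[0] * 360
after_hue = colorsys.rgb_to_hsv(*adapt(orange_fire, red_surround))[0] * 360

print(f"hue as photographed: {before_hue:.0f} deg (orange)")
print(f"hue after adapting to the red surround: {after_hue:.0f} deg (drifted toward green)")
```

With these invented numbers, the "adapted" hue lands well past yellow, which is why no filter could fully fix it: the compensation happens in the viewer, not on the sensor.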

You know, it wasn't even in our eyes. Essentially, we created this very, very... narrow bandwidth of color on set.

What we came up with was a very specific look, with the SkyPanels creating a very narrow spectrum of light hitting the sensor. I was starving the sensor. We found the sweet spot, so within an image you'd have this beautiful gradient of color, even though really it was in one very specific, very narrow range of color. What we ended up having to do in post, and we did this in visual effects, and we did this with Skip Kimball at Company 3, is, in a very targeted way, with the fires that we added and the fires that we had on set practically, we had to go through each shot.

We had to make sure to shift them just right: the darker tones a certain amount, and the brighter tones even a little bit more, so that, to an unnatural degree, we shifted the fires even more toward red.
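
The exact grade lived with the VFX vendors and Company 3, but the kind of targeted operation being described can be sketched roughly like this; the hue range and shift amounts below are invented for illustration only.

```python
# Minimal sketch of the targeted shift described above: push fire hues back
# toward red, with brighter tones shifted a little more than darker ones.
# The amounts are invented; the real grade was done shot by shot in VFX and
# by Company 3, not with this code.
import colorsys

def shift_fire_toward_red(rgb, base=0.25, extra=0.35):
    """rgb in [0,1]. Scales hue toward 0 (red) by an amount that grows with brightness."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    if h < 0.25:                      # only touch the red/orange/yellow range
        amount = base + extra * v     # darker tones a certain amount, brighter a bit more
        h = h * (1.0 - amount)        # hue 0.0 is pure red
    return colorsys.hsv_to_rgb(h, s, v)

dark_ember = (0.35, 0.12, 0.03)       # dim orange-red
bright_flame = (1.00, 0.85, 0.20)     # bright yellow, the tone at risk of reading green

print(shift_fire_toward_red(dark_ember))
print(shift_fire_toward_red(bright_flame))
```

The key idea is simply that the shift is keyed to brightness, so the bright yellows that risk reading green get pulled back hardest.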

Okay, I'm happy. In terms of controlling the light, we had full control over every light on set. We created this massive softbox over and around the set. So we had a huge piece of muslin going all the way across the set from the top and the sides. And we had several hundred SkyPanels, S60s, in the roof, shooting through it. And then mostly S360s, ARRI SkyPanels, around, providing a soft backlight in any direction we looked.

And then I did use a hard tungsten Fresnel through the muslin to create a kind of hazy sun in the background, which VFX could then take and comp and move per the movement of the camera. Oftentimes, to create contrast, we would turn off half the lights on the ceiling and only put half the stage lights on, and so you automatically created a dark side to the camera. It became a very fast and efficient way, actually, once you started shooting, because we basically had lights everywhere. Ultimately, when we were on the ground shooting, all we really needed to do was add a little bit of eye light, and that was it.

One thing we did on this project in visual effects is we experimented a lot with different backings. This is a scene that's all fully red, it's very smoky. The world falls off into this fog bank that kind of disappears into nothingness.

And so we thought, okay, what makes sense to put back there? What's going to make it easiest for us to extend the world a little bit and then have it fade off into completely smoked out? We thought about black, and found that we had these details in frame, like silhouettes, that showed up in the red, smoky world as shades of darker red. And then, if we put black behind everything, it's almost like it came forward in a way, and kind of became a part of the scene and closed us off, so that to add anything, we would have to push it back with more smoke; before we could do anything, we would have to roto everything in the foreground just to be able to add more layers. But with the muslin, what we found is that it almost kind of acted as a final layer of smoke. That background muslin kind of pushed the world back.

And instead of feeling like it was right in front of camera, it felt like it was really, really far away. So now we saw, okay, that's where we need to add additional layers in visual effects to kind of extend that world. Once we hung up the muslin and we actually put some fog in the room and had the special effects going, it was really kind of incredible how the detail kind of fell away.

You know, even the seams in the muslin went away. All of the material that we were using to mount the muslin kind of went away. And we ended up with a pretty nice palette that in visual effects we were able to use to put stuff in the background.

Because it was pretty close to what we would expect that final layer of smoke depth to be, we didn't have to roto people's hair all throughout the frame to have that background. We were able to have that as a starting point and only where we were positioning some added element. Like, let's put a building right here.

Only where we did that did we have to roto. And what that meant... is we could choose where to put those things based on where people's hair was.
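
That roto logic follows directly from standard premultiplied "over" compositing. The sketch below is not the show's comp setup, and the pixel values, hair alpha, and building element are all invented; it just shows why a matte is only needed where an added element sits behind the plate.

```python
# A minimal sketch of standard premultiplied "over" compositing, to show why a
# new element placed *behind* the plate needs a foreground matte (roto) only
# where that element actually sits. All pixel values here are invented.
def over(fg_rgb, fg_alpha, bg_rgb):
    """Premultiplied foreground over background, per channel."""
    return tuple(f + (1.0 - fg_alpha) * b for f, b in zip(fg_rgb, bg_rgb))

muslin_plate = (0.42, 0.08, 0.05)     # the soft red "final layer of smoke" look
new_building = (0.20, 0.04, 0.03)     # a silhouetted shape we want far behind the actors

# Where nothing is added, the plate is simply used as-is: no matte needed.
pixel_without_element = muslin_plate

# Where the building goes behind a strand of hair, we need the hair's alpha
# from roto so the plate can sit over the added element.
hair_alpha = 0.7
hair_premult = tuple(c * hair_alpha for c in (0.05, 0.02, 0.02))  # dark hair, premultiplied
pixel_with_element = over(hair_premult, hair_alpha, new_building)

print(pixel_without_element)
print(pixel_with_element)
```

Everywhere else, the muslin plate already reads as that final layer of smoke and can be used as-is.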

And those shapes that we would add into the frame are all added in concert with everything we're adding in the foreground. Because in visual effects, we're adding some of the stuff that you might think was shot in camera, including some of the layers of smoke and some of the texture in the smoke. The black smoke in particular is stuff that we had to add in visual effects, because black smoke is not as safe to do practically, and the same goes for sparks flying through the air. And all of those things occupy a position in the frame, in space and in time.

When we went out onto location, the challenge was: how do we mimic that look without having the ability to light these massive forests? We had to rely on daylight. I had to mimic it using filtration. So we used a Tobacco 3 filter to start.

We dropped the filter in to get an image closer to what we had on stage. It wasn't a perfect match by any means, but then what my DIT and I could do on set was, through contrast and more color correction, push the image very, very close to what we had on stage. What we did find in testing was that when we didn't have the filters in the camera, pushing the color that extreme, it started to break.
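
That kind of on-set push is commonly expressed as an ASC CDL (slope, offset, power) plus a saturation adjustment, which is roughly what a DIT layers on top of the filtered image; the numbers below are invented and are not the show's values.

```python
# Sketch of the kind of on-set correction a DIT can layer on top of the filtered
# image: an ASC CDL (slope/offset/power) plus saturation. Values are invented.
import numpy as np

# Rec.709 luma weights used by the ASC CDL saturation step
LUMA = np.array([0.2126, 0.7152, 0.0722])

def apply_cdl(rgb, slope, offset, power, sat):
    """rgb: (..., 3) float array in [0,1]."""
    out = np.clip(rgb * slope + offset, 0.0, None) ** power
    luma = (out * LUMA).sum(axis=-1, keepdims=True)
    return luma + sat * (out - luma)

# A warm push toward the stage look: lift the reds, pull the blues, add contrast.
slope  = np.array([1.15, 0.95, 0.80])
offset = np.array([0.02, 0.00, -0.02])
power  = np.array([1.10, 1.10, 1.10])

pixel = np.array([0.45, 0.40, 0.38])   # a flat daylight-ish pixel through the Tobacco filter
print(apply_cdl(pixel, slope, offset, power, sat=1.2))
```

Because a CDL is just a small set of numbers riding with the footage, the same look can be recreated later in the grade.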

But as soon as you filter the camera and you get the warmer light coming through that Tobacco filter, it got the image in a really nice place where you could push it very extreme. The Alexa can be pushed to extreme places, but we were right on the edge of what was possible, and we found a way to do it. It was amazing to see how close we could get the image to our stage look, just using a Tobacco 3 filter, a Tobacco 2 filter, a Tobacco 1 filter, and then some color correction. The one thing I love about large format is the ability to shoot at my desired stop and still have a shallower depth of field, which was important in this scene because, ultimately, we did have a lot of visual effects, and we wanted the world to be a bit hazy behind our characters, not just through smoke, but through focus. If you were using Super 35 format, you'd have to be a little wider. So to be able to go to a slightly longer lens, to not have to open the lens all the way up to its widest aperture, and still have a shallow depth of field was quite nice.
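
The large-format point can be checked with the textbook depth-of-field formula. The focal lengths, stops, and circle-of-confusion values below are generic assumptions for Super 35 versus an LF-sized sensor, not the actual lens choices on the day.

```python
# Working through the textbook depth-of-field numbers behind that point.
# Focal lengths, stops, and circle-of-confusion values are generic assumptions
# for Super 35 vs. an LF-sized sensor, not what was actually shot.
def dof(focal_mm, t_stop, subject_mm, coc_mm):
    """Total depth of field via the standard hyperfocal-distance formulas."""
    h = focal_mm ** 2 / (t_stop * coc_mm) + focal_mm          # hyperfocal distance
    near = subject_mm * (h - focal_mm) / (h + subject_mm - 2 * focal_mm)
    far = subject_mm * (h - focal_mm) / (h - subject_mm)
    return far - near

subject = 3000.0  # 3 m to the actor

# Roughly matched fields of view: ~35 mm on Super 35 vs ~50 mm on large format.
s35_wide_open = dof(35.0, 2.0, subject, coc_mm=0.025)
lf_stopped_down = dof(50.0, 2.8, subject, coc_mm=0.037)

print(f"Super 35, 35 mm at T2:   {s35_wide_open:.0f} mm of depth of field")
print(f"LF,       50 mm at T2.8: {lf_stopped_down:.0f} mm of depth of field")
```

With those assumed numbers, the 50 mm at T2.8 on large format lands within about one percent of the 35 mm at T2 on Super 35, which is the practical benefit being described: a longer lens, a stop in hand, and still a shallow background.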

Where is he? Where is my son? Our characters leave the village to escape and ultimately they go through different landscapes, mostly forests, to higher ground.

As the characters journey out of it, we wanted to transition, and eventually they get back to blue sky. Alex and I started to kind of map out how this transition was going to happen, from the reddest red all the way through these yellows and pale grays through to blue sky again. We picked basically seven levels, and to each one of those levels we attached an image that we put together, kind of saying, okay, here are the ingredients that are going to take place in level one, here's what level two looks like, and here's what level three looks like. Those became kind of a touchstone for every single department, and so you start to get this sense, as you come through these seven stages, that you're leading the audience through these different stages of it's getting better and better, it's more and more okay. We used the DNA LF lenses. I inherited that choice from our first cinematographer, Óscar Faura. They are my favorite prime lenses.

I really love the look of the DNA LFs. Knowing that we were going to be shooting the show on them, I was very, very happy. I love the fact that you have all the modern mechanisms that allow us to shoot fast. They're lightweight. My assistant cameramen love them because of how easy they are to work with.

But for me, trying to visually realize the story, they provide this roundness to the image that's hard to explain. There's a softness to them that a lot of modern optics don't have.

The other nice thing about these DNA lenses is that, in spite of the fact that they're using vintage glass, they're still capturing metadata. And in visual effects, we're able to take that metadata, and that feeds into our visual effects production pipeline. From the very beginning, we knew that metadata was going to be a really important part of our structure. What we're discovering is that there are ways of using metadata that we had never imagined.

Metadata is any kind of information that's coming from the set with the files to tell us a little bit about how that file was made. So some metadata comes directly with the image files themselves. We also have our technicians on set who are writing down everything that they can learn about how something is being shot. And if you're an artist who's downstream, you might be receiving those files six months later. And that person has long since gone on to the next show.

But having that metadata flow through, it's almost like that artist is there on set. The metadata is their guide. And if it's not easily available to them, they won't look for it. They have too much to do.

But if it's presented and it's right there, that information is super useful for visual effects. The journey of the metadata for this particular sequence in 1.07 was of extreme importance just because of the marriage between visual effects and on-set photography. Having all of that information from all those different departments in one place, accessible to the VFX teams and to the post teams, was, I think, what made that sequence look as good as it did in the very end. Using ARRI systems, with smart lenses intelligently talking to the camera body and putting that information into the headers, we continue to inject information into the frame, and it continues basically all the way to final distribution and then beyond, into archive.
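
The production's actual metadata tooling isn't spelled out here, but the workflow being described amounts to indexing per-clip camera and lens data so it can be pulled up per shot months later. A minimal sketch, with an invented CSV export, invented field names, and invented clip IDs:

```python
# Sketch of the idea being described: per-clip lens/camera metadata travels with
# the files so a VFX artist months later can pull it up per shot. The CSV layout,
# field names, and clip IDs below are hypothetical, not the show's actual schema.
import csv
import io

SAMPLE = """clip,lens,focal_length_mm,focus_distance_m,t_stop
A317C004,DNA LF 50,50,3.2,2.8
A317C005,DNA LF 75,75,4.1,2.8
"""

def load_clip_metadata(csv_text):
    """Index on-set/camera metadata by clip name so it can be attached to VFX shots."""
    clips = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        clips[row["clip"]] = {
            "lens": row["lens"],
            "focal_length_mm": float(row["focal_length_mm"]),
            "focus_distance_m": float(row["focus_distance_m"]),
            "t_stop": float(row["t_stop"]),
        }
    return clips

shot = load_clip_metadata(SAMPLE)["A317C004"]
print(f"Matchmove seed: {shot['focal_length_mm']} mm at T{shot['t_stop']}, "
      f"focus {shot['focus_distance_m']} m")
```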

We, as a visual effects department, benefited greatly from having an ARRI camera and ARRI lens to feed that data, knowing that it was captured properly and gave us good negatives to work with. It just seemed like the platform of choice. Cloud-based production was the only way that we were going to get this particular show done.

As far as I know, Rings of Power is the first production to be fully cloud-based. The nature of multiple units shooting simultaneously, plus the new element of what happened in the world in 2020 and the COVID lockdowns, required us to utilize cloud-based technologies in a way that I don't think we've ever done before.

And in doing so, I think we were able to introduce kind of a new holistic approach to filmmaking where cloud wasn't just a place where we'd store stuff at the very end or maybe as an intermediate step, but was truly the foundation by which we were able to complete the show. We worked with a company in New Zealand, Rebel Fleet, to capture our material from the lab and send it directly to the cloud. We were sending the material directly to an S3 bucket where all of our camera originals resided.
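
Rebel Fleet's actual system isn't detailed here, but the broad idea, camera originals landing in an S3 bucket with their metadata and ready for every department, can be sketched with boto3; the bucket name, key layout, and metadata fields below are hypothetical.

```python
# The broad strokes of what's being described: camera originals land in an S3
# bucket with their metadata, ready for downstream departments. Bucket name,
# key layout, and metadata fields are hypothetical; the production's actual
# tooling (via Rebel Fleet) is not public.
import boto3

BUCKET = "show-camera-originals"   # hypothetical bucket

s3 = boto3.client("s3")

def upload_camera_original(local_path, shoot_day, clip_name, lens, t_stop):
    """Push one camera original to the bucket with some descriptive metadata."""
    key = f"{shoot_day}/{clip_name}"
    s3.upload_file(
        local_path,
        BUCKET,
        key,
        ExtraArgs={"Metadata": {"lens": lens, "t_stop": t_stop}},
    )
    return key

def list_clips(shoot_day):
    """Everything from a shoot day, ready to go for editorial, VFX, and color."""
    resp = s3.list_objects_v2(Bucket=BUCKET, Prefix=f"{shoot_day}/")
    return [obj["Key"] for obj in resp.get("Contents", [])]

# Example usage (hypothetical paths and clip names):
# upload_camera_original("/mnt/lab/A317C004.mxf", "day_042", "A317C004.mxf", "DNA LF 50", "2.8")
# print(list_clips("day_042"))
```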

It was there with metadata, ready to go at all times. From there till the end, we worked with Company 3 and Blackmagic Design to come up with color correction in the cloud. So we had color correction where all of the material was cloud-based. We were looking at a monitor in New Zealand. Our colorist, Skip Kimball, was sitting in Los Angeles.

We were looking at the same type of monitor, same color correction, and all of the files were in the cloud. None of it was local. People talk about cloud-based film production as the wave of the future. It's actually present. It is here now.

It is actually being done. What I love about ARRI is that, from the time when we were still using film to now, where we're cloud-based, ARRI has been there from the very beginning.

Now, as we move into the digital world and cloud-based computing, ARRI and the Alexa cameras and the smart lenses all communicating together is really how cloud-based computing works. We're working with ARRI to make everything talk and work together into the future. For such a visual-effects-heavy production, we felt completely free to be able to just make the film that we wanted to make as we were going. We didn't have to worry about getting really specific for visual effects, because all the metadata was already baked into the pipeline.

The visual effects team of Ron and Jason and all of their amazing artists had such an infrastructure in place: there was detailed note-taking happening on their end, there were witness cameras, there was all this amazing stuff that they had put in, where they were essentially surrounding us with this tech bubble that we could just work within and not think about. They would take care of everything, and we were able to get the final product out of the pipeline.

And so it was actually one of the most freeing experiences I've had on this kind of level of filmmaking in terms of the technical side. I didn't have to worry about any of it. We could just focus on telling the story.