Further Uncovering VFX

Charlie Tait and Michael Sarkis Discuss VFX Challenges, Camera Settings, and Production Best Practices

Collaboration between crew and an understanding of other departments are key focuses of the Techos’ Guild workshop series. Following on from the VFX event earlier this year, NZTECHO put a few questions to Weta Digital’s head of compositing Charlie Tait and senior on-set VFX technical director Michael Sarkis.

You’d have to live on Mars not to know that the team at Weta Digital are world leaders in VFX production. Weta Digital is behind 39 feature films, with another four in production now. Currently more than 1100 crew are working on The Hobbit and other projects. VFX is a growing component of filmmaking these days, and the more crew understand it, the better prepared we will be for this changing landscape. Weta Digital’s head of compositing Charlie Tait and senior on-set VFX technical director Michael Sarkis take a moment for some Q&A.

1. When to use green screen/when to use blue screen?

Charlie Tait: Green is generally preferable. The blue channel of the light spectrum, whether on film or video, is always noisier, which makes keying more difficult. Blue has historically been favoured because it is opposite to skin tones. The most significant factor in deciding between green and blue is what colour is in the foreground and which of the two will provide the most contrast. That said, The Hobbit had a lot of green in its forest and landscape sets and we still chose green screen: in tests we found it easier to key greenish foreground objects over green than over blue, because the blue channel was showing a lot of noise and compression. The important thing is to perform tests before shooting begins, to determine what will work best.
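
To make the keying trade-off concrete, here is a minimal difference-key sketch in Python with OpenCV and NumPy. This is not Weta’s pipeline; the file name and threshold values are hypothetical. It simply shows why noise in the channels being compared flows directly into the matte.

```python
import cv2
import numpy as np

# Load the plate and normalise to 0-1 floats. OpenCV reads images as BGR.
plate = cv2.imread("greenscreen_plate.png").astype(np.float32) / 255.0
b, g, r = cv2.split(plate)

# Simple difference key: how much greener is each pixel than its
# next-strongest channel? Noise in any channel feeds straight into the
# matte, which is why a noisy blue channel makes blue screens harder to key.
matte = g - np.maximum(r, b)
matte = np.clip((matte - 0.05) / 0.30, 0.0, 1.0)  # lift/gain chosen by eye

cv2.imwrite("matte.png", (matte * 255).astype(np.uint8))
```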

2. How do motion capture and survey markers work? Does anything interfere with them on set, and how are they used?

Michael Sarkis: The placement of markers on greenscreens or sets is part of the forensics process necessary in visual effects. In the same way that a crime scene may need to be reconstructed, the film set will often need to be rebuilt in a CG environment. Much of what we’ve been doing on The Hobbit and other projects at Weta Digital requires critical contact between the filmed set and performers, set extensions, and CG creatures. With few exceptions, a virtual camera will need to be created in order to precisely match the lens and motion of the cine camera for each shot. If the virtual camera doesn’t exactly match the movement of the filmed plate for every frame, the CG simply won’t look right.

Adding markers to the set gives us something visual to ‘track’ through the hero plate, which may otherwise have only a plain greenscreen background. That plain greenscreen is great for achieving a key to extract the foreground element in compositing, but not so good for analysing the camera motion in order to then render a CG background or creatures. Once we know that tracking markers are visible in shot, measuring or surveying them with a theodolite/total station provides the matchmove artist with the correct spatial and scale information to calculate the camera’s motion.
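
As a rough illustration of what those surveyed marker positions make possible, the sketch below uses OpenCV’s solvePnP to recover a camera pose from known 3D marker coordinates and their tracked 2D positions in a single frame. All of the numbers, the focal length and the file-free setup are made up for illustration; this is not Weta’s matchmove toolset.

```python
import cv2
import numpy as np

# Surveyed marker positions in metres (set/world space), e.g. from a
# total station. Values are hypothetical.
object_points = np.array([
    [0.0, 0.0, 0.0],
    [2.0, 0.0, 0.0],
    [2.0, 1.5, 0.0],
    [0.0, 1.5, 0.0],
    [1.0, 0.75, 0.5],
    [3.0, 0.5, 1.0],
], dtype=np.float64)

# Where those same markers were tracked in the plate, in pixels.
image_points = np.array([
    [512.3, 700.1], [1300.8, 690.4], [1290.2, 310.7],
    [520.9, 320.5], [905.6, 500.2], [1650.4, 410.9],
], dtype=np.float64)

# Intrinsics from a (hypothetical) lens calibration: focal length in pixels
# and the principal point, assuming no distortion for simplicity.
camera_matrix = np.array([
    [2200.0, 0.0, 960.0],
    [0.0, 2200.0, 540.0],
    [0.0, 0.0, 1.0],
])
dist_coeffs = np.zeros(5)

ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                              camera_matrix, dist_coeffs)
print("camera rotation (Rodrigues vector):", rvec.ravel())
print("camera translation:", tvec.ravel())
# Repeating this per frame, with many more markers and a distortion model,
# gives the virtual camera path that a matchmove artist then refines.
```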

Too many tracking markers, though, will defeat the purpose of the greenscreen and corrupt the key, so there’s a constant balance between getting a clean key and being able to recreate the physical camera movement. Add to this multiple cameras shooting from different angles on different lenses, and striking that balance becomes more challenging. Then throw in frequently moving greenscreens, wind on location or from studio fans blowing the markers on greenscreen curtains, and flickering effects lighting spilling onto the greenscreen, all of which affect the ability to both key and track those markers.

You’d expect this would all take meticulous planning, given the sheer number of virtual weapons, head replacements, mystical animals and every type of simulation possible. ‘Yeah, nah’ as they say here. These productions have to shoot a lot of material and don’t want to slow down for anything, including visual effects. This usually means many set changes and constantly moving greenscreen setups, all of which need to be re-surveyed, photographed and documented for virtual reconstruction. It’s really up to the on-set VFX crew to keep up with the pace of shooting and gather the info needed.

There are certainly occasions when the analogy of a crime scene doesn’t seem too far off the mark.

3. What methods do you use to record and recreate the lighting on set when building the VFX?

Michael Sarkis: In the same way that we aim to reconstruct the film set or camera move spatially, we need to match the lighting, at least as a starting point. The method used at Weta is image-based lighting (IBL), which comprises a stitched set of ‘high dynamic range’ images (HDRi). By setting up a stills camera with a fisheye lens right where the action takes place on set, we’ll photograph a series of RAW images facing north, east, south, west, up and down. Each position is bracketed through a range of 12+ stops. This yields a spherical map of the environment surrounding the subject in front of the camera, with detail from the brightest light down to the shadows. Once in CG, this photographic sphere is used to illuminate a virtual object or character, in a similar way to light passing through a slide projector onto a wall, and it can then be dialled up or down to best match the exposure and colour. In conjunction with this, we’ll do what is affectionately known as a ‘ball pass’ with the cine cameras. Walking an 18% grey ball and a chrome ball through the set allows for the correct orientation of the IBL map and acts as a guide for confirming the accurate matching of specular and diffuse light falling on objects of known reflectivity. The little-mentioned benefit of the ball passes is that they also allow for the ubiquitous and never-ending opportunities for ball jokes.
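
For a sense of the HDRi step, here is a minimal sketch of merging one bracketed exposure series into a single floating-point radiance map using OpenCV’s Debevec tools. File names and shutter times are hypothetical, and a real IBL would be built from RAW frames and stitched across all six directions rather than from JPEGs of one direction.

```python
import cv2
import numpy as np

# Hypothetical shutter times for one bracketed position (seconds).
exposure_times = np.array([1/1000, 1/250, 1/60, 1/15, 1/4, 1.0],
                          dtype=np.float32)
images = [cv2.imread(f"bracket_{i}.jpg") for i in range(len(exposure_times))]

# Recover the camera response curve, then merge the brackets into a
# radiance map with detail from the brightest highlights down to the shadows.
calibrate = cv2.createCalibrateDebevec()
response = calibrate.process(images, exposure_times)
merge = cv2.createMergeDebevec()
hdr = merge.process(images, exposure_times, response)

# One of the six directions that would later be stitched into the IBL sphere.
cv2.imwrite("panel_north.hdr", hdr)
```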

4. How do camera settings and setup affect the placement and addition of visual effects?

Michael Sarkis: All DOPs are concerned about the final quality of their image, including the VFX that may be required to complete it. In the same way that any camera crew will test and prepare their equipment for each project, visual effects requires a setup and calibration process with those specific cameras. Dynamic range, colour rendition and lens aberrations all need to be measured and matched in order for the rendered CG element to blend seamlessly with the live-action footage. There was a thorough testing period on The Hobbit which allowed the specific RED camera settings to be decided upon. Ultimately you want to capture as much information in that image as possible. As cameras differ, they need to be optimised so that the final result with VFX holds up for stereo images on an IMAX screen or a Blu-ray playing on a 60-inch TV.
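
One concrete example of that calibration work is measuring lens distortion. The sketch below uses a standard checkerboard calibration in OpenCV to estimate the distortion coefficients that let a plate be undistorted, or a CG render distorted to match. The grid size and file paths are hypothetical and this is not the actual test procedure used on The Hobbit.

```python
import glob
import cv2
import numpy as np

pattern = (9, 6)  # inner corners of the checkerboard target
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for path in glob.glob("lens_grids/*.png"):  # frames of the chart, hypothetical
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Solve for intrinsics (K) and radial/tangential distortion coefficients.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("reprojection error:", rms)
print("distortion coefficients:", dist.ravel())
# 'dist' is what lets the plate be undistorted, or the CG render distorted,
# so the two line up.
```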

This doesn’t just go for the technical image specs but also for the stereo alignment when shooting 3D. Correcting for misaligned stereo images will have a flow-on effect on the time and resources required to add the CGI and deliver each shot. Shooting at 48 frames per second will require twice the disk space. For a large show that needs to calculate per-frame rendering times for CG water simulation, these details become significant factors to consider.
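
A quick back-of-envelope calculation shows how those multipliers stack up. The per-frame size below is an assumed figure for illustration, not the actual RED data rate.

```python
# Illustrative only: assumed per-frame size, not a real camera spec.
bytes_per_frame = 10 * 1024**2          # assume ~10 MB per recorded frame
shoot_seconds = 4 * 60 * 60             # four hours of material in a day
eyes = 2                                # stereo: left and right cameras

for fps in (24, 48):
    total = bytes_per_frame * fps * shoot_seconds * eyes
    print(f"{fps} fps stereo: {total / 1024**4:.1f} TB for the day's material")

# Doubling the frame rate doubles the frame count, hence the disk space, and
# the same multiplier applies to every per-frame render or simulation cost.
```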

So rather than being seen as a postproduction process, visual effects needs to be considered an involved part of preproduction. How something is shot for visual effects will have an impact on the final production costs; it is cause and effect. Who pays for the difference is the question.

Having said that, the role of VFX is to enhance a film, not to impede the creative process. So aside from being well prepared, there isn’t much left to do with the way cameras are used… or abused, for that matter. There’s no sort of handheld, running-through-rain or stunt-on-all-terrain-tracking-vehicle-through-inhospitable-landscape shot that we haven’t tackled.

5. Where are the most common missed opportunities with respect to VFX on set? Lack of time to calibrate, inability to LiDAR, etc.?

Michael Sarkis: Lack of time is the biggest killer, but you do what you can. Being involved and aware of what’s coming up is crucial, but ultimately, regardless of the budget, it’s all guerrilla filmmaking. The most common missed opportunities occur when you’ve missed that quick comment about lowering the camera between takes, and all of a sudden there’s no greenscreen in the top of frame.

You need to have your ear to the ground and respond quickly to changes. The two minutes it would have taken to raise the greenscreen while final checks were happening could have saved thousands of dollars. Being close to the director when it’s decided, for safety reasons, to replace a real sword with a green stick can also mean a different approach to what data needs to be gathered in order to minimise the impact of that decision and allow it to be achieved to standard.

Being practical and pragmatic when time is short and light is running out is the only real option. A film set is such a dynamic place, especially with stunts, weapons and special effects explosions, and there are always lots of other people who want to stand where you are trying to survey or get a LiDAR scan. Sometimes the best option, knowing a quick turnaround is needed, is to come in at 4am to pre-scan the sets you know will be unavailable later.

Although you may like to think people recognise the importance of visual effects, they all have jobs to do too. The entire set needs to be struck on wrap, but we have 50 VFX shots on it with water simulation: how can you squeeze in the 30 minutes to get two scans? If there is no time for LiDAR when filming, it could be rescheduled, or it may mean you get the data through photos and some survey. As the VFX person there trying to capture the necessary data for a cost-effective and optimal-quality result, you may still need to compromise.

Like all other departments, VFX needs an understanding of the film's schedule and the flexibility to change a plan when necessary.

6. What has the experience taught you about how you approach the work back at Weta Digital?

Charlie Tait: The greatest insight I gained from being on set was knowing the details and history of incoming shots, and why decisions were made on set that may not be ideal for us to work with in post. I learned that there is often a trade-off to be weighed up as to whether a problem should be fixed on set or left for us to do later.
