WWI: The Cinematography of ‘The Jungle Book’

The Jungle Book (2016) is a live-action recreation of the 1967 animated classic. Its extravagant visual effects and beautiful look were achieved largely through bluescreen work, yet the lighting and the creative use of camera angles that coincide with this film-making process are incredible. Bill Pope’s apt use of lighting and camerawork in coordination with the visual effects setups was careful and well thought out, and shared the same creative vision as the project’s director, Jon Favreau. Although this modern virtual production was built around computer-generated animals and scenery, it adopted a very experienced DP in Bill Pope, who had worked on Spider-Man, The Matrix and Men in Black. He was therefore no stranger to digital production, but this film takes it to the next level, with around 95% of the visible environment generated digitally. The VFX producer was intent on making Jon Favreau and Bill Pope’s traditional film-making sensibilities easy to translate onto this project in order to achieve greater realism.

Clearly the pre-production process was designed to create an environment beautiful to the eye. When considering how to shoot the film, the team realised it is very difficult to control how light moves through the trees, along with all the other limiting factors of a real jungle. MPC spearheaded the design process, designing every shot of every scene and painting the environment and talent. After this concept stage the crew used motion-capture suits, with capture provided by Animatrik, to record the movement for the majority of the film as a 3D reference and to gather resources. The virtual art department, led by Gary Roberts, then went into the environment and 3D-captured elements such as a rock or a plant to create a catalogue of unique assets, so that DP Bill Pope could move around and arrange the set using a headset. Pope would then find the correct angles and move elements around, like virtual location scouting. After this, Pope and Roberts experimented with the lighting to create a rough pass on the look of the jungle for almost every scene, using RenderMan as their software tool. Finding the dappled light informed the crew how to light the practical set during principal photography. In this way the DP knew everything about the final render and could adapt to that knowledge.
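To make the catalogue-and-scouting idea above concrete, here is a minimal Python sketch with entirely hypothetical names and structure (this is not the studio's actual pipeline): each scanned element is stored once and then instanced into a virtual set with its own transform, which is what lets the DP rearrange the jungle from a headset.

```python
from dataclasses import dataclass, field

# Hypothetical illustration of an asset catalogue for virtual scouting.
# Each 3D-captured element lives once in the catalogue; the set is a list
# of placements (asset + position + rotation) that can be shuffled freely.

@dataclass
class CapturedAsset:
    name: str
    mesh_file: str       # path to the 3D scan of the element
    texture_file: str

@dataclass
class SetDressing:
    asset: CapturedAsset
    position: tuple      # metres in set space
    rotation_deg: float  # yaw, so an element can be spun to taste

@dataclass
class VirtualSet:
    dressings: list = field(default_factory=list)

    def place(self, asset, position, rotation_deg=0.0):
        self.dressings.append(SetDressing(asset, position, rotation_deg))

    def move(self, index, new_position):
        """Virtual location scouting: nudge an element without rebuilding anything."""
        self.dressings[index].position = new_position

# Build a tiny catalogue and dress a corner of the jungle with it.
rock = CapturedAsset("mossy_rock", "scans/rock_01.obj", "scans/rock_01.png")
fern = CapturedAsset("fern", "scans/fern_03.obj", "scans/fern_03.png")

jungle = VirtualSet()
jungle.place(rock, (2.0, 0.0, 5.0))
jungle.place(fern, (1.2, 0.0, 4.1), rotation_deg=35.0)
jungle.move(0, (2.5, 0.0, 4.8))   # the DP asks for the rock a little camera-left
print(len(jungle.dressings), jungle.dressings[0].position)
```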

To capture the film, part of the process used the Cameron Pace Alexa M smart rig with Panavision Primo lenses on a Libra head, with massive extensions from the Technocrane. The lighting was very complex, as it had to create an overlap between real and virtual lighting. They used sun hits to create dappled light at a beautiful angle, designed by Bill Pope, in addition to some complex ambient lighting from above. A lot of real-world lighting had to take place to make sure that Mowgli seemed interactive and to make that stick with the overall design of the lighting and cinematography.

A lot of live handheld camerawork was used in the actual film to create an organic feel. This was enhanced by a physical set, with mud and tree bark for the talent to interact with during the more intimate gestures. At times they would use real camera moves, such as a Technocrane on track with a Chapman F3, either to recreate the actual movement or to use the live-action plate. This was perfect for dollying and booming simultaneously. When filming element shots they would build a set around the talent so that if there was a weather system such as rain there was an added layer of realism. To heighten this they used puppet animators to engage the live-action talent and get the timing right, producing real laughter and an emotional sensibility.

Simulcam was used, with motion-capture markers on the camera so that its position and rotation could be tracked. That transform was then copied onto a virtual camera, so that the viewfinder showed the live action and the virtual set together. This was thanks to the planning in pre-production, and it created a cycle between production and post as new scenes were added. The real camera sat on top of the Photon virtual-camera software (a DD + Microsoft collaboration), which combined traditional MotionBuilder workflows with the Unity game engine. Variations of this technique were used on Gravity, Life of Pi and Avatar. With this knowledge one can decide how to use practical lights, such as torches, in terms of brightness and flickering for effect, so that confident decisions can be made and the work on set becomes more interactive.
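As a rough illustration of the Simulcam idea, the sketch below (Python, with hypothetical names throughout, not the Photon API) shows the core of one frame of that loop: the pose tracked from the markers on the physical camera drives the virtual camera, and the live plate is blended over the CG render for the operator's viewfinder.

```python
import numpy as np
from dataclasses import dataclass

# Illustrative sketch only: the tracked pose of the physical camera drives a
# virtual camera, and the bluescreen plate is blended over the CG render for
# the viewfinder. Names and stand-in renders are hypothetical.

@dataclass
class TrackedPose:
    position: np.ndarray   # metres, solved from the mocap markers on the camera
    rotation: np.ndarray   # 3x3 rotation matrix from the same markers

def virtual_camera_matrix(pose: TrackedPose) -> np.ndarray:
    """Build a 4x4 world transform for the CG camera from the tracked pose."""
    m = np.eye(4)
    m[:3, :3] = pose.rotation
    m[:3, 3] = pose.position
    return m

def render_virtual_set(camera_world: np.ndarray, w=320, h=180) -> np.ndarray:
    """Stand-in for the game-engine render of the jungle set (flat grey here)."""
    return np.full((h, w, 3), 0.4)

def viewfinder_composite(live_plate, cg, matte):
    """Blend the live plate over the CG background using a simple holdout matte."""
    return matte[..., None] * live_plate + (1.0 - matte[..., None]) * cg

# One frame: a fake pose, plate and matte stand in for the real inputs.
pose = TrackedPose(position=np.array([0.0, 1.6, -3.0]), rotation=np.eye(3))
cg = render_virtual_set(virtual_camera_matrix(pose))
plate = np.full_like(cg, 0.8)                      # live-action frame of the talent
matte = np.zeros(cg.shape[:2])
matte[60:120, 140:180] = 1.0                       # region where the talent is kept
frame = viewfinder_composite(plate, cg, matte)
print(frame.shape, frame.min(), frame.max())
```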

The setup of the set itself is quite interesting. At each corner there were large blue T-shaped blocks holding the motion-capture cameras, with spacelights in the air, the entire set surrounded by blue, and the curtain partially surrounding the 12Ks. There were LEDs on the floor to light the panels, as well as ambient lights. This was the typical lighting set-up. During more exposed shots, 12K lights were fired through the trees for the dappled effect, alongside 8x5 Lumapanels to add fill light from a separate direction. The crew went out of their way to use an electric car and a handheld camera so as to get two angles simultaneously while being tracked. This added more layers of realism despite the process being CG. They were also very creative with the lighting: for example, they used a projector behind a diffusion sheet to create a more natural light source, alongside a couple of bounces and backlights. The process is very much a hybrid, in which the lighting is designed in 3D first and then the DP works alongside the VFX team and the set design team to get that design correctly recreated on the built set.

To create more mobile shots they used a turntable so that the talent could be continuously walking, with added bumps to give the impression of naturally rough ground. Programmable gobo lights could move around so that it seems as though the talent is moving through the leaves. To create general light, a large 40x40 source had 12K tungsten lights pushed into it, alongside some warmer fill and hard backlights. To film the water scenes, a water tank was used, surrounded by blue and lit by real light, with hard light from an 18K. The Panavision lenses were mapped so that the focal lengths and distances could be relayed onto the out-of-focus background CGI executed in post-production. To add realism to this approach, many of the final shots blow out for part of the frame, much as in real life, so as to achieve a more convincing imitation of a bright light source. The depth of field and flaring in the final result behave much like a Panavision Primo, but with additional control and style. Knowing that they are not bound by reality, the filmmakers can restructure scenes to suit their evolving purposes.
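As an illustration of why mapping the lenses matters, the short sketch below uses the standard thin-lens blur-circle formula to show how a recorded focal length, stop and focus distance translate into the size of the background defocus that the CG has to match. The example numbers are hypothetical, not metadata from the film.

```python
# Thin-lens approximation: blur-circle diameter on the sensor for an object at
# subject_m when the lens is focused at focus_m. Example values are made up.

def blur_circle_mm(focal_mm, t_stop, focus_m, subject_m):
    f = focal_mm / 1000.0            # work in metres
    aperture = f / t_stop            # entrance-pupil diameter
    c = aperture * f * abs(subject_m - focus_m) / (subject_m * (focus_m - f))
    return c * 1000.0                # back to millimetres

# Talent in focus at 2.5 m on a 50 mm lens at T2; the jungle wall sits 12 m away.
c = blur_circle_mm(focal_mm=50, t_stop=2.0, focus_m=2.5, subject_m=12.0)
print(f"background blur circle: {c:.3f} mm on the sensor")
```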

In one particular sequence, when Mowgli is clambering to the top of a hill, a couple of 12Ks are used, dappling the light and creating a haze with a realistic light pattern. The camera runs behind the talent as though they were in the forest, and that camera move is used in the final effect. In a different scene, when Mowgli is being carried off by the monkeys, a series of capture artists in blue suits were used while respecting the aesthetics of the shot. Then, as the scene progresses, the same actors are used again alongside animal cutouts so that the timing of his response is correct.

When Mowgli is escaping through the long grass, the Technocrane pulls back so that the operator knows where to put the heads of the puppets to be interacted with. Moreover, this virtual preview approach allows Pope to keep making his compositional decisions for the final shot while on set. When Shere Khan is introduced, he is seen on a ledge with the light behind him blown out, with lens flares creating a silhouette that is both ominous and ugly. This shot would be difficult to achieve for real, as the set would offer only half an hour in which to shoot it, so the virtually generated result is much more favourable. There are real-world problems to solve in deciding on the camera angles and lighting for this shot, which proves once again Pope's involvement in the final product and the cinematic vision.

In the moments before Johansson's snake sequence, a panning shot moves out from behind a tree silhouette to reveal the haze of the limited light, the perspective, the composition, and the contribution of the virtual art department in creating the set. Another example of mind-blowing integration is the Bare Necessities sequence. The integration of Mowgli and the bear Baloo in close quarters is phenomenally convincing, with fake bokeh on the lens and Panavision Primo focus characteristics. Then again, in the cave, the lighting seems organic and natural in its misshapenness as it falls onto the cave floor, catching some foliage with the correct exposure and transparency, and with sub-surface scattering on the honeycomb. Moreover, the blocking of the scene as seen through Pope's eyes, with Baloo facing away, makes it feel abundantly realistic and full of real-world decisions. Later, when a long-lens shot of Baloo's sniffing nose reveals detailed nostrils, blurred hair and the reflection in his eyes, we see some beautiful lighting.

In conclusion, as the industry melds virtual and traditional filming techniques, pioneers such as Bill Pope will stand out for the way they layer lighting, camera moves and talent direction as this new and innovative film-making technique progresses. The Jungle Book is an exquisitely shot and beautifully designed film which makes full use of current technologies to recreate the real world authentically and in its favour. Pope has mastered the digital/live-action crossover in a comprehensive and creative manner.
