REMINDER: Context is key; this information should be taken as a guide only. The details of this article can vary based on the production and the creative intent.
Additionally, Netflix has produced a series of informative videos and modules designed to provide valuable insights into virtual production methodologies.
WHAT IS IN-CAMERA VFX?
With In-Camera VFX, the ultimate goal is to utilize display surfaces (such as LED walls with accurately projected content) to achieve the final visual effect in-camera during the shoot, requiring minimal to no post-production work.
Whether it's two people sitting in a car, a spaceship jumping to warp speed, or recreating on-location lighting on a sound stage, Virtual Production technologies can empower cinematographers and their crews to create dynamic lighting, natural reflections, and realistic backdrops on set. This enhances the realism of the imagery and helps provide important creative cues.
There are two key approaches to displaying content for In-Camera VFX: 2D and 3D.
2D IN-CAMERA VFX: Example of plate-based driving shoot
3D IN-CAMERA VFX: Example of real-time set extension
2D In-Camera VFX involves projecting pre-rendered or live-action footage at a fixed perspective on a display surface, achieving a high level of image quality and immersion. With driving comps, for example, plate arrays are shot simultaneously from a variety of angles, then reprojected on a display surface (LED screen) to emulate the experience of a sequence shot with a car process trailer. Part of the value of this method is that it is easily repeatable on a controlled sound stage.
3D In-Camera VFX involves the use of image output from real-time engines to a live display surface, synchronized with camera tracking to produce final-pixel imagery in-camera. Content generated for 3D In-Camera VFX is not rendered into a final video format, but rather something that can be manipulated live through platforms such as Unreal Engine, Unity, or Notch (to name a few).
However, 2.5D In-Camera VFX is becoming more common as a middle-ground approach that reduces real-time complexity whilst still allowing control on the day. It is a blend of 2D and 3D: the furthest elements of a background (e.g., a distant horizon) are baked out and pre-rendered, while foreground elements are created in 3D, or as separate layers, and can be adjusted in real time. An example below demonstrates layers within a 2.5D Virtual Production scene.
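To make the 3D approach more concrete: the core geometric step is projecting the tracked camera's view frustum onto the wall, so the engine knows which region of the LED surface must be rendered from the camera's exact perspective (often called the inner frustum). The sketch below is a hypothetical, simplified illustration of that ray-plane intersection for a single flat wall; production systems (such as Unreal Engine's nDisplay) also handle curved walls, tracking latency, and sync, none of which is shown here.

```python
import numpy as np

def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Return the point where a ray hits a plane, or None if it misses."""
    denom = np.dot(direction, plane_normal)
    if abs(denom) < 1e-9:  # ray is parallel to the wall
        return None
    t = np.dot(plane_point - origin, plane_normal) / denom
    return origin + t * direction if t > 0 else None

def inner_frustum_on_wall(cam_pos, cam_forward, cam_up,
                          h_fov_deg, aspect, wall_point, wall_normal):
    """Project the four corner rays of the camera frustum onto a flat wall.

    Returns the four wall-space corner points (top-left, top-right,
    bottom-right, bottom-left); entries are None where a ray misses the wall.
    """
    h = np.tan(np.radians(h_fov_deg) / 2.0)  # half-width at unit depth
    v = h / aspect                           # half-height at unit depth
    right = np.cross(cam_up, cam_forward)
    corners = []
    for sx, sy in [(-1, 1), (1, 1), (1, -1), (-1, -1)]:
        d = cam_forward + sx * h * right + sy * v * cam_up
        d = d / np.linalg.norm(d)
        corners.append(ray_plane_intersection(cam_pos, d, wall_point, wall_normal))
    return corners
```

For a camera square-on to the wall with a 90° horizontal field of view at 4 m distance, the projected region spans 8 m of wall width, which is why lens choice and camera distance drive how much of the LED surface must carry camera-quality content.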
THE CHALLENGE AND POSSIBLE SOLUTIONS
When watching content, you’ll often see talent in a vehicle, whether it be a car, plane, train, or boat. To achieve the shots in these traveling sequences, there are three shoot methodologies to choose from: practically on location, in front of a green or blue screen, or on a stage with LED panels.
ON-LOCATION CAR PROCESS SHOOTS
Traditional driving photography requires large process trailers to support the vehicle, camera car, and crew members. Permits and police officers are required to hold or direct traffic. Continuity might be sacrificed when the actors’ hair, makeup, and wardrobe support crew cannot reach the performers due to time constraints or location logistics. Changing light and the time it takes to reset the process trailer can restrict creative choice.
Example of a process trailer
GREEN SCREEN CAR SHOOTS
This method seems simple: load a vehicle onto a sound stage, roll up a green screen, and shoot! There are three key challenges, however. First, the car must be accurately lit as if it were traveling down a street, through a city, or out in the countryside at multiple times of day. Second, the Director must explain to the talent and editorial what is happening beyond the green screen. Third, the visual effects work requires an artist to key out the green screen, seamlessly composite a practical driving plate into the shot, and add reflections to the vehicle and talent. While logistically the most straightforward method, the end result may vary in quality.
Example of green screen car shoot.
INTERACTIVE LIGHTING
To help with lighting and reflections on the car, LED panels or traditional rear/front projection can be used for interactive lighting on set. Pairing interactive lighting with green or blue screen backgrounds can eliminate some VFX work in Post.
Example of interactive lighting on sound stage.
IN-CAMERA VFX
This methodology removes all the logistical challenges of traditional process trailer shoots as well as the artifacts associated with driving comps. It may also remove or reduce the need to physically build a set or travel to a specific location. The LED panels, serving as a background or window exterior, display still or moving images made up of practically photographed plates, CG generated digital environments, or some combination of the two. The need to travel a large cast and crew to the location is eliminated. Depending on the scene, the number of crew members required to shoot on a soundstage may also be reduced.
Compared to traditional green screen composites, the use of In-Camera VFX also has the potential to reduce the resources required in Post Production to finish shots. By adding backgrounds to scenes in a studio environment with LED panels, the camera operator can frame as they would any real object, and the actors can react to the actual final imagery, live. This not only provides cast and crew with a visual environment to reference in real time; the natural reflections and lighting from the screen also provide important artistic cues and enhance the realism of the imagery, avoiding the typical struggles of a green screen (like unwanted reflections and color spill on the subjects).
However, this solution is not without its own challenges, as the LED infrastructure can be costly. There is a range of financial models to explore, from pop-up LED stages to purpose-built infrastructure designed for this type of work. The best execution of this work can yield savings on physical production costs, a reduction in VFX shots, and an expedited post timeline, but more importantly, a more integrated visual look.
VISUAL EXAMPLES
CARS
Example of sound stage set up for in-camera VFX. Source: Territory Studios
Example of plate-based driving shoot.
TRAINS
Example of dailies from stage set up below.
Overhead visual of the soundstage for clip above.
WINDOWS
Example of LED outside of the window.
CREW
Working with In-Camera VFX will require a shift of some traditional crew responsibilities, as well as the addition of new roles.
The following is an example of roles (both existing and new) and responsibilities for crew involved with ICVFX workflows:
Director of Photography: Critical in approving the content, from capture through to how it looks on the day, even more so than on a traditional shoot, as the LEDs provide both the lighting and the imagery being captured through the lens.
Virtual Production Supervisor: A new and emerging role, similar to a VFX Supervisor, serving as the key point person for all of the moving parts within the Virtual Production pipeline and ensuring that all infrastructure, teams, and phases of production are moving according to plan from prep through shoot. Works closely with the VFX Supervisor, Post Supervisor, and Production leads to ensure alignment throughout.
DIT: Plays a crucial role in linking the creative look with the technical appearance of the LED panels. Can often aid with the signal chain and with identifying QC issues.
Production Designer: Understands the scope of the required set build: what will be practical, and what will be displayed on the wall. Through set design, can help hide the blend where the stage meets the bottom of the wall.
VFX Supervisor: If there are plans to incorporate VFX into display content, or onto plates in Post, it is imperative to have the VFX Supervisor involved in pre-production and on set to avoid costly redos down the line.
Editorial: Often involved in plate selects prior to the shoot, and possibly in assembling a previs/techvis animatic for the Media Server Operator and additional crew, so that they understand the number of setups and camera angles required.
Production Supervisor: Connects additional personnel for essential conversations and coordinates additional/alternative logistics (e.g., connecting Transportation with the LED Technician to understand the layout of the walls and figure out how to load in cars).
Gaffer: Key stakeholder in the LED panel set-up; plays a critical role in ensuring safety and functionality alongside other lighting elements.
Post Supervisor: Works with the Director/DP/Editorial to deliver plate selects for each scene, ensuring that they are up to spec according to the Virtual Production Supervisor and Media Server Operator.
Media Server Operator (or 'Engine Operator' in 3D): A new role responsible for the ingestion and operation of content on set, staging content according to the shoot schedule. Also operates the display of content, positions and adjusts plates on the display, and plays/resets footage as needed. Depending on the infrastructure and scope of the shoot, two operators may be needed.
LED Technician (or 'LED Engineer'): A new position in charge of the physical set-up, movement, adjustment, and tear-down of the LED wall infrastructure. Also needed on set during the shoot in case of any troubleshooting with the LED panels.
Virtual Art Department (or 'Real-Time/Game Engine Artist(s)'): A new and emerging department or role(s). The Virtual Art Department (VAD) can be a company, contractor, or individual hired to create virtual content or 'digital assets' to be used in a Virtual Production set-up (i.e., on the LED wall). They will be on board throughout the prep phase and potentially on set, making changes to VP assets in real time if necessary. Typically 'Virtual Art Department' is associated with CG asset creation, while 'Plate Capture Vendor' refers to the company specifically hired to capture practical 2D plates.
* Title variations may exist, since these are new roles in film/TV production and are still being defined.
TIMELINE
Because of the pre-production planning needed to execute the in-camera visual effect, the order of operations and average timelines will be affected. Certain production crew will need to come on board earlier than normal, and some decisions typically left to Post will need to be made prior to the shoot.
The timeline below is a generic representation of the order of operations when prepping and shooting in-camera VFX with LEDs and 2D playback for driving comps, focusing on the importance/goal of each step.
NB: Timelines can vary depending on the complexity of the In-Camera VFX approach. Please refer to the Netflix Production Manual for specific guidance on timelines based on content type, and consult your Netflix Virtual Production experts in the region.
- SCOPE ASSESSMENT: This is where expectations are set and everyone is level-set. The first step is understanding what the production is trying to achieve: What are the sequences (action, mood, time of day)? What are the areas of focus (shooting into LED has some limitations)? Will special effects be used (smoke, rain)?
- KICKOFF: Sometimes a specific Virtual Production kickoff or workflow call is needed. For the DoP, it may mean going through how lighting and color will work on the VP set. The Gaffer will likely ask who will be operating any integrated lighting controls affected by the LEDs. Depending on the content being displayed, the Post, VFX, and Virtual Production Supervisors will likely need to work together to source/create and deliver footage.
- CONTENT/TECHVIS/STAGE RECCE: Each of these steps informs the others. On-set infrastructure and Plate Capture & Conform will need to be aligned on specs and requirements so that content can be properly displayed to expectations. Techvis can lay out potential camera options based on the content, or show exactly how things need to be staged in order to achieve a desired shot. The layout of the stage is a key input to Techvis in determining where everything and everyone will physically need to be on the day, ensuring that everything fits and that shots/sequences are achievable. Techvis can primarily solve content orientation or camera moves (whichever is more imperative to start), but will ultimately help inform both.
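As an illustration of the kind of arithmetic techvis resolves, the hypothetical sketch below (not a Netflix tool) computes how much wall a given lens needs to fill the frame at a given camera distance, which feeds directly into whether a shot fits on the stage:

```python
import math

def wall_coverage_needed(distance_m, h_fov_deg, aspect=16 / 9):
    """Wall width and height (metres) needed to fully fill the frame
    for a camera square-on to the wall at the given distance."""
    width = 2.0 * distance_m * math.tan(math.radians(h_fov_deg) / 2.0)
    return width, width / aspect
```

For example, a 90° horizontal field of view at 4 m from the wall needs roughly 8 m of wall width; angled cameras and camera moves push that requirement higher, which is exactly what a full techvis pass resolves.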
- PERMITS/WEIGHT APPROVALS: To ensure production safety and confirm the weight capacity of the sound stage, the Transportation team will need to provide the make/model/weights of cars and the Lighting/Grip + LED teams will need to submit weights for rigging.
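A simple way to sanity-check the submitted weights ahead of formal approval is sketched below, with hypothetical numbers and a placeholder safety margin; actual capacities and margins must come from the stage's structural engineer:

```python
def check_stage_load(items, capacity_kg, safety_margin=0.2):
    """Sum submitted vehicle/rigging weights and compare against the
    stage's rated capacity, keeping a safety margin in reserve.

    items: list of (name, weight_kg) tuples.
    Note: the 20% default margin is an illustrative placeholder only.
    """
    total = sum(weight for _name, weight in items)
    usable = capacity_kg * (1.0 - safety_margin)
    return {"total_kg": total, "usable_kg": usable, "ok": total <= usable}
```

Running this early, as make/model/weight numbers come in from Transportation and Lighting/Grip, surfaces capacity problems before the rigging plan is locked.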
- INGESTION/ORGANIZATION: This is when the Media Server Operator receives and stages the content as it will be needed on the day. The order of files should reflect the shooting order according to the shooting schedule, and content requirements and details should be added to the call sheet for each shoot day. Prior to the ingest process, submission guidelines (e.g., codec, file naming convention) should be shared with whoever is responsible for submitting content to the LED Vendor or Media Server Operators, in good time ahead of the shoot day. In addition to the call sheet, a guide should be created to highlight what the Operator should know about the content (e.g., 'intentionally flopped', 'no LUT'). Content should be prepped, optimized for performance, and QC'ed according to the shoot schedule, so that scene swaps can be done swiftly.
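Submission guidelines of this kind can also be enforced automatically at ingest. The sketch below assumes a hypothetical naming convention and container whitelist; the real convention and formats would come from the LED vendor or Media Server Operator's guidelines:

```python
import re
from pathlib import Path

# Hypothetical convention: <scene>_<plate>_<version>.<ext>, e.g. SC012_plateA_v003.mov
NAME_PATTERN = re.compile(r"^SC\d{3}_[A-Za-z0-9]+_v\d{3}$")
ALLOWED_EXTS = {".mov", ".mxf"}  # assumed containers; confirm with the vendor

def validate_submission(filenames):
    """Split submitted files into (accepted, rejected-with-reason) lists."""
    accepted, rejected = [], []
    for name in filenames:
        path = Path(name)
        if path.suffix.lower() not in ALLOWED_EXTS:
            rejected.append((name, "unsupported container"))
        elif not NAME_PATTERN.match(path.stem):
            rejected.append((name, "naming convention mismatch"))
        else:
            accepted.append(name)
    return accepted, rejected
```

Catching a mis-named or mis-encoded file at submission time is far cheaper than discovering it on the media server the morning of the shoot.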
- RUN-THROUGH (or 'PRE-LIGHT'): Stage and test everything. Set up lighting, then review and orient the content on the display as needed for setups (note: setups can be 'saved' like a file, so they can be queued up quickly on the day). This step is crucial; otherwise, time will be spent doing this during production, and compromises will likely be made for the sake of time, resulting in unplanned fixes in Post.
- SHOOT: Showtime! Content adjustments such as scale, orientation, and a slight perspective shift are achievable as needed. However, it’s important to pre-determine what elements you would expect to change on the day with the Director and DOP, as not everything is always possible in real time. Get the shot, reset instantly, and ship to Editorial as standard with the intent of the shot captured in camera. This is what it’s all about!
- DAILIES/QC/REVIEW: It's crucial to check the dailies for any potential artifacting, which can present itself when shooting In-Camera VFX. Most issues should be visible on the 4K monitor during the shoot, but it's good practice to have the VP Supervisor check the dailies after wrap, as per the Netflix QC Glossary, in case anything needs to be addressed in Post or fixed ahead of the next shoot day. Inevitably there may be some artifacts, moiré, or other minor fixes needed.