REMINDER: Context is key; this information should be taken as broad strokes on a subject that can vary based on creative intent. Please read “2D LED In-Camera VFX Field Guide Overview” for context on where the information presented here comes from and is intended for.
TABLE OF CONTENTS
Frequently Asked Questions
- Color Shift
- Color Banding
- Content Color Mismatch
- Optical Moiré Patterns
- Image Aliasing Artifacts
- Compression Blocking & Noise
- Playback Lag
- Flicker
- Screen Reflections
- Sound Reflections
Glossary -- UPDATE 1/19/21: an industry-wide effort to establish common vocabulary for virtual production professionals.
- Aliasing
- Bit Depth
- Bit Rate
- Chroma Key
- Codec
- Content Player
- Depth of Field
- Genlock
- Heat Dissipation/Conduction
- Image Processor
- LED
- Metameric Failure
- Metamerism
- Multiplexing Drive
- Photosite
- Pixel Pitch
- Refresh Rate
- RGB Dithering
- Viewing Angle
- Volume Stage
Frequently Asked Questions
Color Shift
Due to the way that LED panels are currently built, a color shift of the content might appear when looking through the camera versus what you see with your eyes. The shift can impact the whole LED wall evenly, or just a portion of the wall. The shift can be static or it can vary over time.
To diagnose this issue, the best approach is to use the camera (even if it can be seen by eye). If the issue appears only when the camera is positioned off axis, then it means that the camera is outside the optimal viewing angle of your LED screen. This can be checked on the LED panel spec sheet. If the camera is straight/parallel to the screen, and the shift appears evenly on the LED wall or a section of LED panels, then the shift might be caused by overheating. This occurs when the screens have been on for too long, if bright content is on the screen for too long, or from poor heat dissipation/conduction of the screen.
If the shift is caused by an off-axis camera position and changing the shot to get on axis isn’t an option, try to pan/tilt the screen to be more parallel to the camera sensor. LED panels with a larger viewing angle could also solve the issue.
To determine if the panels are overheating, consider switching the screens off for a period of time or reducing the overall brightness.
Color Banding
Color banding occurs when an inaccurate representation of the color of the content generates abrupt changes between shades of the same color, instead of gentle gradients. This is commonly caused by a low bit depth of the video pipeline and/or of the content.
If the problem is caused by the media, a possible solution may be to change the encoding codec or increase its bit rate or bit depth. If the problem is within the video pipeline, some settings changes might be required on your content player or image processor. You may also need to change hardware to maintain the right bit depth. In both cases, a possible solution might also be introducing RGB dithering to the content.
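As a rough illustration of both the artifact and the fix (a sketch with arbitrary bit depth and ramp size, not any particular pipeline), quantizing a smooth gradient to a low bit depth collapses it into a small number of discrete bands, while adding noise before quantization (dithering) trades the bands for fine grain:

```python
import numpy as np

def quantize(signal, bits):
    """Quantize a [0, 1] signal to 2**bits levels (simulates a low-bit-depth link)."""
    levels = 2 ** bits - 1
    return np.round(signal * levels) / levels

def quantize_dithered(signal, bits, seed=0):
    """Add noise of about one quantization step before rounding (simple dithering)."""
    rng = np.random.default_rng(seed)
    levels = 2 ** bits - 1
    noise = (rng.random(signal.shape) - 0.5) / levels
    return np.round((signal + noise) * levels) / levels

ramp = np.linspace(0.0, 1.0, 4096)      # a smooth gradient
banded = quantize(ramp, 6)              # 6-bit: collapses to 64 distinct shades
dithered = quantize_dithered(ramp, 6)   # noise breaks the visible bands up

print(len(np.unique(banded)))           # 64
```

The dithered signal still uses only 64 code values, but the error is spread as noise rather than as hard band edges, which the eye (and camera) tolerates far better.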
Content Color Mismatch
There are a number of elements in the image chain that might affect what the content will look like on the LED wall. Sometimes, the way it looks through the camera is not what we expect. This can be caused by two major elements: 1) a misleading expectation of what the content should look like or 2) a failure in how the color pipeline is set up (color pipelines are designed to make the content look as expected on camera).
Content color mismatching issues usually fall into one of the following categories:
Strong hue shifts on the image, mostly caused by incorrect calibration or misinterpretation of the color space of the content
Incorrectly perceived contrast, mostly caused by a gamma (EOTF/OETF) mismatch
Crushed/clipped values in the shadows or highlights
Lack of contrast and milky blacks
Oversaturated or desaturated colors, mostly caused by gamut/color space mismatches
The expectation is the most crucial thing: LED walls have to be considered light emitters, not displays. The difference stems mostly from the fact that their intended observer is not our eyes and brain, but the camera, and cameras react to light differently than the human visual system does. This is due to metameric failure, a phenomenon based on the concept of metamerism. Camera metameric failures are easy to spot, as they mostly exist only within what the camera “sees”. If the color mismatch is visible to your eyes as well, the cause might be somewhere other than the camera in the chain.
Please note: If the images look wrong to your eyes, but good on camera, don’t worry! It’s a success as you have reached the required metamerism on camera!
If the images look wrong on camera and also look bad to your eyes, then the issues can be found in one of the following elements (from content to camera): 1) the content color metadata; 2) the media player; 3) the image processor; 4) the LED wall calibration; 5) the white point of the wall vs. the scene lighting and the relative camera white point; 6) the camera color space.
Deep pipeline diagnosis
CONTENT: Try to play back the content on a different system or media player; if it looks as intended, proceed to the next step.
PLAYBACK SYSTEM: If your content player is interpreting the colors of your content incorrectly, that can usually be spotted within the UI of the media player. It is also possible (and more accurate) to intercept the output signal of the content player by plugging it directly into a monitor (normally the transport protocol is HDMI, DisplayPort, or SDI). If the signal looks right, proceed to the next step.
IMAGE PROCESSOR: Image processors use the metadata tags coming on the carrier protocol and interface (e.g. the HDMI or DisplayPort cable) to establish how to treat the incoming colors. Make sure that these match your expected color space and transfer function.
LED WALL: If your walls have been calibrated, make sure that the target calibration matches with the expected output of the image processor. Additionally, make sure that the white point of the screens matches that of the content (usually D65).
LIGHT AND CAMERA SETTINGS: If the scene is lit by other light sources on set and some of those lights need to be perceived as white sources by the camera, the cinematographer will most likely offset the camera white point to match one of these lights. If that white point is completely different than the one from the wall, then the perceived picture will look shifted (either warmer or cooler than expected, sometimes also either more magenta or green).
CAMERA COLOR SPACE: If a calibration process between the camera sensor and the wall has been performed, this will most probably be based on a specific color space on the camera. If that changes, the calibration might be off and colors will look incorrect or different.
If the content on the LED looks right to your eyes, but not on camera, it is due to a metameric failure. The solution is to offset the colors of the content so that they look right on camera (and probably wrong to your eyes). This can be done creatively or mathematically, via standard color transform operators such as 3D LUTs applied to the content playing back on the LED panels.
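In production this offset would typically be a measured 3D LUT from a camera/wall calibration pass. As a minimal sketch of the principle only, the toy example below applies a hypothetical per-channel 1D LUT (all values illustrative, not a real calibration) to pre-compensate content before playback:

```python
import numpy as np

def apply_1d_lut(rgb, lut):
    """Look up each channel value in a 1D LUT (N rows x 3 channels) with interpolation."""
    xs = np.linspace(0.0, 1.0, lut.shape[0])
    out = np.empty_like(rgb)
    for c in range(3):  # R, G, B
        out[..., c] = np.interp(rgb[..., c], xs, lut[:, c])
    return out

# Hypothetical correction: pull red highlights down slightly, gently lift
# blue, leave green untouched. A real 3D LUT would also handle cross-channel
# interactions, which a per-channel LUT cannot.
lut = np.stack([
    np.linspace(0.0, 0.95, 17),         # R
    np.linspace(0.0, 1.0, 17),          # G (identity)
    np.linspace(0.0, 1.0, 17) ** 0.95,  # B
], axis=1)

pixel = np.array([[1.0, 0.5, 0.25]])
corrected = apply_1d_lut(pixel, lut)
```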
If the problem is on the content side, then it will likely need to be re-rendered or re-exported or the metadata should be changed to reflect the correct color instructions.
If the problem is in the media player, this might be solved by forcing the content player to interpret the correct colors or, worst case, by changing the content to match what the content player can interpret.
If the problem is on the image processor, most of the time the color tags on the processor can be altered and overridden. If that’s not possible, the content and the content player output should be adjusted accordingly.
If the problem is in the LED wall calibration, the LED vendor may need to re-calibrate the walls. This might take some time and might not be possible once the LEDs are set up and installed in a volume stage, so please make sure that the screens are calibrated before installing them. If the calibration has been done and it’s not in line with the content color pipeline (and a re-calibration is not possible), then you should change your content color pipeline accordingly.
If the problem is the lighting difference and the white point of the camera, shift the LED wall white point to match the camera’s (although this is not suggested) or change the content white point to match the desired one (this can be done with grading tools or color transforms applied to the content before rendering, or on the media player). Another option is to change the on-set lighting white point to bring it closer to that of the content and the LED pipeline.
If the problem is the camera settings, revert to the settings used during the camera calibration or, if that’s not possible, perform a new calibration based on the new settings.
Optical Moiré Patterns
A moiré pattern is an image artifact that is generated when two fine patterns interfere with each other. When shooting LED walls, the pattern created by the LEDs on the panels might conflict with the camera sensor’s photosite pattern and create visual artifacts such as color banding and multi-colored stair-stepping artifacts (a form of aliasing).
Moiré artifacts can only be seen by the camera. In fact, these artifacts can only be accurately diagnosed when all the elements of the image pipeline and shooting equipment are in place. Moiré patterns are influenced by a combination of (in order in the image chain): a) LED panel pixel pitch; b) any glass, thick smoke, gel or silk that might be between the LED panel and the camera; c) distance/angle of the camera to the LED panel; d) shooting lens and/or filter package; e) aperture and focus point of the lens and the resulting depth of field; f) camera sensor structure and its OLPF.
Your eyes might see some moiré, but it is not the same moiré the camera sees (see Image Aliasing Artifacts for more context on this). A different combination of camera, lenses, filters and any of the above elements may also produce moiré, and the artifact may change whenever any of those elements changes.
All of the possible solutions to fix a moiré artifact will have ramifications on the shoot. The best immediate solution is to reduce the depth of field so that the background LED pattern becomes blurred, reducing or eliminating the interference. Another suggestion is to find a different angle from which to shoot the LED wall, or to change the distance between the camera and the panel. An ultimate solution could be to change the pixel pitch of the LED wall. Be mindful that while a smaller pixel pitch will indeed alter (and most times reduce) the problem, it might not solve it completely and might introduce side effects (e.g. reduced overall brightness capability of the screen, increased LED cost and power consumption, different/increased resolutions, etc.). If you are using the LED panels for chroma keys, it is sometimes an option to place a silk or a frost lighting gel in front of the LED screen to reduce or eliminate the artifacts.
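As a rough sanity check on the depth-of-field approach (thin-lens approximations; the lens, distances and pitch below are illustrative, and the "several times larger" threshold is a rule of thumb, not an established figure), you can compare the defocus blur the wall receives against the size of one LED pixel as imaged on the sensor:

```python
def blur_vs_pitch(f_mm, n_stop, focus_m, wall_m, pitch_mm):
    """Ratio of defocus blur at the wall to the imaged LED pixel pitch (thin lens)."""
    f = f_mm / 1000.0
    s, d = focus_m, wall_m
    # Blur-circle diameter on the sensor for an object at distance d, focused at s.
    blur = (f * f * abs(d - s)) / (n_stop * d * (s - f))
    # Size of one LED pixel as imaged on the sensor (magnification ~ f / (d - f)).
    pitch_on_sensor = (pitch_mm / 1000.0) * f / (d - f)
    return blur / pitch_on_sensor

# 50 mm lens at f/2.0, actor in focus at 3 m, wall at 6 m, 2.6 mm pixel pitch:
ratio = blur_vs_pitch(50, 2.0, 3.0, 6.0, 2.6)
print(round(ratio, 1))  # ~9.7: the LED grid is blurred well past one pixel
```

When the ratio is large, the LED grid is no longer resolved by the sensor and moiré becomes unlikely; values near or below 1 suggest the grid is sharp enough to interfere.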
Image Aliasing Artifacts
Similar to moiré patterns, there are a number of image-related artifacts that can affect content. Some of these fall under the category of aliasing artifacts: they appear as stair-stepping lines around the edges of some elements of your content or along high-contrast lines, or as signals in your content that should be distinct but have become indistinguishable. They can be caused either by sampling issues (up- or down-scaling) within the image pipeline, especially within the content player or from the content player to the LED screen (specifically within the image processor), or by poor content creation: poor capture image quality, low capture/render resolution, or an inefficient scaling filter.
Unlike moiré patterns, these artifacts will be visible to your eyes directly; in fact, more of them may be visible to your eyes than to the camera. To diagnose where the problem comes from, the first thing to do is make sure that the content provided to the content player doesn’t already suffer from these issues. If it doesn’t, make sure that the resolution of the content matches the intended output resolution of the content player and the resolution of the LED panels (e.g. the content is 4K, the content player outputs 4K, the image goes to 4K LED panels). The issue will likely be a resolution mismatch between the content and the canvas that the content player is trying to fill, forcing it to scale the content (up or down) in real time. Content players and image processors can scale your content in real time, but most of the time the scaling algorithm won’t be very sophisticated, and this can introduce artifacts.
If the issue is in the original content, go back to its creation chain and make sure that there are no mistakes there. This may involve re-rendering the content using a better scaling algorithm or applying a filter to improve the source image quality and limit the artifact (which often happens when the capture quality was not ideal to begin with). If the issue doesn’t appear on the content when seen at 1:1 scale, then it’s probably a resolution mismatch between the content and some element of the chain. Ideally, the created content will fill/fit your LED wall. This sometimes means having to produce content at massive and unusual resolutions and/or to split up the content into multiple canvases to fit the project requirements. Make an effort not to apply any real-time scaling within the content player or within the image processor. If the content is in line with the expected resolution, then the problem may be caused by an output scaling of the content player to (or within) the image processor (e.g. the content is 4K, the content player graphics card outputs 1080p, the LED wall resolution is 4K). If this is the problem, try to change the output resolution of the content player or look to source a different one.
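A tiny illustration of why naive real-time scaling matters (a sketch with synthetic data, not a model of any specific scaler): dropping pixels to downscale aliases fine detail completely, while even a simple box filter preserves the correct average energy:

```python
import numpy as np

# One scanline of fine detail: stripes alternating black/white every pixel.
stripes = np.tile([0.0, 1.0], 512)

# Naive 2:1 "scaling" by dropping every other pixel aliases the stripes
# into a flat black field:
decimated = stripes[::2]

# A filtered downscale (a simple 2-pixel box average) keeps the correct
# average energy instead:
filtered = stripes.reshape(-1, 2).mean(axis=1)

print(decimated.max(), filtered.mean())  # 0.0 0.5
```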
Compression Blocking & Noise
Similar to banding, these artifacts are caused mostly by compression and appear as visible “squares” on the content.
To diagnose these issues, start at the beginning and check the source media. If this is good, follow the pipeline down to make sure the chain from the content player to the image processor to panel is maintaining the intended signal.
POSSIBLE SOLUTION: If the problem is in the media, a possible solution may be changing the encoding codec or increasing its bit rate or bit depth. If the problem is in the video pipeline, check the settings on your content player or image processor. Another solution would be to change the hardware to maintain the right bit depth. In both cases, a possible solution could be to introduce RGB dithering to the content.
Playback Lag
Issues playing content back in real-time (or at the desired frame rate) can have many causes, usually related to the content playback server and/or the media. Common causes include choice of codec, resolution, or graphics card capabilities.
In most cases, this issue comes from the content player. First, check the content for encoding issues (if played frame by frame, there should be no jumps across each frame). If it plays smoothly on the Content Player, check for issues on the Image Processor. Next, check if the codec and the resolution of the content are in line with the specs of the Content Player.
If the issue is within the Content Player playback capabilities, and you are unable to change the hardware, consider changing the encoding codec of the content to an alternative that is optimized for the Content Player. You could also reduce the resolution or increase compression but be mindful that this can cause other issues.
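One simple way to confirm a playback-lag problem objectively (a sketch; the timestamps here are synthetic and the 1.5x threshold is an arbitrary choice) is to log frame presentation times and count intervals that exceed the nominal frame time:

```python
def dropped_frames(timestamps, fps, slack=1.5):
    """Count frame intervals longer than `slack` times the nominal frame time."""
    nominal = 1.0 / fps
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return sum(1 for g in gaps if g > slack * nominal)

# Simulated presentation times at 24 fps with one dropped frame: everything
# after frame 4 arrives one frame late, leaving a double-length gap.
t = [i / 24 for i in range(10)]
t[5:] = [x + 1 / 24 for x in t[5:]]
print(dropped_frames(t, 24))  # 1
```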
Flicker
Flicker artifacts mostly appear as visible changes in brightness between refresh cycles on the LED wall. These can be seen as dark bands that move vertically or horizontally at various speeds, as bright bands that move vertically, or as static dark stripes across the screen.
To diagnose these issues, the best approach is to use your eyes. If you can see a flicker on the LED wall with your eyes, it could be a power issue. If the flicker is only visible in the camera, it could be caused by a genlock mismatch/issue within the pipeline or a slow multiplexing drive of the LED panels. Start by checking whether all of the elements of the pipeline (camera, content player, image processor) are correctly receiving and accepting the genlock signal. If the flicker is static (fixed dark stripes across the screen), it might depend on the camera shutter speed - which should be on the same frequency and phase as the screen. If the flicker only appears when moving the camera fast (especially tilting up/down), it might be caused by a slow refresh rate or multiplexing of the LED wall. The refresh rate of the image processor may also be a factor.
If the problem is a dynamic/moving flicker, make sure the genlock generator is feeding every element of your pipeline. Sometimes these can be wirelessly connected, but most of the time they are connected via HD-SDI. If the problem is a slow refresh rate, check if the image processor and your LEDs can run at a higher frequency (but make sure that this frequency is always a multiple of the base genlock frequency). Even if the content player can only operate at the base genlock speed, the image processor can increase the refresh rate by repeating each frame without losing genlock. If the problem is a static flicker (striped black bands), check if the camera shutter speed matches (or is a multiple of) the base genlock refresh rate (e.g. a 1/48 sec shutter = 48 Hz, a multiple of 24 fps) and/or modify the phase of the camera sensor scan until the lines disappear.
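The shutter/genlock relationship above can be sketched as a quick check (assuming the simple integer-multiple rule described here; real cameras may also need a phase adjustment):

```python
def shutter_matches_genlock(shutter_seconds, genlock_hz, tolerance=1e-6):
    """True if the shutter frequency is an integer multiple of the genlock base rate."""
    ratio = (1.0 / shutter_seconds) / genlock_hz
    return round(ratio) >= 1 and abs(ratio - round(ratio)) < tolerance

# On a 24 Hz genlock base, a 1/48 s shutter (48 Hz) is a clean 2x multiple,
# while a 1/60 s shutter (60 Hz) is not and may show static banding.
print(shutter_matches_genlock(1 / 48, 24))  # True
print(shutter_matches_genlock(1 / 60, 24))  # False
```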
If the issue is seen by your eyes, make sure that enough power is being provided to the LED panels, that the daisy chain between the tiles isn’t overloading the available bandwidth, and that the LED walls are not too low in gain.
Screen Reflections
Some LED screens have matte surfaces, while others are shiny. The latter can cause your actors, props and lighting to reflect on the screen and possibly make the blacks look milky.
If you run a static black patch on your screen, it is easier to spot where objects are reflected. Although this is best done with a camera at any of the known camera positions, it can also be done with your eyes.
Hide objects and lights with black flags, increase the brightness of the shooting LED screen to hide the problem, and/or reduce the brightness of the fill LED walls (the ones outside the camera frustum) and/or the other real lights in the scene.
Sound Reflections
When shooting on a volume LED stage, the hard surface of the LED wall might bounce the sound around and create issues.
The sound department on-set can quickly determine whether there will be a problem, so make sure they can scout the stage before shooting.
To prevent this issue, open up the space around the actors, removing some of the LED walls to limit the sound bounce. If that’s not possible or practical, soft, sound-absorbing surfaces can be added on set to break up the sound reflections.
Glossary
Aliasing
In signal processing, specifically sound and image processing, aliasing is an artifact caused by inaccurate sampling operations that generate an output signal different from the original source. In image processing we refer to it as spatial aliasing; it typically causes two different signals (such as lines, edges or fine details) to become indistinguishable.
Bit Depth
In computer graphics, bit depth, or color depth, is the number of bits used to indicate the color precision of a single pixel (bit per pixel, BPP) or the number of bits used for each color component of a single pixel (bits per component, BPC). The most common bit depths are 8, 10, 12, 16 and 32 bits.
VP USE CASE
Within an LED infrastructure, bit depth can mean either bits per pixel or bits per component, depending on context. Video codecs and image processors will most likely use BPC, while video cards and GPUs will most likely use BPP. Don’t get confused! The LED screen, the image processor and the content player graphics card are the most important elements of your pipeline: all of them should accept and be able to produce your desired bit depth (i.e. ≥10 BPC). If even one of these three elements doesn’t support your desired bit depth, the video pipeline might produce banding artifacts when reviewing your content on the LED wall. Please also note that the receiving card of the LED wall should be at least 14-bit in order to avoid quantization errors and banding.
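The arithmetic behind the BPC/BPP distinction is simple enough to sketch:

```python
def gray_levels(bpc):
    """Distinct shades per channel at a given bits-per-component depth."""
    return 2 ** bpc

def bpp_from_bpc(bpc, channels=3):
    """Bits per pixel from bits per component (RGB = 3 channels)."""
    return bpc * channels

print(gray_levels(8), gray_levels(10))  # 256 1024
print(bpp_from_bpc(10))                 # 30 ("30-bit color" is 10 BPC RGB)
```

The jump from 8 to 10 BPC quadruples the number of shades per channel, which is why a ≥10 BPC pipeline is so much more resistant to banding.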
Bit Rate
In computing and digital systems, bit rate is the number of bits transferred, processed or conveyed per unit of time, normally defined in seconds (bits per second, bps or bit/s). The unit changes according to the amount of bits processed; often "kilo" (1 kbit/s = 1,000 bit/s), "mega" (1 Mbit/s = 1,000 kbit/s), "giga" (1 Gbit/s = 1,000 Mbit/s) or "tera" (1 Tbit/s = 1,000 Gbit/s) are used.
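For intuition about the magnitudes involved (uncompressed RGB, decimal SI units; the resolution and frame rate are just examples):

```python
def uncompressed_bitrate_gbps(width, height, bpc, fps, channels=3):
    """Raw (uncompressed) video bit rate in Gbit/s, using decimal SI units."""
    bits_per_frame = width * height * bpc * channels
    return bits_per_frame * fps / 1e9

# A UHD (3840x2160), 10-bit-per-component, 24 fps RGB stream:
rate = uncompressed_bitrate_gbps(3840, 2160, 10, 24)
print(round(rate, 2))  # 5.97
```

Nearly 6 Gbit/s for a single uncompressed UHD stream is why codecs, and their bit-rate settings, matter so much in the playback chain.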
Chroma Key
Chroma key is a visual effects technique designed to extract an element of an image for compositing multiple images or elements together. Normally, chroma keys are done using saturated primary colors which are not likely to be found in nature, such as green (green screens) or blue (blue screens).
Codec
A video codec is a digital system that compresses and decompresses digital video and audio. A codec is made up of an encoder and a decoder which, respectively, compress and decompress the data. Some codecs are software-only; others can be hardware accelerated, either by common CPU/GPU processors or via custom-designed hardware.
VP USE CASE
Not all codecs are created equal. They have different limitations and benefits, including bit depth, frame rate, resolution, GPU encoding/decoding, and availability of implementations. When choosing a codec, consider your content player, its OS and its hardware limitations. For more info, see sections on Conform Specifications and Content Playback.
Depth of Field
Depth of Field (DOF) is the distance between the farthest and the nearest point where the image sharpness is acceptable. The depth of field is based on many variables, such as lens construction, lens focal length, iris aperture and the circle of confusion.
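The standard hyperfocal approximation can be sketched as follows (thin-lens model; the 0.03 mm circle of confusion is a common full-frame convention, not a universal value):

```python
def depth_of_field_m(f_mm, n_stop, focus_m, coc_mm=0.03):
    """Near/far limits of acceptable sharpness (thin-lens, hyperfocal approximation)."""
    f = f_mm / 1000.0
    c = coc_mm / 1000.0
    s = focus_m
    h = f * f / (n_stop * c) + f  # hyperfocal distance
    near = h * s / (h + (s - f))
    far = h * s / (h - (s - f)) if h > (s - f) else float("inf")
    return near, far

# 50 mm lens at f/2.8 focused at 3 m: sharpness spans roughly 2.7 m to 3.3 m.
near, far = depth_of_field_m(50, 2.8, 3.0)
print(round(near, 2), round(far, 2))
```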
Genlock
Genlock is a technique where a reference signal is used as a source to synchronize devices together. The aim in video applications is to ensure the coincidence of signals in time at a combining or switching point or, in other words, to ensure that each and every frame of the synchronized devices will start at the exact same moment in time. Please note that genlock only defines when each frame should start, not how long each frame should last within each second (that is the frame rate, fixed among devices by a framelock). Framelock in addition to genlock provides a much deeper sync framework that allows you to completely synchronize your devices.
VP USE CASE
Genlock is not always a pipeline requirement (or may be needed only in certain parts of the pipeline) but it’s always a good idea, as it might reduce visual issues while shooting. If video flicker is apparent in your camera, you probably need to genlock the pipeline. It’s important that all the elements of the pipeline can accept a genlock feed. Some devices might not be able to accept genlock.
Heat Dissipation/Conduction
All electronic devices generate excess heat and therefore require thermal management to improve reliability and prevent premature failure of the hardware. Heat dissipation is a type of heat transfer which occurs when an object that is hotter than its surroundings transmits its heat to cooler objects and to the environment around it.
VP USE CASE
Within the LED wall there are a number of elements designed to perform thermal management and heat dissipation in order to keep the screen at an ideal working temperature. The frames of the LED tiles, normally made of aluminium or magnesium alloys, are designed to conduct and dissipate the excess heat carried over from the LED PCB. The way the LED tiles are manufactured can also influence heat dissipation. There are two techniques: “common cathode” and “common anode”. Common cathode is a more reliable, lower-heat (higher dissipation) solution which creates fewer hotspots in the image on the screen. This also produces greater uniformity and fewer color differences over time, resulting in a longer lifespan with higher quality. Common anode tends to produce more heat and dissipate it less efficiently, producing hotspots in the panels that are easily picked up on camera. Always check this with your LED vendor; sometimes this information is hard to find.
Image Processor
An image processor is a system that is designed to adapt, manipulate, transform or convert an image from a source to a destination format.
VP USE CASE
When dealing with LED walls, an image processor is the key hardware/software system that is designed to take the image data from the Content Player to the LED distribution pipeline and screen. It’s also known as an LED video processor, image converter or video controller. Its main operations are focused on these areas: image format conversion, resolution conversion, color space conversion, bit depth conversion, image scaling, and/or crop.
LED
A light-emitting diode (LED) is a semiconductor light source which emits light when current flows through it. It’s the core element of an LED tile.
Content Player
A content player is software that is able to play back multimedia audio/video streams. Content player software is able to read (decode) a number of different video/audio codecs but might have some optimized/preferred settings, depending on its architecture or the hardware on which it is installed.
VP USE CASE
In a Virtual Production environment a Content Player is a software or a hardware+software solution capable of playing back your content to the Image Processor and the LED wall. Normally a Content Player solution will also provide a content projection management tool that allows it to digitally rebuild your LED wall layout and place/project your content correctly within it. These systems often manage metadata streams, such as tracking/positioning data.
Metameric Failure
A metameric failure occurs when two colors that appear to match under certain lighting conditions and for a given observer no longer appear to match once either the light source or the observer changes.
VP USE CASE
In modern digital image processing and reproduction, metameric failures are unfortunately very common and are constantly under study and analysis. Sometimes there are solutions which can solve the failure, sometimes not: it all depends on what caused it and from whose perspective it needs to be fixed. For instance, when shooting against LED walls, we most probably incur observer metameric failures: we tend to judge the content by looking at the LED walls with our eyes, but the digital cameras we use to capture that wall have a different spectral sensitivity (the digital sensor and digital processing see light in a different way than our eyes), and therefore the content ends up looking different.
Metamerism
In colorimetry, metamerism is a phenomenon that occurs when two colors are perceived as the same under the same lighting condition, despite their different spectral power distribution (the amount and quality of energy, which is perceived as light, that an object reflects under a specific light source).
Multiplexing Drive
Multiplexing is a technique used to connect devices – typically LEDs (for displays) or buttons (for keyboards) – in a matrix of addressable rows and columns. Each LED can be turned on individually by closing the appropriate row and column switches. Multiplexing is used to contain the generation of excess heat from the LED panel or to reduce the overall required power consumption.
VP USE CASE
Production-chain and construction optimizations mean that LED manufacturers are often unable to keep all of the many pixels of an LED tile turned on at the same time. Instead, LEDs are grouped in rows and switched on and off multiple times per second. This multiplexing operation is done by a chipset that controls the cycles in time, and it is measured and indicated as a fraction, where 1/1 represents an LED tile capable of keeping all the LED chips on at the same time (direct drive).
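A quick consequence of the scan fraction (a sketch; the nits figure is illustrative): the lower the duty cycle, the harder each row must be driven while it is on.

```python
def required_peak_nits(target_avg_nits, scan):
    """Instantaneous brightness each row must reach under a 1/`scan` multiplex drive."""
    return target_avg_nits * scan

# A panel delivering 1500 nits average on a 1/8 scan must momentarily drive
# each row eight times harder while that row is on.
print(required_peak_nits(1500, 8))  # 12000
```

This is also why heavily multiplexed tiles are more prone to flicker and banding on fast camera moves: each row is actually dark most of the time.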
Pixel Pitch
Pixel pitch refers to the distance between adjacent pixels, in millimeters. The smaller the pixel pitch, the higher the pixel density on a physical surface, and the higher the resulting picture resolution.
VP USE CASE
Pixel pitch affects things like moiré and the optimal viewing distance between the camera and the LED wall. The smaller the pixel pitch, the higher the resolution per tile and the higher the resolution of the entire wall, which requires more image processing power.
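The pitch-to-resolution relationship is easy to sketch (wall dimensions illustrative):

```python
import math

def wall_resolution(width_m, height_m, pitch_mm):
    """Pixel resolution of an LED wall from its physical size and pixel pitch."""
    px_w = math.floor(width_m * 1000 / pitch_mm)
    px_h = math.floor(height_m * 1000 / pitch_mm)
    return px_w, px_h

# A 10 m x 5 m wall built from 2.6 mm pitch tiles:
print(wall_resolution(10, 5, 2.6))  # (3846, 1923)
```

Halving the pitch quadruples the pixel count, which is where the extra processing (and content-rendering) load comes from.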
Photosite
A photosite is a light collector placed on a digital video sensor. Its role is to capture the incoming photons so that the energy they generate can be translated into digital values and represented as digital pixels after the required processing. Sometimes photosites are referred to as sensor pixels.
Refresh Rate
A refresh rate indicates the number of times per second a display refreshes the displayed image. Refresh rate is independent from frame rate, which indicates how many images per second are played from an image source.
RGB Dithering
Dither is an intentionally applied form of noise deployed to reduce or eradicate quantization error, preventing image artifacts such as color banding or image aliasing.
Viewing Angle
A display viewing angle is the angle at which the image can be viewed with acceptable visual performance. The viewing angle is normally expressed as an angular range over which the displayed image quality is acceptable, such as 120°. Sometimes the horizontal and vertical axes have different values (e.g. 120° H, 140° V). Outside the optimal viewing angle, the image may seem garbled, poorly saturated, low in contrast, blurry, or too faint; the exact mode of "failure" depends on the display type in question.
VP USE CASE
As with any display system, most of the LED walls have a limited and well-defined optimal viewing angle, which assures that there won’t be an appreciable color shift of the content when looking at the LED wall from within that range. Several parts of the LED tile can affect the resulting optimal viewing angle, such as the type of LED used or the protection mask applied to the tile. Be mindful that this shift will change dynamically while moving the camera, so your content might look different as the camera moves, especially if the camera placement falls outside the optimal viewing angle.
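A rough way to check whether a camera position falls inside the quoted viewing angle (a 2D top-down sketch; real spec sheets may quote separate H/V angles, and the panel orientation here is simplified to a single normal direction):

```python
import math

def off_axis_angle_deg(cam_xy, panel_xy, panel_normal_xy):
    """Angle between the panel's facing direction and the direction to the camera."""
    to_cam = (cam_xy[0] - panel_xy[0], cam_xy[1] - panel_xy[1])
    dot = to_cam[0] * panel_normal_xy[0] + to_cam[1] * panel_normal_xy[1]
    mag = math.hypot(*to_cam) * math.hypot(*panel_normal_xy)
    return math.degrees(math.acos(dot / mag))

def within_viewing_angle(angle_deg, spec_full_angle_deg):
    """Spec sheets usually quote the full cone (e.g. 120°), so compare to half of it."""
    return angle_deg <= spec_full_angle_deg / 2

# Camera 4 m out and 3 m to the side of a panel facing along +Y:
angle = off_axis_angle_deg((3, 4), (0, 0), (0, 1))
print(round(angle, 1), within_viewing_angle(angle, 120))  # 36.9 True
```

Running this for planned camera positions during previs can flag shots likely to land outside the panel's optimal cone before anyone is on stage.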
Volume Stage
LED stages go by many names, types and configurations. When the LED wall configuration is designed in such a way that it creates an immersive surrounding experience around the real scene and the actors, we refer to it as an LED volume stage.