Standard Definition Video
While working with video, we have to concern ourselves with its brightness and darkness output, which depends on what is referred to as the black level. There are two types of black that we typically work with: absolute black (or computer black) and video black. Video black is lighter in color, but it lets us see much more detail in shadows and it helps to desaturate colors that might otherwise bloom out too much, an effect we saw far too frequently in the early days of music video.
This is because of the ways in which televisions handle video. Standard definition video must conform to the NTSC specification, which defines the range from black to white. The history of this range definition goes back to analog video (for the interested reader, a more detailed description can be found here). For standard definition digital video, the range is defined with 8 bits, and therefore has 256 possible values (ranging from 0 to 255).
The minimum brightness level defined in Standard Definition video is the 17th step, value 16. The maximum brightness level is the 236th step, value 235. This is why NTSC brightness is referred to as 16-235: it fits into a subset of the possible 8-bit range of brightness. We refer to this brightness range as the video's luminance.
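The 16-235 range can be expressed directly in code. A minimal sketch in Python (the constant and function names here are my own, not from any video library):

```python
# Legal NTSC luma range for 8-bit Standard Definition video.
VIDEO_BLACK = 16    # minimum legal luma value ("video black")
VIDEO_WHITE = 235   # maximum legal luma value

def is_legal_luma(y: int) -> bool:
    """Return True if an 8-bit luma sample falls within the 16-235 range."""
    return VIDEO_BLACK <= y <= VIDEO_WHITE

print(is_legal_luma(0))    # False: computer black sits below video black
print(is_legal_luma(128))  # True: mid-gray is safely inside the legal range
```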
So how does this relate to the real world? The answer is that different devices and environments have different video output characteristics, and if the brightness levels of a video do not correspond to the target output device, severe degradation and loss of quality can result.
In this first example, we have a video that was designed on a computer but is destined for DVD. A computer can take advantage of the full 256-step range of Standard Def video. The DVD, however, must output 16-235 video. If the video's levels range from 0-255 on the computer and are not adjusted for the DVD's range, we'll see clipping in both the high and the low end of the brightness range.
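The clipping described above can be illustrated with a short sketch (a simplified model, not how any particular DVD player is implemented). Every shadow value from 0 through 16 collapses to the same black, which is exactly the loss of detail we want to avoid:

```python
def clip_to_video_range(y: int) -> int:
    """Hard-clip an 8-bit luma sample to the 16-235 range, modeling what
    happens when unadjusted full-range video hits a 16-235 output device."""
    return max(16, min(235, y))

# All seventeen distinct shadow values collapse into a single black level:
print({clip_to_video_range(y) for y in range(0, 17)})  # {16}
```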
In order to faithfully reproduce the original video, we need to adjust the video’s scale to the NTSC range of 16-235. This is why it is critical to have accurate color bars at the beginning of each video we produce. Without color bars (or with color bars that do not match the video), the black level and chroma setup of a video becomes a guessing game. For more about chroma, please refer to the article here.
Using the color bars at the beginning of a video in conjunction with the video content itself, we can adjust the scale of the video output so that it fits correctly into the NTSC color space. We do this electronically, either while capturing a video source or before processing it for its final format. This final adjustment ensures that colors and black levels will appear accurately on the end user's playback system.
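Rather than clipping, the adjustment described above rescales the whole brightness range. A minimal sketch of that linear scaling in Python (the function name is my own, for illustration only):

```python
def full_to_video_range(y: int) -> int:
    """Linearly map an 8-bit full-range luma sample (0-255) into the
    legal NTSC range (16-235), preserving shadow and highlight detail."""
    return round(16 + y * (235 - 16) / 255)

print(full_to_video_range(0))    # 16: computer black becomes video black
print(full_to_video_range(255))  # 235: full white becomes video white
```

Because every input step still maps to a distinct region of the output scale, shadow detail is compressed rather than destroyed.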
High Definition Video
HD video is different from SD video in numerous ways: its data rate is much higher, its display format is (nearly) always in widescreen mode, and its display resolution can vary from 854 x 480 to 1280 x 720 to 1920 x 1080. It also uses a different luminance scale.
Because High Definition video doesn’t adhere to the older NTSC standard, it can take advantage of the full 8-bit range of luminance. This means that blacks can be blacker and whites can be brighter without taxing the playback device used to view the video. While color bars are still necessary, black and white levels tend to require minimal adjustment (and may not need adjusting at all).
What's more, HD video has the capacity to handle 10-bit samples, further increasing the number of steps between absolute black and absolute white. This pushes the scale from 0-255 up to a healthy 0-1023 range. It also means that the data rate of HD video can be upwards of 440 Mbit/s (or 880 Mbit/s for 4:4:4 video).
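Moving between the 8-bit and 10-bit scales is a common conversion. One simple approach, sketched below in Python (an illustrative technique, not any specific hardware's method), shifts the sample left two bits and replicates the top two bits into the bottom, so that 0 maps to 0 and 255 maps to 1023:

```python
def expand_8_to_10_bit(y8: int) -> int:
    """Scale an 8-bit sample (0-255) onto the 10-bit scale (0-1023)
    by bit replication: shift left 2, then copy the top 2 bits down."""
    return (y8 << 2) | (y8 >> 6)

print(expand_8_to_10_bit(0))    # 0: absolute black stays at the bottom
print(expand_8_to_10_bit(255))  # 1023: absolute white reaches the new top
```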