New HDR and WCG formats rely on a large variety of video processing chain parameters.
Even the ubiquitous color bars, in their HDR versions, differ significantly from their SDR counterparts.
In mixed SDR/HDR environments, video engineers have to deal with:
- Three types of YUV ⇔ RGB Color Coding Matrices:
SD BT.601, HD BT.709 and UHD BT.2020
(quite often the BT.709 matrix is used for UHD images that should be encoded with the BT.2020 matrix)
- Four sets of Color Primaries:
“Narrow Color Gamut” BT.601 and BT.709, “Medium Color Gamut” DCI-P3, and “Wide Color Gamut” BT.2020 (in practice, we see various combinations of Master Display Primaries and Encoding Primaries, e.g. DCI-P3 mastering followed by BT.2020 encoding)
- Three sets of Transfer Functions (OETF ⇔ EOTF):
SDR BT.709, HDR-PQ BT.2100, HDR-HLG BT.2100
(plus so-called optimized OOTFs, i.e. non-standard HDR camera and/or display transfer functions)
- Two YUV ⇔ RGB Levels Mapping Schemes:
Narrow (aka Broadcast) Range & Full (aka PC) Range
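As a quick illustration of the first and last bullets, here is a minimal Python sketch (the function names are mine, not from any standard API) that applies the per-standard luma coefficients and then quantizes the result to 10-bit codes in either narrow (broadcast) or full (PC) range:

```python
# Luma coefficients (Kr, Kb) per standard; Kg = 1 - Kr - Kb.
LUMA = {
    "BT.601":  (0.299,  0.114),
    "BT.709":  (0.2126, 0.0722),
    "BT.2020": (0.2627, 0.0593),
}

def rgb_to_yuv(r, g, b, standard="BT.709"):
    """Non-linear R'G'B' (0..1) to Y'CbCr (Y' in 0..1, Cb/Cr in -0.5..0.5)."""
    kr, kb = LUMA[standard]
    kg = 1.0 - kr - kb
    y = kr * r + kg * g + kb * b
    cb = (b - y) / (2.0 * (1.0 - kb))
    cr = (r - y) / (2.0 * (1.0 - kr))
    return y, cb, cr

def _clip10(v):
    """Clip to the legal 10-bit code range."""
    return max(0, min(1023, v))

def quantize_10bit(y, cb, cr, narrow=True):
    """Map Y'CbCr to 10-bit codes: narrow (broadcast) or full (PC) range."""
    if narrow:
        yq  = round(64 + 876 * y)      # Y': 64..940
        cbq = round(512 + 896 * cb)    # Cb/Cr: 64..960, centered on 512
        crq = round(512 + 896 * cr)
    else:
        yq  = round(1023 * y)          # Y': 0..1023
        cbq = round(512 + 1023 * cb)
        crq = round(512 + 1023 * cr)
    return _clip10(yq), _clip10(cbq), _clip10(crq)
```

For example, 100% white quantizes to (940, 512, 512) in narrow range but (1023, 512, 512) in full range; misinterpreting one as the other shifts every level in the picture.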
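The transfer-function differences are just as concrete. Below is a sketch of the BT.2100 PQ EOTF (SMPTE ST 2084), mapping a normalized PQ code value to absolute display luminance; the constants are those published in the standard, while the helper name is mine:

```python
# SMPTE ST 2084 / BT.2100 PQ EOTF constants.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(e):
    """Map a normalized PQ signal value (0..1) to display light in cd/m^2 (nits)."""
    p = e ** (1.0 / M2)
    y = (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)
    return 10000.0 * y
```

Unlike the relative SDR BT.709 curve, PQ code 1.0 means an absolute 10,000 nits, which is why SDR and PQ signals cannot be mixed without explicit mapping.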
Now, imagine the total number of combinations and permutations of all these parameters!
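Taking just the parameter counts listed above, a quick multiplication shows the scale of the problem (and every source-to-destination conversion pairs two such configurations):

```python
matrices = 3            # BT.601, BT.709, BT.2020
primaries = 4           # BT.601, BT.709, DCI-P3, BT.2020
transfer_functions = 3  # SDR BT.709, HDR-PQ BT.2100, HDR-HLG BT.2100
ranges = 2              # narrow (broadcast), full (PC)

print(matrices * primaries * transfer_functions * ranges)  # 72 nominal combinations
```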
Without the appropriate tools and automated systems, it is nearly impossible to provide efficient QA/QC for a video content workflow.
Add to the above-mentioned factors several Color Space Conversion Matrices combined with Color Gamut Mapping Functions performing 1D or 3D conversion of “linear light” RGB values between various dynamic range formats and different primaries. It would seem, then, that the life of a video engineer should become absolutely unbearable.
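To make this concrete, here is a sketch of one such color space conversion: the 3×3 matrix that maps linear-light BT.709 RGB into BT.2020 primaries (rounded coefficients as published in ITU-R BT.2087; the function name is mine). Note that it operates on linear light, i.e. after the EOTF and before re-encoding:

```python
# Linear-light RGB conversion from BT.709 primaries to BT.2020 primaries
# (rounded 3x3 matrix per ITU-R BT.2087; inputs/outputs are linear, not gamma-encoded).
M_709_TO_2020 = (
    (0.6274, 0.3293, 0.0433),
    (0.0691, 0.9195, 0.0114),
    (0.0164, 0.0880, 0.8956),
)

def convert_709_to_2020(r, g, b):
    """Apply the conversion matrix to one linear-light RGB triplet."""
    return tuple(row[0] * r + row[1] * g + row[2] * b for row in M_709_TO_2020)
```

Each row sums to 1.0, so reference white is preserved, while a saturated BT.709 red such as (1, 0, 0) lands well inside the BT.2020 gamut at roughly (0.63, 0.07, 0.02).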
However, in the daily practice of live event coverage and similarly challenging production situations, engineers have already found good solutions and even established de facto standards that allow them to work efficiently in such a multi-format environment.
For more details about the mentioned parameters see posts in the Video Science Fundamentals category.
For the HDR and WCG analysis tools and levels alignment procedures see posts in the VideoQ Analyzers and VideoQ Test Patterns categories.