The HDR technical specification is present in TVs, monitors, smartphones and cameras. You may not know exactly what the acronym means, but you have probably noticed that images look more “vivid” when this feature is available.
Understanding how this technology works and which variants of it exist is essential when choosing the best equipment, whether for displaying or for capturing HDR images.
HDR: What is it?
HDR is an acronym for High Dynamic Range. Dynamic range is the relationship between the highest and lowest value on a given scale; in the case of TVs and monitors, it is the difference between the brightest point and the darkest point the screen can produce.
For this reason, HDR-compatible equipment and software can reproduce colors more faithfully, thanks to the ability to distinguish a wider range of tones within the same color gamut and to display them at greater brightness.
HDR technology is an evolution of SDR (Standard Dynamic Range), which had a narrower gap between the lightest and darkest points within the possible color scale. To give you an idea, SDR equipment reaches a peak brightness of between 100 and 200 nits, while HDR displays can reach up to 2,000 nits.
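As a rough worked example, here is that brightness gap in code (a minimal sketch in Python; the nit figures are the ballpark values quoted above, not formal specification limits):

```python
# Ballpark peak-brightness comparison using the figures from the text
# (typical values, not formal specification limits).
sdr_peak_nits = 200     # upper end of common SDR panels
hdr_peak_nits = 2000    # high-end HDR panels

ratio = hdr_peak_nits / sdr_peak_nits
print(f"HDR peak brightness is up to {ratio:.0f}x that of SDR")  # 10x
```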
HDR is not always the same: understand the differences
By now you may have realized that the greater the dynamic range, the better the HDR quality. Although the underlying technology is the same, the implementation varies from device to device and from manufacturer to manufacturer. That is why an HDR TV from one brand may produce a different image from that of another brand with similar technology.
The first generation of HDR used 8 bits of color depth. In practice, each of the three basic RGB channels (red, green and blue) has 2⁸ possible values, a total of 256 tones per channel. Considering the three channels, we therefore have 256³, which makes a total of 16,777,216 possible colors.
In the following generations, the HDR10 and HDR10+ standards, each color channel supports up to 10 bits, therefore 2¹⁰ possible values, totaling 1,024 tones per channel. Considering the three channels, we have 1024³, which expands the possibilities to 1,073,741,824 colors.
There is also the Dolby Vision standard, developed by Dolby Laboratories, in which each channel supports 12 bits. Doing the same calculation we get 2¹² possible values, or 4,096 tones per color channel. The total of 4096³ results in an impressive 68,719,476,736 colors.
Putting these quantities in perspective, Dolby Vision reproduces up to 64 times more colors than HDR10, just as HDR10 reproduces 64 times more colors than first-generation HDR.
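For readers who want to verify the figures, here is the arithmetic behind these color counts as a short Python sketch:

```python
# Colors representable at a given bit depth per RGB channel.
def total_colors(bits_per_channel: int) -> int:
    values_per_channel = 2 ** bits_per_channel   # e.g. 2**8 = 256
    return values_per_channel ** 3               # three channels: R, G, B

for label, bits in [("8-bit (first-generation HDR)", 8),
                    ("10-bit (HDR10 / HDR10+)", 10),
                    ("12-bit (Dolby Vision)", 12)]:
    print(f"{label}: {total_colors(bits):,} colors")

# Adding 2 bits per channel multiplies the total by (2**2)**3 = 64,
# which is the 64x factor mentioned above.
assert total_colors(10) == 64 * total_colors(8)
assert total_colors(12) == 64 * total_colors(10)
```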
TVs and monitors usually have brighter panels and can therefore sustain higher peak brightness. While an HDR TV can reach 2,000 nits, the screen of a top-of-the-line smartphone reaches around 700 nits.
HDR in photography
The logic above applies to displaying images, but the same principle is used when capturing photos. It works like this: when you take a photo with HDR mode enabled, the software captures an overexposed image and an underexposed image in addition to the conventionally exposed one.
When combined, these three images offer an above-average level of detail, because the result is composed of the “best possible exposure” at each point of the image. That is why HDR cameras need more powerful processors, capable of handling these multiple image layers in real time.
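A minimal sketch of the merge step, assuming three already-aligned exposures stored as floating-point arrays (the function and weighting scheme here are illustrative; real camera pipelines add alignment, ghost removal and tone mapping):

```python
import numpy as np

def merge_exposures(under: np.ndarray,
                    normal: np.ndarray,
                    over: np.ndarray) -> np.ndarray:
    """Naive exposure fusion of three aligned shots.

    Each input is an HxWx3 float array with values in [0, 1].
    Pixels close to mid-gray (0.5) are treated as well exposed and
    get higher weight, so the merged image keeps the "best possible
    exposure" at each point.
    """
    stack = np.stack([under, normal, over])              # (3, H, W, 3)
    # Gaussian "well-exposedness" weight centered on mid-gray.
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * 0.2 ** 2))
    weights /= weights.sum(axis=0)                       # normalize per pixel
    return (weights * stack).sum(axis=0)                 # weighted blend
```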