Just as I was lying down, hoping to fall asleep after a massive 18-hour work day, I read a story over at TechPowerUp, sourced from German tech site Heise.de, claiming that AMD Radeon graphics cards were limited in their HDR abilities... well, clickbait can be bad sometimes, and we now know the truth.
The original story can be read here; it claimed that Radeon graphics cards were reducing the color depth to 8 bits per channel (16.7 million colors, or 32-bit) if the display was connected over HDMI 2.0 rather than DisplayPort 1.2 - something that piqued my interest.
10 bits per channel (1.07 billion colors) is the far more desirable target for HDR TVs, but the original article made it seem like this was a limitation of AMD, and not an inherent limitation of HDMI 2.0 itself. Heise.de said that AMD GPUs reduce output sampling from the "desired full YCbCr 4:4:4 color scanning to 4:2:2 or 4:2:0 (color sub-sampling / chroma sub-sampling) when the display is connected over HDMI 2.0. The publication also suspects that the limitation is prevalent on all AMD 'Polaris' GPUs, including the ones that drive game consoles such as the PS4 Pro," reports TPU.
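For readers curious where those figures come from, here is a quick back-of-the-envelope sketch (not from either article - just standard definitions) of the color-depth arithmetic and of how much chroma data the sub-sampling modes actually carry:

```python
# Color count for an RGB signal: 2^bits per channel, cubed (three channels).
def colors(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

print(f"8-bit:  {colors(8):,} colors")    # 16,777,216 (~16.7 million)
print(f"10-bit: {colors(10):,} colors")   # 1,073,741,824 (~1.07 billion)

# Chroma sub-sampling trades color resolution for bandwidth. In a 4x2 pixel
# block there is always one luma sample per pixel (8 total); 4:4:4 keeps a
# Cb and Cr sample per pixel, 4:2:2 halves chroma horizontally, and 4:2:0
# halves it both horizontally and vertically.
def samples_per_4x2_block(scheme: str) -> int:
    luma = 8
    chroma = {"4:4:4": 16, "4:2:2": 8, "4:2:0": 4}[scheme]
    return luma + chroma

for scheme in ("4:4:4", "4:2:2", "4:2:0"):
    total = samples_per_4x2_block(scheme)
    ratio = total / samples_per_4x2_block("4:4:4")
    print(f"{scheme}: {total} samples per 4x2 block ({ratio:.0%} of 4:4:4)")
```

In other words, dropping to 4:2:0 carries only half the raw sample data of 4:4:4 - which is exactly the kind of trade-off a bandwidth-constrained link like HDMI 2.0 forces.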