ASUS ROG Swift PG35VQ Monitor review



4:2:2 Color compression - Nits Explained

Chroma subsampling - Color compression

In 2018 it became apparent that in the highest refresh-rate modes, like 144 Hz, 180 Hz and 200 Hz, high-resolution monitors apply color compression due to bandwidth limitations. It is not the fault of ASUS, AOC or NVIDIA; the root cause is signal bandwidth over DisplayPort 1.4. DisplayPort 1.4 simply does not have enough bandwidth available to drive a high resolution at, say, 144 Hz with HDR on top, so some form of compression is needed, usually solved by applying 4:2:2 chroma subsampling. Chroma subsampling, very bluntly put, is color compression. While information like brightness remains intact, the color information is stored at half the resolution. All is good up to 144 Hz / 10-bit; beyond that the DisplayPort connection runs into bandwidth issues and 4:2:2 chroma subsampling kicks in.
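To make "color at half the resolution" a bit more tangible, here is a minimal sketch of what 4:2:2 subsampling does to a single scanline: luma is kept per pixel, while each horizontal pair of chroma samples is averaged into one. The NumPy code and values below are purely illustrative and are not the monitor scaler's actual processing:

```python
# Minimal illustration of 4:2:2 chroma subsampling on one scanline.
# Luma (Y) keeps full horizontal resolution; the two chroma channels
# (Cb, Cr) are stored at half horizontal resolution and duplicated on
# reconstruction. Sketch only, not the monitor's scaler logic.
import numpy as np

def subsample_422(y, cb, cr):
    """Average each horizontal pair of chroma samples (Y stays intact)."""
    cb_half = cb.reshape(-1, 2).mean(axis=1)
    cr_half = cr.reshape(-1, 2).mean(axis=1)
    return y, cb_half, cr_half

def reconstruct_422(y, cb_half, cr_half):
    """Duplicate each chroma sample to restore the full width."""
    return y, np.repeat(cb_half, 2), np.repeat(cr_half, 2)

# A scanline with a sharp color edge that does NOT fall on a pixel pair:
y  = np.array([0.5] * 8)                                 # brightness, untouched
cb = np.array([0.0, 0.0, 0.0, 0.0, 0.0, 0.6, 0.6, 0.6])  # chroma with an edge
cr = cb.copy()

_, cb_out, _ = reconstruct_422(*subsample_422(y, cb, cr))
print(cb_out)   # [0.  0.  0.  0.  0.3 0.3 0.6 0.6] -> the edge smears across
                # two pixels, exactly the fringing seen on thin colored text
```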

Here is the reality of it all: the PG35VQ applies YUV 4:2:2 if you want to run refresh rates higher than 144 Hz in combination with 10-bit color. 10-bit color is only used for HDR images; SDR content, as such, is shown as standard with 8-bit color, and that type of signal fits over the cable without compression at 200 Hz. So for this monitor, YUV 4:2:2 kicks in once you fire up an HDR game, and in games you will not notice any effects.
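To put some rough numbers on why the 8-bit signal still fits while the 10-bit one does not, here is a back-of-the-envelope bandwidth estimate. It assumes roughly 5% blanking overhead and the ~25.9 Gbit/s of usable payload that DisplayPort 1.4 (HBR3, after 8b/10b encoding) offers; the PG35VQ's exact timings may differ slightly, so treat the outcome as an approximation:

```python
# Back-of-the-envelope DisplayPort bandwidth check for 3440x1440.
# Assumes ~5% blanking overhead (reduced-blanking style) and the
# ~25.92 Gbit/s payload of DP 1.4 HBR3 (32.4 Gbit/s raw minus 8b/10b
# encoding overhead). Rough sketch; real panel timings differ slightly.

HBR3_PAYLOAD_GBPS = 25.92
BLANKING_OVERHEAD = 1.05

def required_gbps(width, height, refresh_hz, bits_per_pixel):
    pixels_per_second = width * height * refresh_hz * BLANKING_OVERHEAD
    return pixels_per_second * bits_per_pixel / 1e9

modes = {
    "144 Hz, 10-bit RGB (30 bpp)":       (144, 30),
    "200 Hz,  8-bit RGB (24 bpp)":       (200, 24),
    "200 Hz, 10-bit RGB (30 bpp)":       (200, 30),
    "200 Hz, 10-bit YUV 4:2:2 (20 bpp)": (200, 20),
}

for name, (hz, bpp) in modes.items():
    need = required_gbps(3440, 1440, hz, bpp)
    fits = "fits" if need <= HBR3_PAYLOAD_GBPS else "does NOT fit"
    print(f"{name}: ~{need:.1f} Gbit/s -> {fits}")
```

Run that and the 200 Hz / 10-bit RGB mode lands around 31 Gbit/s, which is the only combination that overshoots the link, while 200 Hz at 8-bit and 200 Hz at 10-bit with 4:2:2 both squeeze underneath it.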




Is there a solution? - Display Stream Compression (DSC)

A proper solution exists, so we are staggered to see it is not implemented on this extremely expensive monitor. DisplayPort 1.4 offers a way to transport all the data it needs within the available bandwidth via Display Stream Compression (DSC), a virtually lossless compression scheme that reduces the required bandwidth considerably. Unfortunately, the scaler of the PG35VQ as tested today does not support DSC, which means the visible chroma subsampling described above has to be used instead.
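To put a rough number on what DSC would buy: it targets visually lossless compression at ratios of up to roughly 3:1, so the approximately 31 Gbit/s that a 3440x1440, 200 Hz, 10-bit RGB signal needs (see the estimate earlier) would drop to somewhere around 10 to 12 Gbit/s, comfortably within the ~26 Gbit/s that DisplayPort 1.4 can actually carry.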


Is 4:2:2 chroma subsampling a bad thing?

That I can only answer with both yes and no. This form of color compression has been used for a long time already; in fact, if you have ever watched a Blu-ray movie, it was color compressed. Your eyes will be hard-pressed to see the effect of color compression in movies and games. However, with very fine thin lines and fonts, you will notice it. For gaming, you'll probably never ever notice a difference (unless you know exactly what to look for in very specific scenes). On the Windows desktop, however, it is a different story, and let me show you an example of that. Have a peek at a zoomed-in Windows desktop...


Example image taken from a previous review


Above, you can see a good example of the problem at hand. Compare the two photos, left and right, and focus on the icon text. On the right, the N in Heaven and the letters N, H and M in Benchmark are discolored; that is the effect of 4:2:2 chroma subsampling. The photo is blown up, but you can see this rather clearly with the naked eye as well.

Now, ONLY if you have 10-bit HDR active in Windows desktop mode will you notice this effect on the Windows desktop with thin fonts. In gaming, you will not notice the effect, at least I could not see it. Typically, your desktop will be set up at 8-bit, and games then switch to 10-bit.

HDR10

Better pixels, a wider color space and more contrast will result in more interesting content on that screen of yours. FreeSync 2 and the new G-Sync HDR monitors have HDR10 support built in; it is a requirement for any display panel carrying the label, offering full 10 bpc support. High Dynamic Range reproduces a greater dynamic range of luminosity than is possible with standard digital imaging.

We measure this in nits, and the number of nits for UHD screens and monitors is going up. Candle brightness measured from one meter away is about 1 nit (one candela per square meter); the sun is roughly 1,600,000,000 nits. Typical objects and current PC displays sit in the 1 to 250 nits range, and excellent non-HDR HDTVs offer 350 to 400 nits. An HDR OLED screen is capable of 500 to maybe 700 nits for the best models, and, here is where it gets more important, HDR-enabled screens are moving towards 1000 nits with the latest LCD technologies. HDR allows those high nit values to actually be used. HDR started to be implemented in PC gaming back in 2016; Hollywood, of course, already has end-to-end content ready. As consumers start to demand higher-quality monitors, HDR technology is emerging to set an excitingly high bar for overall display quality.

Good HDR-capable panels are characterized by:

  • Brightness between 600 and 1200 cd/m² of luminance; the industry goal is to reach 1000 to 2000
  • Contrast behavior that closely mirrors human visual sensitivity to contrast (the SMPTE 2084 'PQ' transfer function, sketched below)
  • DCI-P3 and/or Rec.2020 color gamut coverage, producing over 1 billion colors at 10 bits per channel
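A side note on that SMPTE 2084 bullet: it defines the 'PQ' transfer function HDR10 uses to map a 10-bit code value onto absolute luminance from 0 up to 10,000 nits, with a curve shaped after human contrast sensitivity. A minimal sketch of that curve, using the constants from the published standard, looks like this (illustrative only, not a calibration tool):

```python
# Sketch of the SMPTE ST 2084 "PQ" EOTF used by HDR10: it maps a 10-bit
# code value to absolute luminance (0..10,000 nits). Constants are the
# standard published ones; this is an illustration, not display software.
m1 = 2610 / 16384          # ~0.1593
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_to_nits(code, bit_depth=10):
    """Decode a PQ-encoded code value to luminance in cd/m2 (nits)."""
    e = code / (2 ** bit_depth - 1)          # normalise to 0..1
    p = e ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

for code in (0, 256, 512, 769, 1023):
    print(f"code {code:4d} -> {pq_to_nits(code):8.2f} nits")
# The top code value decodes to 10,000 nits, and 10 bits per channel
# also means 1024**3, i.e. roughly 1.07 billion, addressable colors.
```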

HDR video and gaming bring more vibrant colors, more detail thanks to the increased contrast, and a wider luminosity range thanks to the higher brightness. You will obviously need a monitor that supports it as well as a game title that does. HDR10 is an open standard supported by a wide variety of companies, which includes TV manufacturers such as LG, Samsung, Sharp, Sony, and Vizio, as well as Microsoft and Sony Interactive Entertainment, which support HDR10 on the Xbox One (S and X models only) and PlayStation 4 consoles. Dolby Vision is a competing HDR format that can optionally be supported on Ultra HD Blu-ray discs and by streaming services. Dolby Vision, as a technology, allows for a color depth of up to 12 bits, up to 10,000 nits of brightness, and can reproduce color spaces up to ITU-R Rec. 2020 with the SMPTE ST-2084 transfer function. Manufacturers of Ultra HD (UHD) TVs that support Dolby Vision include LG, TCL, and Vizio, although their displays are only capable of 10-bit color and 800 to 1000 nits of luminance. The maximum range of colors reproducible by a monitor is generally expressed as the percent coverage of a defined standard like sRGB, Adobe RGB, DCI-P3, or BT.2020. These each specify a 'color space', a portion of the visible spectrum, that delivers a consistent viewing experience between different imaging devices like monitors, televisions, and cameras. HDR10 requires displays to cover at least 90% of the DCI-P3 color gamut (which is itself a subset of the currently unachievable BT.2020 color gamut).
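Those coverage percentages ultimately come from comparing the triangle that a gamut's red, green and blue primaries span on a chromaticity diagram. Below is a minimal sketch that compares the CIE 1931 xy triangle areas of sRGB, DCI-P3 and BT.2020; officially quoted coverage figures are usually computed in CIE 1976 u'v' against the true intersection, so treat these numbers as ballpark illustrations only:

```python
# Rough illustration of where gamut "coverage" figures come from: each
# color space is a triangle of R/G/B primaries on the CIE 1931 xy
# chromaticity diagram, and coverage is (roughly) a ratio of areas.
# Published figures normally use CIE 1976 u'v'; this is ballpark only.

PRIMARIES = {                     # (x, y) of the R, G, B primaries
    "sRGB":    [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":  [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "BT.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(pts):
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

areas = {name: triangle_area(p) for name, p in PRIMARIES.items()}
for name, area in areas.items():
    print(f"{name}: relative size vs BT.2020 ~ {area / areas['BT.2020']:.0%}")
# DCI-P3 works out to roughly 70-odd percent of BT.2020's area, which is
# why full BT.2020 is still out of reach while 90%+ DCI-P3 is feasible.
```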




Another way to interpret color gamuts is their coverage of Pointer’s gamut, a collection of all diffuse colors found in nature. Expanding a color space to include or extend beyond Pointer’s gamut allows for richer and more natural imaging, as human vision is capable of interpreting many artificial colors beyond Pointer’s gamut that are commonly found in manufactured goods like automobile paints, food dyes, fashionable clothing, and Coca-Cola’s signature red. Think big, and think a lot of bandwidth: monitor resolutions keep expanding, to the point that the first 8K monitors needed multiple HDMI and/or DisplayPort connections to get a functional picture. Alright, we've got the basics covered, let's move onwards into the review.
