
How to Choose an HDR Gaming Monitor

High Dynamic Range refers to scenes rendered with brighter highlights, greater shadow detail and a wider range of color for a better-looking image. For gaming HDR, in contrast to TV HDR, it means more than a prettier picture: The better you can see what's lurking in the bright and dark areas, the more likely you are to avoid hidden enemies and spot clues. But keep in mind that most games are still designed for the middle common denominator: Everything you need to see is sufficiently visible in the middle of the brightness range.

Read more: How to Buy a Gaming Monitor

Games still require explicit HDR support for optimal results, but the introduction of Auto HDR in the Xbox Series X/S and in Windows 11 changes that: The operating systems can automatically expand the brightness and color ranges of non-HDR games. It's not the same as having a game that was rendered to use the expanded ranges, but it can give those games a bump to look better than they otherwise would.

What is HDR and why do I want it?

To deliver its magic, HDR combines several elements. First, it uses an extended brightness range, well beyond the 256 levels displayable by a typical monitor and, in the best cases, beyond the true 1,024 levels of a great monitor. Second, it covers more colors than the least-common-denominator sRGB gamut. It also requires profiles to optimally map the color and brightness ranges of content to the capabilities of the display, a decoder in the monitor that understands the mapping, and all the related technologies that tie the pieces together, not the least of which is the operating system.

For a lot of games, HDR doesn't matter: They don't have lots of areas with high brightness or deep shadows, or they don't take advantage of the bigger tonal range in any meaningful way. But for games that support it, you'll probably get better visuals in AAA titles, more creeps from horror games, fewer ambushes out of the shadows in FPS games and so on.

The real question isn't whether you want it. The question is how much you're willing to pay for it: not just for a display with "HDR10" in its specs, but for a monitor that will deliver the image quality we associate with HDR.

Will an HDR gaming monitor work with the Xbox Series X/S and PS5?

Yup! There's even a publicly available set of best practices for HDR game development and monitor design, developed by Sony, Microsoft and a host of other relevant companies under the umbrella of the HDR Gaming Interest Group, for their consoles and Windows. But HGIG isn't a standards body, nor does it certify products, so you still need to pay close attention to specs. And it gets more confusing still.

‘HDMI 2.1’ caveats

Unfortunately, the HDMI specification has turned into such a mess that you can't make assumptions about capabilities based on the version number. Not only is every HDMI 2.0 connection henceforward to be labeled 2.1a (with the same HDMI 2.0 feature set), but the specification no longer mandates any of the important new features; in other words, all the whizzy capabilities that made HDMI 2.1 desirable, especially as a choice for consoles, are now optional.

Bottom line: If you want a monitor for your console that can do 4K at 120Hz and support variable refresh rate and auto low-latency mode, you'll have to verify support for each feature individually. And the same goes if you want a PC monitor connected via HDMI that can support source-based tone mapping (discussed below) and bandwidth-intensive combinations of high resolution, fast refresh rates and high color depth/HDR.

Monitor manufacturers are supposed to list supported features explicitly; if they don’t, either pass the monitor by or delve deeper. If you want the gory details, TFT Central does an excellent job explaining the issues.

What do I look for in an HDR gaming monitor?

The term "HDR" has become pretty diluted thanks to marketers stretching the definition to encompass displays in the most popular price range (less than $400). So to a certain point, you have to pay attention to multiple specs to figure out whether a given monitor is capable of a real HDR experience.

The VESA display industry association created a set of standards and criteria for conveying HDR quality levels in consumer monitors, DisplayHDR, which is pretty reliable as one method of crossing choices off your list. (DisplayHDR 400's color gamut and brightness requirements make it the kiddie pool of HDR, but if you're just looking for a bright SDR monitor, it's a good bet.)

Read more: VESA Updates DisplayHDR Logo Spec to Accommodate Laptop, OLED Screens

Many manufacturers have taken to referring to monitors as, for example, “HDR 600,” which confuses things. It’s never clear whether they’re simply using it as shorthand for the equivalent DisplayHDR level and don’t want to pay for the logo certification program, or whether they’re using it as misleading shorthand for the ability to hit the peak brightness level of a particular tier. It’s possible for them to run through the certification tests themselves for internal verification without opting for the logo. (You can, too, with the DisplayHDR Test utility available in the Microsoft Store.)


That's why it's worth understanding which HDR-related specs matter and which don't.

HDR10 and HDR10 Plus Gaming

From a spec standpoint, HDR10 support means little to nothing, because it only means the monitor understands the data stream and can render it somehow, not that it's actually capable of displaying it well. Adherence to the HDR10 standard is the most basic level a monitor has to hit (and the cheapest to include) in order to call itself "HDR." It simply means the monitor can support the algorithms needed by an operating system to map HDR content correctly to the capabilities of the monitor: brightness mapping and the ability to handle the 10-bit calculations that mapping needs (for the EOTF and SMPTE ST 2084 gamma), understanding how to work with the compressed color sampling in video, and the capability of handling and mapping colors notated within the Rec 2020 color space.
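If you're curious what that brightness mapping involves, here's a minimal sketch in Python of the SMPTE ST 2084 (PQ) EOTF that HDR10 relies on. The constants come from the published standard, but the function itself is my own illustration, not any particular library's API.

```python
# Minimal sketch of the SMPTE ST 2084 (PQ) EOTF that HDR10 relies on.
# Constants come from the ST 2084 standard; the function is an
# illustration, not taken from any particular library.

M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_to_nits(code: int, bits: int = 10) -> float:
    """Convert a PQ-encoded code value (0-1023 for 10-bit) to nits."""
    e = code / (2 ** bits - 1)              # normalized signal, 0..1
    ep = e ** (1 / M2)
    y = (max(ep - C1, 0.0) / (C2 - C3 * ep)) ** (1 / M1)
    return 10000.0 * y                      # PQ tops out at 10,000 nits

# A middling code value lands in SDR territory; the top code is 10,000
# nits, far beyond any current monitor -- hence the need for tone mapping.
print(round(pq_to_nits(520)))   # ~100 nits
print(round(pq_to_nits(1023)))  # 10000 nits
```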

At CES 2022, the organization behind the HDR10 standard announced a new level, the forthcoming HDR10 Plus Gaming standard, a variation of the HDR10 Plus that's been available on TVs for a while. It adds Source Side Tone Mapping (SSTM), which adjusts the brightness range on a scene level based on data embedded by the game developer; plain HDR10 has a single range that has to work for the whole game. It also includes the ability to automatically trigger a display's low-latency mode, to compensate for the additional overhead imposed by the HDR data (more important for TVs than monitors), as well as support for variable refresh rates in 4K at 120Hz on consoles (still not implemented in the PS5 as of this writing).
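You can see the gist of that in a few lines of Python. This is a toy illustration of why per-scene metadata helps, not the actual HDR10 Plus Gaming algorithm; the naive linear mapper and the 600-nit display peak are my assumptions.

```python
# Toy illustration of why per-scene metadata (source-side tone mapping)
# beats one static range -- NOT the actual HDR10 Plus Gaming algorithm.

DISPLAY_PEAK = 600.0  # nits this hypothetical monitor can actually hit

def tone_map(pixel_nits: float, content_peak: float) -> float:
    """Naive linear tone mapper: squeeze [0, content_peak] into [0, DISPLAY_PEAK]."""
    return pixel_nits * DISPLAY_PEAK / content_peak

# A 180-nit torch in a dark dungeon scene (scene peak: 200 nits):
print(tone_map(180, content_peak=4000))  # static game-wide peak: 27 nits, crushed
print(tone_map(180, content_peak=200))   # per-scene peak via SSTM: 540 nits
```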

HDR10 Plus requires certification and a paid license for hardware manufacturers (including GPU makers), because the license also pays for usage rights to selected patents of the member manufacturers; software developers don't pay. Samsung announced at CES that all of its 2022 gaming monitors will support HDR10 Plus.

Color and brightness

Brightness is a measure of how much light the screen can emit, usually expressed in nits (candelas per square meter). Most desktop monitors typically run 250 to 350 nits in SDR (standard dynamic range), but HDR monitors also specify a peak brightness that they can hit for short periods in HDR mode, and usually for just a portion of the screen. Displays that support HDR should start at 400 nits peak, at the very least, and currently run as high as 1,600. (Laptop screens are judged differently, because they need to be viewable in different types of lighting, such as direct sunlight, and therefore benefit from higher brightness levels even without HDR support.)

OLED screens tend to be assessed differently because they achieve virtually zero-brightness black levels, which is what makes them so high contrast regardless of how bright they can get; contrast is one of the biggest determinants of how we perceive the quality of an image.

For gaming and monitors in general, the color space you're most interested in is P3, which comes in two slightly different flavors: DCI-P3 and D65 P3. In practice, they differ only by their white points; DCI is a hair warmer (6300K instead of 6500K) and was conceived for editing film. However, I frequently see DCI-P3 listed in specs where the manufacturer really means D65. That's fine, because D65 P3, which was spearheaded by Apple for its own displays, is the one we care about for gaming monitors. And their gamuts are identical, so unless I'm specifically differentiating between the two, I refer to it simply as P3. (If you've got educated eyes you can tell the difference between the two whites, but it's immaterial for most people.)

You'll also commonly see gamuts listed as a percentage of Adobe RGB, which is fine as well. Adobe RGB and P3 overlap significantly; Adobe RGB is shifted a bit toward the green/cyan end of the spectrum, because printers use cyan ink, while P3 extends further out on the greens/yellows, which are easier for good monitors to produce. And that, in a nutshell, is why the "over a billion colors" claim in specs (the number produced by using 10-bit math) is meaningless on its own. Which billion matters.
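The arithmetic behind that billion-color claim is trivial, which is part of the problem; a quick Python check shows where the number comes from:

```python
# Where "over a billion colors" comes from: 10 bits per channel.
levels_per_channel = 2 ** 10            # 1,024 shades each of R, G and B
total_colors = levels_per_channel ** 3  # every R/G/B combination
print(f"{total_colors:,}")              # 1,073,741,824 -- "over a billion"

# An 8-bit panel, for comparison:
print(f"{(2 ** 8) ** 3:,}")             # 16,777,216 -- the usual "16.7 million"
```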


The smallest triangle is the color gamut of any decent monitor and is roughly equivalent to sRGB (it’s actually Rec 709). The next largest is P3 color and the largest is Rec 2020.

Geoffrey Morrison/CNET (triangles); Sakurambo (base chart)

Any monitor you consider for decent HDR viewing should definitely cover much more than 100% sRGB, a space developed by HP and Microsoft in 1996 to provide least-common-denominator color matching in Windows, and roughly equivalent to the color space of the Rec 709 SDR video standard. If you look at the chart above, you can see immediately why the greens of sRGB-calibrated monitors and images are awful and everything looks relatively low contrast (because sRGB can't attain the high saturation values of most hues).

Based on my experience, I think a decent HDR monitor should be able to hit a peak brightness of between 600 and 1,000 nits and cover at least 95% of the P3 or Adobe RGB color gamut. (When Windows looks awful in HDR mode, it's the result of low brightness capability, an sRGB-only color gamut, and poorly designed aspects of the operating system and its math.)

Backlight type

All screen technologies except OLED, which has self-illuminating pixels, shine a light through various layers of color filters and liquid crystal to produce an image. Panels with backlights may display some artifacts, notably the appearance of light around the edges of a dark screen, usually referred to as backlight bleed (although it's technically an artifact of edge lighting).

A newer backlight technology that's great for HDR, mini LED, lets a monitor use local dimming like a TV to produce high brightness with less bleed and fewer bright halos where bright objects sit next to dark areas; the brighter the display, the more noticeable unwanted brightness tends to be. Mini LED is used by the latest crop of HDR displays with brightness of 1,000 nits or more. And as with TVs, more local-dimming zones are better.

But all those LEDs glowing brightly can generate a lot of heat. One trend has been to dial back the number of zones from when monitors with mini LED arrays first shipped. Monitors announced in 2022, for example, have half the zones of the initial 1,152-zone models.

Samsung QD-OLED screens combine Quantum Dot color-rendering technology with a blue OLED backlight; that lets them produce high contrast and fast response times, using the Quantum Dot layer to render a broad array of colors. The first monitor to ship with the screen is the Alienware 34 QD-OLED. The AW34 straddles the brightness line by supplying a standard 400-nit HDR mode (which is better than it sounds because of the contrast provided by the essentially perfect blacks) as well as a more limited 1,000-nit mode. It radiates some heat, but doesn't get nearly as hot as traditional 1,000-plus nit monitors.

As brightness rises, so does price, which is why 400-nit displays are so appealing to both buyers and sellers. Tossing in gaming needs like a high refresh rate can boost the price even more.

Source: CNET
