The sad, misleading and embarrassing state of HDR in PC games

HDR is laughably bad for PC gaming in most cases, and we all know it’s true.

That might surprise you, though, if you only go by how gaming monitors are advertised. After all, on paper, HDR is technically supported by your monitor, your games, and your graphics card. Heck, even Windows supports HDR relatively error-free these days.

So, who is to blame then? Well, when I dug deep for the answer, I found three main culprits that explain our current predicament. And even with some light at the end of the tunnel, this multifaceted problem isn’t about to go away on its own.

The game problem

HDR comparison in Tiny Tina's Wonderlands.
Jacob Roach / Digital Trends

I need to start by laying out why HDR is such a problem for PC games. It’s a highly variable experience depending on what kind of display you have and the game you’re playing, which makes the whole HDR mess even more confusing on PC. The main reason: static metadata.

There are three main HDR standards: HDR10, HDR10+, and Dolby Vision. The latter two support dynamic metadata, which basically means they can feed the display information tuned to the capabilities of the screen and the scene you’re in (even the frame currently on screen). HDR10, on the other hand, only carries static metadata.
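To see why that matters, here is a deliberately simplified sketch of tone mapping. The function and every number below are illustrative assumptions, not the actual math of any HDR standard:

```python
# Deliberately simplified: static metadata describes the whole stream
# once, so every scene is tone-mapped with the same assumptions;
# dynamic metadata updates the peak brightness per scene.

def tone_map(pixel_nits, scene_peak_nits, display_peak_nits):
    """Naive tone mapping: scale scene brightness into the display's range."""
    scale = min(1.0, display_peak_nits / scene_peak_nits)
    return pixel_nits * scale

scenes = [  # (description, actual peak brightness of the scene in nits)
    ("dark cave", 120),
    ("sunlit field", 4000),
]

display_peak = 600  # e.g., a mid-range HDR monitor

# HDR10-style static metadata: one mastering peak for the entire game.
static_peak = 4000
static = [tone_map(nits, static_peak, display_peak) for _, nits in scenes]

# Dynamic metadata (HDR10+/Dolby Vision style): the peak is updated per
# scene, so the dark cave isn't dimmed as if it held 4,000-nit highlights.
dynamic = [tone_map(nits, nits, display_peak) for _, nits in scenes]
```

With static metadata, the 120-nit cave scene gets scaled down as aggressively as the 4,000-nit field and ends up crushed to 18 nits; with dynamic metadata, it passes through untouched.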

Dynamic metadata is a big reason why HDR on consoles is so much better than HDR on PC.

Only a few select monitors support Dolby Vision, such as the Apple Pro Display XDR, and none of them are gaming monitors. There are a few HDR10+ screens out there, but they’re exclusively Samsung’s most expensive displays. The vast majority of monitors only deal with static metadata. TVs and consoles widely support Dolby Vision, however, which is a big reason why HDR on consoles is so much better than HDR on PC.

As former game developer and Dolby Vision gaming product manager Alexander Mejia points out, static metadata creates a huge problem for game developers: “There are more HDR TVs, monitors, and laptops on the market than ever before, but grab a pair from a big retailer and your game will look drastically different on each one. … How do you know that the look you set in your studio will be the same as what the player sees?”

HDR comparison in Devil May Cry 5.
Jacob Roach / Digital Trends

On my Samsung Odyssey G7, for example, Tiny Tina’s Wonderlands looks dark and unnatural with HDR turned on, but Devil May Cry 5 looks naturally vibrant. Search user reports for these two games, and you’ll find everything from “best HDR game ever” to complaints of terrible picture quality.

It doesn’t help that HDR is usually an afterthought for game developers. Mejia wrote that developers “still need to offer a standard dynamic range version of your game – and creating a separate version for HDR means more mastering, testing, and quality assurance. Good luck getting sign-off on that.”

There are many examples of developers being indifferent to HDR. The recently released Elden Ring, for example, shows terrible flickering in complex scenes with HDR and motion blur turned on (above). Turn off HDR, and the problem goes away (even with motion blur still on). And in Destiny 2, HDR calibration was broken for four years. HDTVTest found that the slider did not set brightness correctly back in 2018, and the issue was only fixed in February 2022 with the release of The Witch Queen expansion.

Games are one source of HDR problems on PC, but they’re only part of the story: one that stems from a gaming monitor market that seems frozen in time.

The screen problem

Alienware QD-OLED display in front of the window.

Even with the many HDR errors Windows has caused over the past few years, screens are the main source of HDR problems. Anyone familiar with display technology can list the issues without a second thought, and that’s the point: After years of HDR screens flooding the market, displays are mostly in the same place they were when HDR technology first landed on Windows.

Conventional wisdom has been that good HDR requires at least 1,000 nits of peak brightness, which is only partially true. Brighter screens help, but only because they can push higher levels of contrast. For example, the Samsung Odyssey Neo G9 is capable of twice the peak brightness of the Alienware 34 QD-OLED, yet the Alienware display offers much better HDR due to its significantly higher contrast ratio.

There are three things a monitor needs to achieve good HDR performance:

  1. High contrast ratio (10,000:1 or higher)
  2. HDR dynamic metadata
  3. Expanded color gamut (above 100% sRGB)
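The checklist above is easy to run against a spec sheet. Here’s a hypothetical sketch (the field names and all of the spec numbers are illustrative assumptions, not measured values for any real monitor):

```python
# Hypothetical spec-sheet check against the three HDR requirements.
# Field names and numbers are illustrative assumptions, not measurements.

def good_hdr(spec):
    return (
        spec["contrast_ratio"] >= 10_000      # 10,000:1 or higher
        and spec["dynamic_metadata"]          # HDR10+ or Dolby Vision
        and spec["srgb_coverage_pct"] > 100   # expanded color gamut
    )

typical_va_monitor = {
    "contrast_ratio": 3_000, "dynamic_metadata": False, "srgb_coverage_pct": 120,
}
qd_oled_monitor = {
    "contrast_ratio": 1_000_000, "dynamic_metadata": True, "srgb_coverage_pct": 140,
}

print(good_hdr(typical_va_monitor))  # fails on contrast and metadata
print(good_hdr(qd_oled_monitor))     # passes all three checks
```

Note that most monitors sold as “HDR” fail the very first check, which is the crux of the complaint in this section.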

Televisions like the LG C2 OLED are very desirable for console gaming because OLED panels offer huge contrast ratios (1,000,000:1 or higher). Most LED screens top out around 3,000:1, which isn’t enough for solid HDR. Instead, these monitors use local dimming — independently controlling the backlight in certain sections of the screen — to increase contrast.
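The arithmetic behind those ratios is simple: contrast is just peak white luminance divided by black level, so lowering the black level matters as much as raising the brightness. A back-of-the-envelope sketch, where all the numbers are illustrative assumptions:

```python
# Contrast ratio = peak white luminance / black level (both in nits).
# All numbers here are illustrative assumptions, not measurements.

def contrast_ratio(white_nits, black_nits):
    return white_nits / black_nits

# Edge-lit LED panel: the backlight is always on, so blacks glow slightly.
edge_lit = contrast_ratio(350, 0.12)     # roughly 2,900:1

# Local dimming turns the backlight down in dark zones, dropping the
# black level there and raising the effective contrast in those regions.
dimmed_zone = contrast_ratio(350, 0.01)  # roughly 35,000:1
```

This is also why OLED’s per-pixel light control wins: a pixel that is fully off drives the black level toward zero, sending the ratio sky-high.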

Color image on LG C2 OLED screen.
Dan Baker / Digital Trends

Even premium gaming monitors (more than $800) don’t come with enough zones. The LG 27GP950-B only has 16, while the Samsung Odyssey G7 has an embarrassing eight. To get a really high contrast ratio, you need far more zones, like the Asus ROG Swift PG32UQX with more than 1,000 local dimming zones – a monitor that costs more than building a new PC.

The vast majority of HDR screens don’t even scratch the bare minimum. On Newegg, for example, 502 of the 671 HDR gaming monitors currently listed only meet VESA’s DisplayHDR 400 certification, which doesn’t require local dimming, an extended color gamut, or dynamic metadata.

Example of local dimming on a Vizio TV.

Paying a premium for a premium experience is nothing new, but that has been the situation for four years now. Instead of premium features trickling down to the mainstream, the market is flooded with screens that can advertise HDR without offering any of the features that make HDR tick in the first place. And the monitors that squeeze those features in under $1,000 typically cut corners to do so, with few local dimming zones and poor color coverage.

There are exceptions, like the Asus ROG Swift PG27UQ, which offers a great HDR gaming experience. But the point remains that the vast majority of screens available today are not much different from those available four years ago, at least in terms of HDR.

Light at the end of the tunnel

Ultra-wide QD-OLED curved screen.

The HDR experience on PC has been mostly flat for four years, but that’s changing thanks to an exciting new display technology: QD-OLED. As the Alienware 34 QD-OLED shows, this is the panel technology that will truly drive HDR in PC gaming. And good news for gamers: you don’t have to spend north of $2,500 to get it.

MSI just announced its first QD-OLED monitor with specs identical to the Alienware’s, and I suspect it uses the exact same panel. If so, we should see a wave of 21:9 QD-OLED displays by the beginning of next year.

We’re seeing more OLED screens as well, like the recently announced 48-inch LG 48GQ900. They’re TVs marketed as gaming monitors, sure, but display makers are clearly in tune with gamers’ demand for OLED panels. Hopefully we’ll see some of them at proper monitor sizes.

There are other display technologies that improve HDR performance, such as mini-LED. But QD-OLED is the seismic shift that we hope will finally make HDR a reality for PC gaming.
