Should you prioritize the pixel density of 4K displays or the eye-searing brightness of HDR when you’re shopping for a monitor? We explain everything you need to know.

4K and HDR are the peanut butter and jelly of modern televisions. Where you find one, you’ll find the other. Together, they’re the recipe for a delicious eye-candy sandwich.

Monitors aren’t so lucky. While it’s possible to find 4K HDR monitors, they are few in number and expensive. You’ll likely need to choose which feature is more important to you. Here’s how to decide.

The basics: What do 4K and HDR mean?

4K is shorthand for a display’s resolution. It typically describes a 16:9 aspect ratio with a resolution of 3,840 x 2,160. Companies occasionally get creative with marketing but, in most cases, will adhere to this definition.
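
If you want a sense of what that adds up to, the raw pixel math is simple. Here's a rough Python sketch (just illustrative arithmetic, nothing you need to run) comparing 4K with the other common 16:9 resolutions:

    # Total pixels at common 16:9 resolutions.
    resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
    for name, (width, height) in resolutions.items():
        print(f"{name}: {width * height:,} pixels")
    # 1080p: 2,073,600 / 1440p: 3,686,400 / 4K: 8,294,400 - four times the pixels of 1080p.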

High Dynamic Range describes content that provides a wider range of luminance and color than previously possible. This allows brighter scenes with more contrast and colors. This term does not always refer to a specific standard, though, so it can be a bit fuzzy. Read our complete guide to HDR on your PC if you want to know the details.
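
To make "wider range" a little more concrete, one piece of it is bit depth. SDR content is typically 8-bit per color channel, while HDR formats commonly use 10-bit and Dolby Vision allows up to 12-bit. A quick sketch of why that matters:

    # Shades available per color channel at common bit depths.
    for label, bits in (("SDR, 8-bit", 8), ("HDR10, 10-bit", 10), ("Dolby Vision, 12-bit", 12)):
        print(f"{label}: {2 ** bits:,} shades per channel")
    # 256 vs. 1,024 vs. 4,096 steps - more steps mean smoother gradients in bright and dark scenes.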

If you’re new to display terminology in general, our guide on what to look for in a gaming monitor can also help get you up to speed.

4K vs. HDR for a monitor: What’s more important?

The answer to this question is definitive. 4K is almost always more important than HDR. Most people should heavily favor a 4K display over one that offers HDR if forced to choose between them.

Why? It all has to do with standardization and software support (or the lack of it).

4K is not an especially common resolution even among computer monitors, but it’s nothing new. The first mainstream 4K monitors hit store shelves in 2013. Windows 10 launched with good interface scaling support that made 4K resolution easy to use, and it has received additional updates to improve scaling over time. MacOS also has excellent scaling support for 4K resolution due to Apple’s focus on high pixel density displays.
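
Scaling is worth a quick illustration. At a given scaling percentage, the operating system sizes the interface as if the screen had fewer "logical" pixels, while text and images still render on the full 4K grid. A rough sketch of the effective workspace at the settings Windows typically offers for a 4K display:

    # Effective "logical" workspace for a 3840x2160 display at common Windows scaling settings.
    width, height = 3840, 2160
    for scale in (100, 125, 150, 175, 200):
        factor = scale / 100
        print(f"{scale}%: {round(width / factor)} x {round(height / factor)}")
    # At the popular 150% setting, UI elements are sized as they would be on a 2560x1440 screen.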

Windows’ scaling makes a 4K monitor look great even when browsing the web or writing in Word.

HDR is less mature. AMD, Intel, and Nvidia moved to support it only in 2016, but Windows didn’t add an HDR toggle until 2017. Windows still can’t automatically detect an HDR monitor and enable the appropriate settings, though that feature is expected to arrive soon. HDR support in monitors is the wild west. Only the VESA DisplayHDR certification (which is entirely optional) offers a hint of standardization.

It’s a similar story with content. 4K content is not universal but it’s generally easy to find. Virtually all games, even those that are several years old, support 4K resolution. Major video streaming services support 4K resolution, too.

HDR support is less common. Only the latest games are likely to embrace it. Many streaming services don’t yet support HDR streaming to a PC, and there’s some setup to do even when it’s possible.

Most HDR monitors suck

HDR has another problem. Most HDR monitors sold today are really, really terrible at HDR.

As mentioned, High Dynamic Range enables a wider range of luminance and color. But you’ll only enjoy the full benefits on a display with a range of brightness, contrast, and color approaching what HDR standards enable. Most computer monitors do well in color, but falter in brightness and contrast.

Different HDR standards set different ceilings on maximum brightness, but even at the baseline you can expect content mastered for up to 1,000 nits, and Dolby Vision HDR can describe brightness up to 10,000 nits. Yet most computer monitors max out around 400 nits or less.

The Asus ROG Swift PG32UQX is among the few monitors with excellent HDR support.

Monitors also tend to have poor contrast, which limits the difference between the brightest and darkest areas of the image. Few monitors include the Mini-LED or OLED technology found in modern televisions. There are exceptions, like the 32-inch Asus ROG Swift PG32UQX, but it’s priced higher than most 85-inch televisions.
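
Contrast ratio itself is simple math: peak white luminance divided by black-level luminance, both in nits. The figures below are ballpark numbers assumed for illustration, not measurements of any particular monitor:

    # Contrast ratio = peak white luminance / black luminance (both in nits).
    def contrast_ratio(peak_nits, black_nits):
        return round(peak_nits / black_nits)

    print(contrast_ratio(400, 0.40))   # ~1,000:1, typical of a standard IPS monitor
    print(contrast_ratio(600, 0.06))   # ~10,000:1, roughly what good local dimming can manage
    # OLED black levels approach zero, so its contrast is effectively unlimited.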

You can view HDR on any monitor that accepts an HDR signal, no matter its capabilities, but you won’t see its full potential unless the monitor is of exceptionally high quality. A monitor with sub-par HDR will look different from SDR – but it won’t always look better.

Who should care about HDR?

I’ve laid out a damning case against most HDR monitors, and for good reason. HDR doesn’t make sense for most people right now. So, is there any situation where it makes sense to pick HDR over 4K?

There are two cases where HDR becomes critical.