While the number of households with a 4K TV has soared over the past few years, the process of upgrading is hardly cut and dried.
With technical jargon like Ultra HD, HDR, local dimming, and 10-bit color depth, it’s hard to know exactly what you’re getting yourself into, and whether the TV you’re about to buy is the right fit for your needs.
HDR, 4K’s partner in crime, makes things particularly puzzling for home theater setups – many TVs on the market lack HDR entirely, and those that do support it often miss the mark.
If you’re just venturing into the land of higher fidelity, you might be wondering whether 4K on its own carries the same punch without the rest of the still-developing TV tech.
Keep reading for a dive into what makes the jump in resolution worthwhile, and whether it’s worth considering for gaming without HDR.
A generational jump in image quality
The move to LCD monitors and TVs during the Xbox 360 and PS3 generation provided a seemingly endless supply of clarity when matched against the grainy CRT TVs of prior generations. From Blu-rays to broadcast television, it was the sort of achievement that didn’t need any technical backing to send a message.
While 4K (3840 × 2160) packs 6,220,800 more pixels than 1080p (Full HD), the jump doesn’t quite compare to the one from sub-HD resolutions.
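If you want to sanity-check that figure, the arithmetic falls straight out of the two resolutions themselves – a quick sketch:

```python
# Quick sanity check on the resolution math
uhd = 3840 * 2160      # 4K (Ultra HD): 8,294,400 pixels
full_hd = 1920 * 1080  # 1080p (Full HD): 2,073,600 pixels

print(f"Difference: {uhd - full_hd:,} pixels ({uhd // full_hd}x the total pixel count)")
# -> Difference: 6,220,800 pixels (4x the total pixel count)
```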
A generational leap in quality? Definitely.
It just won’t match up to seeing Planet Earth on Blu-ray for the first time after years of settling for an unsightly tube TV.
That’s not to say that 4K isn’t worth the upgrade, but there are a few caveats to the technological transition this time around.
For one, larger televisions are going to make things significantly more impactful. Anything 55 inches or more is generally the sweet spot, and smaller TVs will naturally require the viewer to sit closer for the full effect of 4K.
Most TVs sold at retailers like Best Buy are now 4K, so an upgrade often brings improvements in other factors that affect picture quality, but the leap in resolution alone is not as bold as most marketing would lead you to believe once size and viewing distance are factored in.
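To put rough numbers on the size-and-distance caveat, here’s a back-of-the-envelope sketch. The 60-pixels-per-degree threshold is a commonly cited approximation of 20/20 visual acuity, and the 55-inch / 8-foot example is just that – an example, not a measurement from any particular setup.

```python
import math

def pixels_per_degree(diagonal_in, horizontal_pixels, distance_in):
    """Rough angular pixel density for a 16:9 screen viewed head-on."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)            # screen width from diagonal
    fov_deg = math.degrees(2 * math.atan(width_in / (2 * distance_in)))
    return horizontal_pixels / fov_deg

# 55" TV viewed from 8 feet (96 inches)
for name, h_pixels in [("1080p", 1920), ("4K", 3840)]:
    print(f"{name}: ~{pixels_per_degree(55, h_pixels, 96):.0f} pixels per degree")

# 20/20 vision resolves roughly 60 pixels per degree, so at this distance
# 1080p (~68 ppd) already sits near the limit of what the eye can pick apart;
# sitting closer or going bigger is what makes the extra 4K pixels visible.
```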
One of the added benefits of upgrading to 4K streaming (YouTube, Netflix, Amazon Prime, etc.) is an overall increase in picture quality. In addition to clarity, 4K streams often produce less artifacting, better contrast, and a greater level of depth.
Netflix, for example, increases the bitrate of 4K streams to roughly 16 Mbps, nearly 3 times that of 1080p streams.
It’s something that often goes overlooked in the race to sell premium displays, but it’s an aspect that’s immediately noticeable regardless of any other confounding factor (TV size, viewing distance, OLED vs LED, etc.).
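For a feel of what those bitrates mean in practice, the sketch below runs the numbers, assuming a typical 1080p Netflix stream of roughly 5–6 Mbps (the exact figures vary by title and device).

```python
def gb_per_hour(mbps):
    # megabits per second -> gigabytes per hour
    return mbps * 3600 / 8 / 1000

# 5.8 Mbps for 1080p is an assumed ballpark figure, not an official spec
for label, mbps in [("1080p (~5.8 Mbps)", 5.8), ("4K (~16 Mbps)", 16)]:
    print(f"{label}: ~{gb_per_hour(mbps):.1f} GB per hour")

print(f"4K carries roughly {16 / 5.8:.1f}x the data of 1080p")
```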
Is 4K without HDR worth it for gaming?
While PC is the only platform where 4K is guaranteed (budget permitting), the PS4 Pro and Xbox One X have also been able to deliver a healthy number of Ultra HD experiences through the use of checkerboard rendering techniques.
The Xbox One X has also been able to deliver a handful of native 4K games thanks in part to its 6 teraflop GPU.
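As a very rough illustration of the checkerboard idea: each frame the GPU shades only half the screen in an alternating checkerboard of 2×2 pixel blocks, and the missing half is filled in from the previous frame. This is a toy sketch only – real console reconstruction also leans on motion vectors, depth, and edge-aware filtering, none of which is shown here.

```python
import numpy as np

def checkerboard_mask(height, width, frame_index):
    """True where this frame's pixels are freshly shaded (2x2 blocks, flipped each frame)."""
    ys, xs = np.indices((height, width))
    return ((ys // 2 + xs // 2 + frame_index) % 2) == 0

def reconstruct(shaded, previous_frame, mask):
    """Combine newly shaded pixels with last frame's pixels into a full image."""
    return np.where(mask[..., None], shaded, previous_frame)

# e.g. frame 0 shades one half of an 8x8 region, frame 1 shades the other half
print(checkerboard_mask(8, 8, frame_index=0).astype(int))
```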
So, is it worth upgrading without HDR?
Making the move to a 4K-capable console is absolutely worth it if you’re someone who appreciates incredibly crisp image quality and the removal of artifacts like “jaggies”. To put it simply, playing on the Xbox One X or PS4 Pro makes games look clean, even when checkerboard rendering is the driving force.
Own a TV that’s 55 inches or bigger? Sit fairly close?
The jump from 1080p to 4K is going to be even more noticeable, packing a significant “wow” factor.
Many HDR implementations in gaming are either mediocre or botched entirely, making an increase in resolution that much more compelling for most players.
Bigger screens don’t necessarily fall apart at 1080p, but clarity becomes the limiting factor in overall picture quality.
Another thing to keep in mind is that while consoles offer some of the best examples of HDR, developers are still wrapping their heads around the tech. Plenty of HDR implementations on PS4 Pro and Xbox One X range from mediocre to downright broken – Red Dead Redemption 2 being one of the more recent examples.
These mid-gen refresh consoles also offer much more than 4K and HDR, packing a substantial boost in performance (FPS) in many cases, along with dialed-up graphical enhancements in some titles. The PS4 Pro even packs an additional setting dubbed Boost Mode to increase performance in games that aren’t officially enhanced.
The best part about these enhancements is that, in most cases, players have a choice between the two – typically a resolution mode and a performance mode.
This level of customization, while still behind what PC offers, hasn’t been seen on consoles before.
Which brings us to 4K PC gaming – is the move to 4K alone worth the upgrade?
Things are a little trickier on the PC front, because the hardware needed to drive 4K at high frame rates is quite expensive.
It’s also harder to notice the difference between 4K and 1080p when you’re gaming on a smaller monitor. The overall driving force of whether it will be noticeable for you is the size of your monitor and how far away you sit from it.
As with TVs, the bigger the better.
Many PC gamers find there is a balance to be struck between resolution and framerate – a popular option is to put the extra horsepower toward a 1440p (QHD) 144Hz monitor instead of going for 4K.
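The math behind that tradeoff is worth spelling out: 4K pushes 2.25 times the pixels of 1440p, so very roughly speaking, the GPU budget that struggles at 4K has a lot more headroom for frame rate at 1440p.

```python
# Pixel counts for common gaming resolutions
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

uhd_pixels = 3840 * 2160
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels (4K is {uhd_pixels / pixels:.2f}x the pixel count)")
```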
The beauty of PC is that there are options. If you want, you can hook up your powerful gaming rig to a big screen TV to reap the full benefits of 4K.
What you need to keep in mind when it comes to PC gaming and HDR is that, at the moment, you might not be missing out on very much by foregoing the display tech. The implementation pales in comparison to consoles and the experience still has quite a few bugs and kinks.
I’ve personally experienced this when attempting to get Windows 10 to play nice with Forza Horizon 4’s HDR setting. What was always seamless with my PS4 Slim quickly became an insufferable hassle.
Another reason to wait for the tech to mature is that the quality of HDR on monitors is outclassed by TVs. The full-array local dimming needed to drive the contrast behind true dynamic range is largely absent from monitors, and monitors generally have poor contrast ratios compared to TVs.
Putting it all together
Upgrading to 4K gaming will never match up to the transition from the sub-HD era, but the generational leap is more than enough for most players. When done right, HDR is a game-changing feature for TVs, but the main drawbacks to the tech are:
- Many mediocre implementations across all platforms
- A high barrier to entry for TVs that do HDR right
And while not mind-melting, 4K-oriented consoles not only deliver an increase in fidelity and performance, but also give players a level of freedom not seen before. PC players may want to settle for the middle ground of performance and resolution (1440p, 144Hz, variable refresh rate, etc.).
Player choice continues to reign supreme in the land of gaming PCs, especially as companies like AMD roll out generational upgrades to GPUs.
Regardless of platform, the increase in picture quality with Ultra HD Netflix, YouTube, and Amazon Prime makes 4K more of an all-encompassing entertainment upgrade as opposed to remaining a niche use case.