I recently got an HDR-supporting monitor, and for video it’s a really nice effect. Some of the supposedly HDR-enabled games just use it for a sun-drenched, washed-out look though. As a new tool, it’s bound to get misused and over-applied.
I love how it absolutely wrecks and washes out things like Steam notifications or other UI elements that appear in overlays during fullscreen gameplay. Kinda like the illegible “achievement unlocked” popups on the X360 when playing games in 3D.
The earliest example I can remember is the ending of Annihilation on an OLED TV. It’s metadata that tells the TV how to adjust brightness in the dark and bright areas of the picture.
I got a TV that supports it. I played Half-Life 2: Lost Coast. I cannot tell that it’s doing anything.
There was a visual difference at the start of RDR2, so it has been on my mind. “Okay,” I said.
The HDR that Lost Coast is doing is something different from HDR in the monitor sense. I learned the appeal of HDR when I saw the Netflix startup logo in HDR; it was a very rich red color.
Ohhh, I was confused when Rudie said 17 years. Lost Coast is emulating HDR photography exposures in its lighting rendering (think Flickr wallpapers). HDR as video metadata has only been around in consumer gear since ~2016 or so.
Results vary between display technologies because of their ability to show shades of black or crank brightness in specific parts of the image.
Oh okay, this is absolutely an audiophile thing, got ya. Visuaphile? Either way, better off not giving a shit.
Yeah, the most impressive use of what people describe as HDR that I’ve ever seen is the original Nier on the Xbox 360/PS3 on my Mitsubishi DLP. And that’s not even actual HDR, so…
Can we just call the modern use of HDR “even better contrast” or is that not accurate?
It’s “tone mapping” metadata that tells the TV to enhance the darkest and brightest areas by modulating brightness. It could technically produce lower contrast if the hardware playing it back can’t put out enough brightness, e.g. a “VESA DisplayHDR 400” monitor that peaks at 400 nits with no local dimming.
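As a toy illustration of that lower-contrast failure mode (the knee and numbers here are made up; this isn’t the actual PQ/EOTF math from any HDR standard), here’s roughly what a 400-nit panel has to do to highlights mastered for 1000 nits:

```python
# Hypothetical sketch: content graded for 1000-nit highlights has to be
# rolled off to fit a 400-nit panel, compressing exactly the range HDR
# was supposed to add. Knee position and curve are illustrative only.

def rolloff_to_display(nits: float, display_peak: float = 400.0,
                       knee: float = 0.75) -> float:
    """Soft-clip mastered luminance into the display's range."""
    knee_nits = display_peak * knee  # pass values through below this point
    if nits <= knee_nits:
        return nits
    # Compress everything above the knee into the remaining headroom,
    # approaching (but never reaching) the panel's peak brightness.
    excess = nits - knee_nits
    headroom = display_peak - knee_nits
    return knee_nits + headroom * (excess / (excess + headroom))

for mastered in (100, 300, 600, 1000):
    print(f"mastered {mastered:4d} nits -> displayed {rolloff_to_display(mastered):5.1f} nits")
```

A 600-nit highlight and a 1000-nit highlight land barely a dozen nits apart on that panel, which is how “HDR” footage can end up looking flatter than SDR.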
The different uses of HDR in photography versus HDR in TV tech, and the way video games have adopted both meanings over the years, make the whole thing super confusing.
the HDR that has existed in games for the past 15 or so years is part of the rendering pipeline: the scene is lit at high dynamic range, which still gives you the effects HDR is known for (a good example is your vision blowing out when you look at the sun), but the result gets crunched down to SDR tone space. a large part of why AutoHDR works is that everyone has been doing HDR passes, so technically the data to build a tone map is already there; the issue was intercepting it before it gets crushed down to SDR, or telling the device to output the HDR data
which is why I’m understandably confused that a game like Uma Musume, which is 85-90% 2D assets, was able to get hooked by the AutoHDR process (my base assumption is they never flicked a switch in Unity, or there’s simple HDR metadata for iPhones and higher-end Android devices left kicking around in the port)
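Here’s a minimal sketch of that crunch-down step, using the classic Reinhard operator as a stand-in for whatever curve an engine actually uses (this is just the shape of the idea, not any specific game’s tonemapper):

```python
# The scene is lit in linear HDR, so luminance can go way past 1.0
# (staring at the sun), then a tone-mapping operator squeezes it
# into the 0..1 range an SDR display can show.

def reinhard_tonemap(hdr_luminance: float) -> float:
    """Classic Reinhard operator: maps [0, inf) into [0, 1)."""
    return hdr_luminance / (1.0 + hdr_luminance)

for l in (0.1, 1.0, 10.0, 50.0):
    print(f"HDR {l:5.1f} -> SDR {reinhard_tonemap(l):.3f}")
```

The point is the data flow: everything feeding the left side of that function is the HDR data AutoHDR wants to intercept before the squeeze happens.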
It’s not that confusing on an iPhone, where you have HDR on both the camera and the attached screen and can see a greater range from dark to bright in your photos, with fewer pitch-black or blown-out white areas.
As soon as you send those HDR photos to a different device and try to represent the “HDR”-ness on a legacy display, though, it gets confusing and underwhelming. Same if it’s the display that’s HDR but not the source image. And the vast majority of things branded “HDR” are one of those partial situations.
Is HDR mode on an iPhone not just automatically doing the old digital photography trick of compositing three images taken at different exposures?
Yeah, as usual, but it’s the screen color range that’s wide, and universally assumed to be wide by iPhone software, which is why I mention the iPhone specifically. Whereas even if you get a wide-color-range display on another OS, it’s hard to be sure some piece of software or image format isn’t undermining the effect, since it’s not the baseline case.
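For the bracketing question a couple messages up, a toy version of that merge looks something like this (closer to “exposure fusion” than true radiance recovery, and every weight here is invented for illustration):

```python
# Merge the same pixel from under/normal/over-exposed shots, trusting
# whichever samples are closest to mid-gray and discounting crushed or
# blown-out ones. Weights are made up; real camera pipelines differ.

def exposure_weight(value: float) -> float:
    """Trust pixels near mid-gray (0.5); distrust 0.0/1.0 extremes."""
    return max(1e-6, 1.0 - abs(value - 0.5) * 2.0)

def fuse_pixel(exposures: list[float]) -> float:
    """Weighted average of one pixel across bracketed shots."""
    weights = [exposure_weight(v) for v in exposures]
    return sum(v * w for v, w in zip(exposures, weights)) / sum(weights)

# -2EV, 0EV, +2EV samples of the same pixel: the blown-out 1.0 sample
# contributes almost nothing, so highlight detail survives the merge.
print(fuse_pixel([0.20, 0.75, 1.00]))  # ~0.51
```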
If anyone has an ASRock board and needs a hardware TPM, let me know. I feel dumb for missing that PTT would be enough, but not that dumb, because I got in before the absurd price hike
I have spent an hour with my HDR-compatible TV and PS5 flipping HDR on and off, trying to notice the difference, and I just cannot do it. Gonna try the Netflix logo test and see if that helps
it’s an enormous difference if your TV has the brightness to support it; it should make SDR stuff look crayon-saturated by comparison
it’s kind of a subtle thing and really only stands out if your display can push a ton of nits right into your eyes, but it’s definitely a game changer when it’s used properly
it’s the ray tracing of display tech
I keep it disabled on my projector unless I’m watching a movie with the lights off late at night, then it rules, but it’s not worth the hit to legibility otherwise