Why hasn't this taken off in gaming?

It's easily the biggest upgrade to graphics you can get.

HDR looks fucking horrible and fucks up readability in competitive games, so there's that.

Obviously displaying the same image but with a grey filter that says

This is you

is a pretty poor advertisement

It looks good if you actually have an HDR monitor.
Most "HDR" monitors can't actually do HDR

Maybe I'm blind here but the left and right picture 100% look exactly the same to me

Then you didn't have an HDR monitor. Literally only miniLED and OLED can actually do HDR. Mini LED hardly does it. MicroLED will actually be able to.

Otherwise a dual polarized monitor can. But at that point OLED wins in every regard since DPLCDs are overpriced.

White balance is the same in both, defeating your small-brain argument.

Just got my 55 inch LG G4 and anyone who says HDR isn't good is a retard

PC monitors in general have extremely misleading advertising, and it takes advanced autism to know when the spec sheet is trash and the panel is hardly better than an ancient TN display.

price

That's by design

As a PC gamer I do not understand the HDR meme. I just use reshade if colors are messed up and then I fix them. I also lower the brightness of my monitors so I don't burn holes into my eyes.

HDR just seems retarded to me. It seems to be all about being reliant on a dev and searing holes in your eyeballs.

HDR for TVs and movies is amazing. HDR for video games and monitors is terrible 9/10 times. HDR is extremely underutilized in the vidya industry for some reason, and even when developers do implement it, it is usually some extremely half-baked bad version.

HDR is better in video games than movies or TV.

There are no HDR monitors that can do that, they all have ABL on a 100% full white screen

searing holes in your eyeballs

you realise outside is thousands of nits and no monitor gets that bright

It used to be like that 5 years ago but HDR in video games is stellar now

It's kind of hard to convey what HDR looks like in an image, in the same way you can't convey 120 FPS in a 60 FPS video, but the point of HDR is that it makes light areas lighter and dark areas darker.

very few games actually support hdr

hdr doesnt work on linux (((yet)))

the few games hdr does work in, they usually fuck it up and make it look awful

t. owns an hdr monitor

I don't even have an HDR monitor myself but this reads like cope
That's like saying you don't get the 144hz meme

Yeah and it fucking sucks.

hdr doesnt work on linux (((yet)))

Works on my machine

have HDR monitor

turn on HDR

everything is too dark and murky no matter how much I mess with my settings and monitor

turn SDR back on and enjoy playing the game

sucks

wolfermelon.jpg - 640x486, 70.68K

Because you need an actual OLED monitor/TV for it to not be a meme. You think your average gaymers can actually afford them when they couldn't even buy a Switch 2?

oh shit it actually works now? i had no idea lmao

Your monitor can accept HDR, but it is not an HDR monitor.
Only miniled and OLED can do it.

I don't know if you can do it on nVidia or gnome yet. You need AMD + KDE.

oled > hdr

Also you need wayland. You can't use X11 if you want HDR. That's why you're fucked if you try with nVidia.

OLED is HDR

I have an OLED HDR monitor. It sucks. It's a sucky setting that sucks. Stop sucking HDR's sucky dick.

doffle.jpg - 540x540, 56.53K

You need AMD + KDE.

welp, back to waiting then i guess

what monitor do I get if I want a good hdr experience and no burn-in issues

You need to enable HDR 1000 mode.

More like OLED enhances HDR.

no burn-in issues

Then your only choice is miniLED. OLED will always inherently have a chance of burn-in.

Enjoy the holes in your eyes.

I don't really care about fps above 60 either.

People who don't understand how good HDR can be tend to be too poor to afford a proper OLED monitor/TV that can output true HDR instead of whatever "HDR" their LCD monitors claim they can do.

Because there's no standard, and games always half-ass their HDR implementation and expect the end user to spend hours tweaking it themselves for their specific monitor. Nobody's autistic enough for that.
SDR just works.

I hate it, and always disable it. Otherwise it's a delightful round of "WHEE TIME TO TURN UP THE GAMMA SO I CAN FUCKING SEE" anywhere it's dark, which of course makes the bright areas too bright. Same type of cancer as chromatic faggeration or brown n bloom as far as I'm concerned.

and fucks up readability in competitive games

The exact opposite, when calibrated properly.

Then you didn't have an HDR monitor

HDR monitor is a requirement in order to use HDR.

Most reddit response I've read today, go back

firstly, virtually every modern game supports hdr; the only one in recent memory that doesn't is kcd2, because it's based on an ancient version of cryengine
secondly, thirdies can't afford oled panels, so don't bother asking here
thirdly, its quality is completely based upon developer aptitude, and funnily enough cyberpunk in your example image is a game with a terrible hdr implementation. an example of a good one is the newest ratchet and clank title
fourthly, it sucked cock on PC until very recently, with the addition of the windows hdr calibration tool and RTX HDR, which basically fixes everything
also no one's opinion is valid unless they show proof of their panel's capabilities

I don't really care about fps above 60 either.

case closed lol

I don't really care about fps above 60 either

Choose one:

have never seen a high refresh rate monitor

neurological problems making it impossible to tell between different framerates

pure cope

I'm just not going to spend $1000+ on something that I can ruin so easily. I am on this thing all day every day, so there's no way I can keep it from happening

Nintendo will finally push HDR to mainstream normies since it works in docked and handheld

hdr.png - 1702x961, 2.01M

Windows

HDR 400 true black

lmao

okay eyelet

I have a 144hz display and I cap the frame rate at like 75 or 90 usually just to save energy and gpu lifespan lol. I genuinely don't care, I haven't yet played many games where it felt much better. Gravity Rush 1 was one of them, that was pretty cool. TLG too but it's not playable yet.

You don't need more than HDR 400

fake image, the game doesn't look as dim as the right side

human eye can't see any meaningful difference over 60 fps

you can't show hdr online, especially on Anon Babble since they strip out all icc data. so it's all exaggerated. this is the best example.

sdr vs hdr.gif - 1587x1160, 3.96M

That's not HDR

anything above 400nits of brightness is unsafe for human eyes and a marketing meme

I always turn off HDR lol.

HDR certification on monitors that can't do any form of per pixel backlight

Windows does not support showcasing both color ranges side by side in 10, and arguably not in 11 either

It will be a good showcase at the least.

You may as well just use a VA panel at that point. Why buy OLED when you aren't even going to use it properly?

It is the only HDR that won't burn your retina

VA panel

An objectively superior technology to OLED.

Not many games support HDR

there is not one singular HDR standard; there's HDR 10 through 500 and it's a complete dice roll whether it's the hdr it says on the box

anything above 400nits of brightness is unsafe for human eyes

ayylmao.jpg - 160x122, 11.25K

Yeah, I'm white.

Because your image is straight up lying; going for an OLED monitor is a way bigger upgrade than HDR, and HDR can even make games look worse

This is the most retarded post I’ve read in a long time. What makes people so confident in being completely wrong?

You could try vkbasalt. It's not true HDR, just post-processing, but the results are decent.

living in the arctic circle and underground during those 3 months where the sun is out?

there is not one singular HDR standard

The standard is one.
The output is currently 3:
1. Per pixel lighting
2. Not per pixel lighting
3. Capped at 400 NIT
The big question is whether a light value of 1024 will output 400 NIT, 800 NIT, 1000 NIT or more.

There are "HDR monitors" and HDR monitors. I have an OLED and IPS and the latter is unusable in HDR while the former is vastly improved. Without proper backlight segmentation there’s zero point to HDR. Unfortunately most LCD monitors have only one backlight area which is always on. TVs are way ahead on this front. The winning move is just getting an OLED for games/movies and keeping it off when just browsing, assuming you live and work in a first world country and can afford such things.

No one is going to give a shit about what you said

it needs ONE singular standard that goes from shittier to gooder in easy to understand numbers so when any normalfag like me looks at the box he knows what kind of HDR he is getting

i tried hdr in drg on my steam deck and it looked like shit

HDR is obviously a scam since you can show the difference in a simple image on every cheap non-HDR monitor, like in those pics i quoted
It's just color balance, and there doesn't need to be any hardware or "high tech" requirement for it to function
it's the equivalent of stupid audiophiles' 500 USD "cables"

I"ve been doing HDR since the beginning of 3D games by using mods or settings to disable the brown / blue / or whatever current year fashion is of "filters" the devs put on a game, also disabling "bloom" whenever it is intrusively washing out the entire display, and that's it, that's "HDR"

That plus properly setting up your monitor, just read a good review about your model and they'll tell you how to tune it correctly

Eventually OLED and MicroLED are going to outcompete the rest of the market. At that point the question will remain whether HDR400, HDR800 or HDR1000 is going to be the premium sticker on the box.

but won't that be far far into the future

Maybe
Maybe not.

the monitors are too expensive.

wrong but you typed a lot of words

Reading these threads, I can see why HDR is taking so long to catch up.
The average consumer is far too retarded to set it up properly.

because the people working on the game, artists or whoever, need to take the time to manually customize the luminance of stuff, and that takes a lot of time, which is why 90% of HDR games look like shit. It's a lot of extra work for a feature most people don't even use.

actual retard

What are you on about, you're mixing up software HDR with hardware HDR, they might have the same name but they're apples and oranges

how does this improve gameplay?

nah people are just retarded, even non-HDR games look good with Auto-HDR, when calibrated properly

it's essentially about the brights being bright and the darks being dark. You can't do that without a proper OLED or at least a mini-LED.

That shouldn't be the end user's job.
No wonder people are afraid of being replaced by AI if they can't even be bothered to deliver a working product.

"fucks up readability in competitive games"

yokercore.jpg - 1280x720, 62.91K

HDR seems retarded because I can instead just autistically use a 3rd party program to manually color balance each individual game I play

Either better bait than I normally see on this board or just one of the most autistic posts I've seen in a while

Buy two different OLED monitors of the same resolution and refresh rate, both say "HDR 1000"

calibrate equally

both look wildly different

need to recalibrate for every fucking game

This is why hdr will never take off on PC. This is not an issue for televisions: you turn on your brand new 8k oled TV, it gives you an easy-to-follow calibration guide for your hdr and colors, and it's done. Set it and forget it; you can dedicate your precious time to watching kinos or playing your ps5 games

let's take the same screenshot and put a low opacity black filter over it and call it SDR

You wouldn't understand, you eat shit.

all monitors have this problem, just calibrate it better
even two CPUs of the same exact model will have slightly different overclocking ranges

How would you make an example image comparing HDR content with SDR content for people with an SDR only capable display?

just to save energy and gpu lifespan

Actually laughing out loud rn, you're either incredibly autistic or trolling. Good posting either way, I'll use this one in the future.

That's how it works, friend.
Higher numbers = more energy required.

hdr is a meme, but oled isn't. you literally can't go back after experiencing "black blacks"

be on windows 11

calibrate HDR once

Auto-HDR works with every game flawlessly

Damn, I must be a genius

Won't using less energy, which means less heat, make it last longer?

Same type of cancer as chromatic faggeration or brown n bloom as far as I'm concerned.

Completely and entirely unrelated to these things in any sort of comparable way. Seems like you're just a blind retard from this post.
The most amazing thing I've parsed from this thread is that the retards on here will even somehow manage to use a fucking monitor incorrectly.

>be on windows 11

You don't.
Color A is color A on both images.
The key difference is that an OLED monitor can display red (off), red (400 nits) and red (800 nits)

It doesn't actually make that much visual difference and it looks fucky if you have multiple monitors that aren't the same exact model.
Plus software support is all over the place.

Rather than having HDR switch on and off all the time or having one monitor in HDR the others in SDR, I just leave them all on SDR all the time.

i still can't believe that 'eck has 1000 nit hdr that's never used by default. i played through all of elden ring thinking it was always enabled but it turned out you need to enable it ingame

Turning on HDR freezes my windows 10, and makes my monitor flicker on and off. It's scary tech that I will never understand or get to experience.

HDR is obviously a scam since you can show the difference in a simple image on every cheap non-HDR monitor

Fucking kek, good bait

24H2 update has fixed literally everything

my fucking cheap MSI 120hz monitor has the worst HDR screen ever but the vizio 4k screen in my living room looks really good despite both being LED IPS. why is it that my monitor's screen is so bad with HDR?

I ordered an OLED monitor, how do I go about setting up HDR? Does it only turn on when I run a game on a per game basis or is it always on?

like lights aren't already distracting enough in video games...

it's fine in GTA-likes but that's about it

RGB has invaded art direction and these fucking red and green lights are annoying enough in Doom 2016, preventing a good look at the game. devs already have enough things to think about methinks

Yes. It's funny that he didn't realize this lol.

Because you can emulate HDR's effect on an SDR display, there are no actual standards for HDR displays, HDR implementations for media and software are wildly inconsistent, and HDR is complete fucking dog shit on PC thanks to Windows. Any other questions?
At this point, HDR exists for OLED TVs and consoles. HDR monitors are snake oil for retards. HDR TVs can look very good under the right conditions. :^)

I also just use ReShade for this. More specifically, I take some representative screenshots, edit a neutral LUT into one, color correct them in Photoshop or Da Vinci, and then use the LUT. It's quick and easy.

You don't get to say you don't, how would you do it? How would you show the difference for a potential consumer?

As the person you replied to said, no wonder it's taking so long to catch on. If you really don't think the end consumer can be expected to learn or know anything about any expensive product you're shipping, there's just no hope. Consumers in the modern world are hopelessly braindead.

Multiple monitors are anti-kino devices, they subtract from the experience; you don't need to browse Anon Babble or watch youtube while playing a game

i didn't see you write an argument so i'll assume he's completely right and you're a dumb bitch

you have no idea what hdr is

kek what is this response anon

do you guys even understand what is HDR

you can emulate HDR's effect on an SDR display

retard

HDR is complete fucking dog shit on PC thanks to Windows

only on Windows 10. it works fine if you're not a poorfag and use Windows 11

If you have a console you just toggle it in the settings.
If you have Windows 11 you have to calibrate your OS and display.

You're samefagging but what you don't realize is that your literal and actual autism makes it incredibly obvious, because only an autist would interpret my post the way you have here, kek

Depends. If you're on W10, HDR looks like shit, so you'll want it off most of the time, unless you're actively viewing HDR content. W11 fixed it apparently.

It’s ok to be wrong faggot

OLEDs are disgustingly dim in HDR

Here comes the airpwane~!

Retard.

Make sure you enable HDR 1000 mode instead of HDR 400

HDR is a color format: it's 10-bit while SDR is 8-bit. SDR is only missing 2 bits per channel, but that works out to over 900 million fewer total colors. "HDR" monitors are essentially 10-bit
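
The arithmetic behind that claim, for anyone who wants to sanity-check it (trivial Python):

```python
# 8 bits per channel -> 2^24 distinct RGB values; 10 bits -> 2^30.
colors_8bit = 2 ** (8 * 3)    # 16,777,216
colors_10bit = 2 ** (10 * 3)  # 1,073,741,824
print(f"{colors_8bit:,} vs {colors_10bit:,}")
print(f"difference: {colors_10bit - colors_8bit:,}")  # ~1.06 billion, so "over 900 million" checks out
```

Keep in mind the correction a few posts down, though: the bit depth is there to avoid banding, not what HDR fundamentally is.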

you can emulate HDR's effect on an SDR display

That's like saying you can emulate 120fps on a 60hz display by using motion blur lmao

kek starting to realize this isn't a bit

Most people don't undervolt/overclock their GPUs, or calibrate their TVs.
There's a reason why iPhones are the best selling phone in the world, and why "enthusiast" is even a word used to describe someone if they do more than the bare minimum.
Yes, consumers are the lowest common denominator, but they're also the majority. New tech will never take off unless it works out of the box in an idiot-proof way.

read the thread anon, they still think Oblivion's HDR setting = real HDR

file.png - 640x480, 247.02K

No PC monitor can do HDR properly. Some just suck less than others.

HDR is a color format

No it's not, retard. It uses a wider color space because otherwise you'd have horrible banding, but there's nothing stopping you from going full retard and having HDR with an 8-bit-per-channel image.
HDR is about luminance, brightness.

Why hasn't this taken off in gaming?

cause you HDR fags can't comprehend that SDR only looks bad on HDR displays
cause most HDR monitors have misleading labels in regard to what type of HDR they have and how they achieve it
etc

is HDR better than SDR? obviously it is, that's not even in question. the issue is people claiming SDR looks bad when it absolutely does not on non-HDR displays, and the fact that it's a "standard" that hasn't actually been standardized at all across the various monitors that claim to be HDR, so the result isn't always the same across the board

HDR feels like a cope. Around 2021 I bought an HDR 4k TV, and the only time I can tell a difference is side by side; day to day, with it on or off, it doesn't matter. I feel like this shit is just audiophile levels of "but I can tell the difference". Happy for you, but HDR isn't all that fantastic for gaming. Movies, on the other hand, that's different and it really matters there. HDR just seems to be slapped on, when all you need for a game is solid art direction and more focus on ray tracing than whatever the fuck HDR is trying to pull off.

Muh black levels

Yeah, and I bet that gold plated HDMI cable is providing better picture quality too.

It's wrong to be wrong.
In fact, there's nothing more wrong than being wrong and thinking it's "okay."

I take some representative screenshots, edit a neutral LUT into one, color correct them in Photoshop or Da Vinci, and then use the LUT.

It's quick and easy.

file.png - 1200x675, 893.94K

i compressed the colors for a faux HDR effect using ebin LUTs

now my 8 bits of color show as 6 bits total and trust me bro it looks just like HDR

then what's HDR10? that's supposed to be the real HDR format right?

If you can't tell the difference then your TV is not HDR. It's just advertised as "HDR"

I can see these HDR examples just fine on my SDR screen. Why should I buy a "HDR" display again?

HDR10 is a marketing trick, it has nothing to do with actual HDR.
That's like comparing trans women and women

You could say the same comparing HDR to the human eye.

Think of it like the difference between a NES and a modern game. A NES game has a limited palette, it can only render a limited number of colours and of these only so many at once. HDR is like that for modern displays - we go from 8-bit colour to floating point colour, 8x the data. Obviously, the limitations aren't so obvious these days, but smooth gradients, portraying dark areas in games in a way that doesn't get muddy, and portraying brightness with fine detail aren't possible on 8-bit colour without hacks like dithering.

Like I said, it's cope. You know how long I've heard this shit? If you can, good for you, but if you're the type to keep your music files as .flac, I'm not listening to you.

Is your TV an OLED TV? As people keep repeating in this thread, there are plenty of monitors/TVs that are advertised as "HDR" that don't actually meet the standards for what people are talking about when they talk about HDR.

Are you an idiot? What you are asking for has already happened before.
You would go to a store, and the store would have 1080p content rolling across all the screens, alongside 720p HD-ready and 480p TVs.

And today you would go to a store, and the OLED would have grains of something against a black background, or a night sky, or a very bright Mediterranean scene.

Online marketing material

You don't.

HDR is a color space

lmao. And I love how close it is to the truth.

there are plenty of monitors/TVs that are advertised as "HDR" that don't actually meet the standards for what people are talking about when they talk about HDR.

So HDR is a scam? Cool.

It's a format, not a certification. You can buy LCD driver boards from China that will make your current monitor accept HDR10 format input, but because your display doesn't have a bright backlight and some way to get very dark blacks, it will just look washed out if you enabled HDR.

You're the one coping right now. On a real HDR display you don't need to guess if it is on or not, it is obvious and impossible to miss.

this HDR argument again

lmfao. no current games, let alone movies, are going to magically look better by using cope technology to mask the bad color grading/cinematography/art direction ubiquitous across media
youtube.com/watch?v=EwTUM9cFeSo

real HDR display

real

Like I said, this shit is just "gold plated" bullshit. If you really need to detail "real", then this is just some real videophile bullshit.

You also use 60hz right? No one can see more than 60hz

Why would the LED monitor have a haloing effect that theoretically is present only on minileds? This image isn't a good representation of anything. Probably pure marketing slop.

Dragons Dogma 2 and MHWilds was graded in HDR. That's why the game looks so bad in SDR.

If your monitor cannot produce a large difference between the blackest black and brightest white, then it is not an HDR monitor.
Displaying a 4k video downscaled to your 1080p monitor doesn't make it a 4k monitor. lmao

my LCD display can perfectly show the difference here

but i'm supposed to believe that my monitor can't actually do this in games because i """need""" OLED

Yeah this thread has convinced me HDR as well as OLED is just a retard-pitfall for consoooomers who can't critically think for a few seconds about why an image could somehow display the difference but the actual media couldn't. Have fun consooooooming when the next gimmick bullshit comes out and you try to convince me it's even better than OLED because it finally has 1/4th of the motion clarity of a CRT.

there's not a single game that looks realistic in daytime and that's because everyone's still stuck on their 8-10bit monitors.
it could even be path traced and it would still look like a game.

calls something marketing slop

uses the term "LED monitor"

You fell for marketing and don't even realize it. That's called an "LCD"

So HDR is a scam?

Yeah "HDR" can be a marketing gimmick as a scam. HDR isn't a company or a product or a device or a manufacturer, so I'm not sure why you typed that like I was supposed to be offended or something, but yeah people can scam you by falsely advertising HDR. I assume your TV isn't an OLED then?

You're one of those retards who cries about things you don't have, and then when it is cheap enough for you to have it you pretend you liked it the entire time.
Hating HDR is like hating 120hz or hating 1440p. Massive cope

Retard, do you think an mkv rip of a movie you torrented has the same colour depth as the physical film? HDR is not flashy bullshit, it is a container for holding more data, which is why it is so hard to explain to retarded laymen.

The only real HDR is HDR400 and up.
OLED, miniLED, and even some VA panels can deliver a true HDR image.
IPS monitors can't do HDR.

I don't typically look at them while gaming except to display game-related information like infographics or wikis, or to have discord visible.

For non-gaming use, however, multiple monitors are invaluable.

IPS monitors can do it if they have miniLED backlights

cause i can replicate this with ReShade?

monitors themselves are a fuckin scam ngl.
paying premium price for a monitor doesn't even get you better technology - you just get reshade effects bundled with them that are locked to their proprietary software.

hdr10 simply means that the device can read/output 10bit colors. it says nothing about the actual display

Sure buddy, but the panel I wanted to point out in pic-rel was marked as LED. Get your head out of your ass.

No, because HDR monitors make the bright pixels actually brighter. OP's image just simulates that.

That person is trolling you. I'm convinced that half of the posts in this thread are bait because the reasoning is so easy to come up with if you're trying to think of how to troll on this topic.
Do an exercise for me, and write out a few ways you would bait people about this very subject. Write it down in a notepad. Then, check how many posts you're arguing with are using one of those arguments.

Yeah it was labeled an "LED" because when LCDs got LED backlights, companies realized they could scam people into thinking it's some kind of new display technology.
LED displays are only used in jumbotrons and digital ad signs. You don't own one.

miniled is just the backlight tech used. The panel still is either VA or IPS with IPS requiring, in general, more zones than VA.

Reshade is a post processing filter. Real HDR can only be done at a GPU scaling level.

He's not. Anon Babble is comically retarded when it comes to display tech. The only correct opinion most of this board has is that high refresh rate is good.

OP's image just simulates that.

Oh so you can just simulate HDR with ReShade? Why would I bother with your retarded marketing bullshit if I can just do it for free? Lmfao the self defeating arguments by HDR/OLED cultists trying to justify their buyers remorse is just hilarious. It's legitimately cult like.

There's nothing stopping you from making your screen dim and dull like on the right, yes.

i read that HDR 400 is basically an entry-level HDR format, only going up to 400 nits.
my conception is that HDR is more than just bright/dark luminance; there's still color needed to represent that bright red, and I was wondering if a richer color space helps with that, but I don't know too much about color theory

I thought that at first, but I'm starting to think that half of this thread is just 3 people trolling and 3 other people replying to them seriously. I mean look at how all of the arguments are just repeating the same shit like GPT output, and none of it makes sense or is actually well thought out.

Boost

If it was ChatGPT it would be giving an actual argument.
You see these kinds of retards all over the place. You want good mice? Scam. Good keyboards? Ripped off. Good GPU? Scalped.
They're genuinely seething that someone has something they don't have. Sour grapes.

i read that HDR 400 is basically an entry-level HDR format, only going up to 400 nits.

that's correct

This. People who are laughing at the idea of just mimicking HDR are ignoring the part where every other fucking game doesn't actually have well-implemented HDR or even good color grading. Hell, most of these games don't even use standard SDR sRGB color space well, let alone Rec. 2100. For fuck's sake, look at the black levels in games like Starfield or Monster Hunter Wilds. The blacks are so raised it's like a fucking baptist revival. On the opposite end of the spectrum, you have games like Tears of the Kingdom where the highlights will occasionally get blown out beyond belief despite the game's low contrast, low saturation grade.
tl;dr most games are graded so poorly that you have lots of free real estate to work with within the standard sRGB gamut that most monitors can display 99-102% of.

Good video, by the way. VFX has been a fucking disaster for color grading.

Wilds HDR is so bad that you need mods on PC. On a TV / console it's probably better, but I don't play on console so I wouldn't know.

you crushed the colors idiot

HDR600 is a sweet spot for most people.

HDR won't fix shitty grading. Good luck fixing that on your own.

had to pirate a paywalled tool that can change the tonemapper of UE5 games

scale(5).jpg - 1000x572, 168.12K

HDR is literally just about the luminance. It's all about the dynamic range.
If you have a high dynamic range image with 8 bit color however, you're going to run into awful banding that will look like shit. That's why pretty much every HDR standard requires a wider gamut and higher bit depth.
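
A toy Python sketch of that banding argument (numpy assumed): the same smooth ramp quantized at 8 and 10 bits gives you 4x fewer steps to spread across the range, and stretching 8-bit output over a high dynamic range turns each step into a visible band:

```python
import numpy as np

ramp = np.linspace(0.0, 1.0, 100_000)  # an ideally smooth gradient
for bits in (8, 10):
    levels = 2 ** bits - 1
    quantized = np.round(ramp * levels) / levels
    print(f"{bits}-bit: {len(np.unique(quantized))} distinct steps")
# 8-bit: 256 distinct steps
# 10-bit: 1024 distinct steps
```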

Just fuck with your system brightness and game HDR settings.

The secret is, most of Anon Babble are teenagers or broke early-20s, who don't have the money to drop into luxury electronics.
When OLEDs reach a more affordable price point, say, $200, then you'll be able to have an actual discussion.

320x240

ah yes I see the difference clearly

my conception is that HDR is more than just bright/dark luminance; there's still color needed to represent that bright red, and I was wondering if a richer color space helps with that, but I don't know too much about color theory

to display "real" hdr you need a device that supports hdr10 protocol/format and hardware to display it

for example, switch 2 support hdr10 but has a normal lcd screen, so it won't even reach 400 nits, and that's what's considered as "real" hdr, thought in practice it can get only a little bit brighter than normal lcd, that's why everybody says hdr is a meme.

if you see how deck's 1000nits hdr looks then you'll easily what REAL hdr is and what it does (it's still a meme btw, but it looks very good)

yea but a lot of people are saying the reason they think HDR looks bad is because most games that have "HDR" are still using an 8-bit color space.

There's no use case for that.

This is not a comparison.
You can't see the HDR without an HDR display; the HDR picture there is just literally what it looks like, and the "SDR" picture has a filter over it to look worse.
HDR is a scammy gimmick to improve the gamut instead of just making displays with a better color gamut.

HDR, High Dynamic Range, means that a bigger range of lighter and darker colors can be shown, but not all at the same time, because it accomplishes this by brightening and darkening the display to simulate those colors, usually in sections for anything beyond a mediocre HDR display.
So it is attempting to create the possibility of displaying more colors so you can get a richer image. Do you know what else lets you display more colors?
HAVING A BETTER SCREEN WITH A DECENT GAMUT.
screens exist that can display over 90% of the real colors, so why buy a screen that does 60% and has a gimmick setup so it can sometimes display 80% of them (if it's a solid color, because it is shifting the gamut around) instead of one that can display all of those colors at the same time and doesn't rely on trickery?

HDR is dogshit, push for better color gamuts on your screens, not this sewage run off of a feature.

It would be impossible to replicate such changes with postprocessing or HSB adjustments. Look how the interior gets added detail or the gnome in front of the screen becomes visible.

video

I wasn't that aware of this because I don't watch a lot of modern television or movies, but he's completely right. The way that modern movies are produced just rapes the creative intent out of anything that isn't produced on a physical set by someone with a vision. Niggers need to stop fucking fearing crushed blacks and clipped highlights like they're literally Hitler, too.

Why do we need to see all (the detail in highlights and shadows)

This is also something I don't see said often enough as it applies to games. Games, by design, have virtually no relevant information in the darkest or brightest part of the image. By design, most games rely on contrast (whether it's a glowing item, a UI element, or the dreaded yellow paint) to direct the player's attention on rare occasions where the player needs to actually pay attention to these areas. I can boot up Dark Souls right now and be completely immersed in the experience even though the game crushes the blacks way, WAY too hard in many areas.

same anon here, i also think my monitor's HDR looks really bad because the monitor can't go up to 10bit, and when I try to run a modern game with the HDR10 color space, every color looks washed out. whereas my living room TV doesn't have that problem

HDR is a scammy gimmick to improve the gamut instead of just making displays with a better color gamut

And also a crutch for colorists and developers who don't know how to use more common SDR gamuts effectively

I can

The reality is that most people don't give a shit about color. The average consumer doesn't know about it, developers don't know shit about it except what it looks like on their own end (and who knows who set up their monitors), and artists in general, except for highly trained professional artists, don't know shit about how color works on displays.
So we all end up with mediocre products displaying mediocre products, and watching people fling shit about what is slightly less shit when actually good solutions exist is psychotic.

Good luck fixing that on your own

For better or worse, I've had to learn how to do this because every other modern video game has bad color grading, especially for SDR. It sucks.

a paywalled tool that can change the tonemapper of UE5 games

What's it called? I might want to use it for Lords of the Fallen.

I realize this every time I try to actually have a conversation with anyone about remasters. I start complaining about how the Dark Souls "remaster" completely butchered multiple effects and the lighting in many areas and their eyes just glaze over as they remind me that "I can't tell the difference, dude."
Every single thread about HDR is just further evidence that no one knows what the fuck is actually going on with display technology.

Based, thank you. I wouldn't have been able to find the Kemono myself. I do really like LotF's art direction and grading (not to mention the fact that it has proper fucking brightness, contrast, etc. settings unlike 99% of games), but I just want to mess around with it for fun.

haha all right champ

what's wrong with this if it doesn't negatively affect the image as a whole? am i supposed to be upset because i can barely see what's beneath a pile of debris next to the player character? or because the torch on the right is slightly too bright, even though the game is doing nothing with adaptation?

Based. These show off the highlights and bloom really well, especially around skin tones and reflective materials. Amazing how much better HDR shots can look when actually encoded and displayed properly, instead of getting nuked by lazy tonemapping.

90% of this thread is blatant bullshit spewed by retards who are too stupid to know that they don't know shit about what they're talking about. Holy fuck Anon Babble is infested with troglodytes.

paywalled LUT adjustment tool

Are you fucking SERIOUS
Patreon was a FUCKING mistake

Nobody tell this guy about paywalled AI "art"

Various mods are paywalled this is nothing new. The FFXIV raider mod is paywalled.

"HDR" is completely worthless. Higher colour depth isn't.

Quoting spec sheets is entirely pointless and meaningless to 99% of users.
The real test is if the average consumer is, A, going to notice, and B, if they're willing to put the extra time, effort, and money into getting that difference.

no you just have no idea what you're doing
might as well just push the vibrance in nvidia inspector to 60%

Wow, it's fucking nothing.

gimmick.png - 1920x1080, 549.76K

The funny part is that I specifically turned off HDR with my monitor and left still looks far better than right, so it's not even HDR. You (or some other kike) disingenuously washed the right screenshot to peddle a marketing gag.

Finally, somebody with a brain that even know what a color space is.
People also underestimate:

Color accuracy

Greysteps (the actual ability of the panel to differentiate brightness levels. This sucks on OLED at lower levels)

Screen uniformity

The value of having a light sensor that dynamically adjusts content based on environment light intensity and white point. Rarely is this implemented

flicker free (OLED has brightness dips that will fuck up your eyes over time)

anon thinks he solved the mystery

There is no mystery, everything in life can be traced back to Jews wanting to peddle garbage to hoard more wealth, this is nothing new

HDR adds a luminance or brightness value to the image. On a typical monitor the brightness is static at a fixed amount, with the darker tones down to black sitting below it. This is how contrast is measured: the difference between the brightest and darkest a screen can get. A bright white spot on an HDR screen might be 5 times brighter than on a conventional screen. It looks good; I like using the AI HDR filter in the Nvidia app, but the drivers are buggy with that right now. I like that Nintendo is using it on the Switch 2; in time it will become the new standard for all displays.

retard

it's the same with 30 vs 60 fps
looks exactly the same

votes for the jewest of them all

the ironing

he doesn't know

OH NO NO NO

trvke

Windows is a big part of why people think HDR sucks.

Normally windows is in SDR and it doesn't do anything, it just sends out an SDR signal and expects your monitor to sort it out. sRGB is the default SDR colourspace but most modern monitors can show colors way beyond that. This makes it so that when windows sends out 100% red, your monitor is probably showing 140% red because you don't have the monitor in sRGB mode.

So what happens when you turn on HDR in windows? Now suddenly windows decides it needs to handshake with the monitor; the monitor sends windows its EDID data, which tells windows how bright it can get and how red its red is (and blue and green). Now windows can be color managed and it translates: if sRGB red is 100% red and this monitor can show 200%, it will send out 50% red and the user will see exactly the red that sRGB red is supposed to look like.

This means that for a lot of people, they've been looking at all colors stretched out the ass into technicolor wonderland but then they turn on HDR and now your colors are correct, and look like the creator intended, but they're a lot less saturated.

There's also the contrast/gamma part where windows uses an sRGB gamma curve in HDR when 99.9% of devices target a 2.2 power law gamma curve. This mismatch causes everything to look more washed out in HDR.
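
The gamma mismatch is easy to put numbers on; here's a small Python sketch using the standard formulas (piecewise sRGB decode vs a pure 2.2 power law, nothing Windows-specific):

```python
# Near black, the piecewise sRGB curve outputs several times more
# light than a 2.2 power law, which reads as raised, washed-out shadows.
def srgb_eotf(v: float) -> float:
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def gamma22_eotf(v: float) -> float:
    return v ** 2.2

for code in (0.02, 0.05, 0.10, 0.25, 0.50):
    s, g = srgb_eotf(code), gamma22_eotf(code)
    print(f"signal {code:.2f}: sRGB {s:.5f} vs 2.2 {g:.5f}  ({s / g:.1f}x brighter)")
```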

This mismatch causes everything to look more washed out in HDR

Except actual HDR content, which doesn't use gamma curves; it uses an EOTF

Windows uses sRGB because of its wider gamut. 2.2 can result in crushed blacks, and on PC people probably care more about preserving as much detail in shadow as possible, especially for competitive games.

HDR should be banned.

And incest should be legalized?

Plenty of games support HDR, unless you're playing indie games. Most triple A releases support it. But you might have to look a little harder for developers that are truly taking full advantage of it; whether they just have it there as an option, and SDR was really the intended experience, is a little harder to work out.

Because we've had working HDR->SDR compression in games for 2 decades.

I accept your concession

It's intentional. Windows HDR prioritizes color gamut over contrast. The same goes for iOS.

Works for me.

It's easily the biggest upgrade to graphics you can get.

Picture is literally false.

Because I can see the difference on an SDR monitor. So it is a lie!

if u dont see banding in ur game u wont benefit from sdr to hdr

simple as trvth nvke

sRGB is the smallest gamut color space that's still in use. It's also the standard color space that 99.9% of media uses which is why in SDR windows just assumes you're viewing it on a sRGB calibrated monitor.

2.2 won't crush blacks, because 99% of devices use it. Windows is in fact causing everything to look washed out because nobody follows the sRGB gamma curve. They are just being retarded: it's technically correct, but they're following a standard written in the early 90's that was designed to be used on shitty CRT monitors to view excel documents in a brightly lit office. Literally every other industry has moved to 2.2 gamma as the new standard.

pic

Rec709=sRGB

proxy-image.jpg - 507x568, 61.88K

The blacks look raised way too much with HDR by default, I use a 70 contrast value in Nvidia control panel to correct this.

This, but instead of banding I can now see jpeg compression.

they probably use it for compatibility reasons, most monitors are srgb

Because it's hard to market. Proof: this very thread. You need HDR to market HDR.

You can technically show it on VP9 WEBMs on Anon Babble, but whether or not they display properly is a dice roll. Pic not related, this is just a regular SDR webm.

hdr.webm - 1280x720, 2.49M

If I remember correctly, they display in HDR on desktop HDR-enabled chromium and in Kuroba/(Blue) Clover/etc, but they show washed-out SDR on mobile chromium and on non-HDR desktop chromium.

hdr.webm - 2048x2048, 1.46M

Left HDR, Right SDR
I still can't tell a difference sometimes...

file.jpg - 4000x1848, 847.79K

using AI slop as a wallpaper

do normalfags really?

I generated it, I like it, I use it.
I don't see the problem.

I still don't know if my monitor has HDR or not and even if it did I would not know how to set it up properly

MUH LIVELIHOOD OF LE HECKING ARTISTS REEEEEEEEE

You would certainly know if your monitor had HDR. It would look like looking into a portal of an alternate universe instead of just looking like a plain old screen.

This one should be blindingly obvious. I know for a fact it works on my phone.

Search Windows HD Color Settings, if it tells you your monitor can use HDR, stream HDR video, and use WCG apps, you can use HDR.
If you don't have Windows, then no, Linux does not support HDR in any meaningful way.

test.webm - 1372x2048, 1.26M

LEAVE MY PIGSWILL ALONE REEEEEEEE

that desktop

holy normalfag

HES NOT USING MUH TROONIX MINT WITH WATERFOX REEEEEEEEEE

because things like IPS, high refresh rates and VRR are more important for videogames

this but no arrow

This is what's been killing me. I'm using an old LCD 1440p monitor, and I'm thinking I wanna do a dual monitor setup. I'm stuck between maybe wanting IPS or OLED. I keep hearing about OLED burn-in, but people say that modern OLEDs don't need to be babied as long as I'm not doing repetitive work-style tasks on them.

Not sure what to get but I wanna get a real HDR monitor at some point.

i have a 3070 and a monitor that has hdr
i don't know how to use it though
i didn't even know it existed

holy shit, i just clicked yes in windows settings and it got really dark

That one is really weird.
First there's the difference in hdr and sdr

file.jpg - 4000x1848, 493.32K

And then there's a notable difference between chrome and a video player

Took a photo with the left display with HDR enabled, and the right monitor with HDR disabled, while the image is displayed in HDR.

That's a meaningless comparison.
An SDR screen trying to display HDR content will always look like shit.
You'd need to do a comparison where the HDR image is converted to SDR on the SDR screen

chrome://flags/#force-color-profile

SDR uses gamma which just maps brightness on a scale from 0-1. This means that 100% white can be completely different depending on how bright your monitor gets.
HDR uses PQ which maps brightness to an absolute standard based on real life light levels. This means that when a game shows 300 nits of brightness, your monitor is supposed to show 300 nits exactly. This is why most monitors lock out your settings when turning on HDR.

AutoHDR and RTX-HDR take an 8-bit SDR image and try to stretch it into an HDR PQ image. This does not always work because in 8-bit 100% white is very common. This means that the white on your hud, the white in the lamp on the desk, the white in the sun and the highlight of the character's shirt are all the exact same brightness, which AutoHDR usually translates to 1000 nits, meaning retarded levels of brightness for stuff that's not supposed to be that bright. While a true HDR image has the information to say the hud is 200 nits, the lamp is 600 nits, the sun is 10,000 nits, the shirt is 300 nits. They're all white but wildly different brightness.

They also slather the image in debanding filters to hide all the stretching meaning you lose some detail.
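
For reference, the PQ curve described here is public (SMPTE ST 2084), and it's simple enough to sketch in Python; signal values map to absolute nits instead of a fraction of whatever your panel can do:

```python
def pq_eotf(signal: float) -> float:
    """Convert a PQ-encoded value in [0, 1] to absolute luminance in nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

for s in (0.25, 0.50, 0.58, 0.75, 1.00):
    print(f"PQ {s:.2f} -> {pq_eotf(s):8.1f} nits")
# roughly: 0.50 -> ~92 nits, 0.58 -> ~200, 0.75 -> ~1000, 1.00 -> 10000
```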

3.1/5

OS: Windows 11 version 22000.0 or higher

kek, it appears the uwp store hasn't improved in 10 fucking years

HDR is just snake oil to sell expensive TVs

Seconding this retarded question but unironically. Why can you do all of that to jpegs in web browsers and image viewers, but can't do it in games on the same monitor?

now do this :)

You're just viewing a normal SDR image that's faking the difference. You can't show actual HDR because actual HDR needs a special HDR monitor and a special HDR toggle turned on in windows and a special HDR encoding on the file you're trying to view.

The Windows Store is still shit, but the app works.

Because muddy ps1 graphics that make the games run smooth as butter are the future. The tariffs will ensure everyone is stuck with potato computers. You will play the 20 year old games and be happy.

Can't really see the HDR effect here

Because these examples are specifically made to "emulate" HDR on an SDR screen.
Take this one for example. That gif is meant to emulate what you see if you were to take a camera, point it at an HDR screen, and reduce its exposure. You can see some parts of the image still remain very bright. On an HDR screen, you'd see the image at a regular exposure, but the highlights would be VERY, VERY bright.
A better example (that actually uses a real camera pointed at a real HDR screen) is here: , and it demonstrates the same concept of HDR.

HDR equals needing an OLED or miniled VA, which were deprecated by cheap oleds entering the market and basically no one is releasing them anymore. Samsung developed decent VA and miniled tech, released like 4 monitors, sold the patents off and focused on QD oled and other per pixel future tech.

I remember that anon from the second example posting his HDR jpegs here

slightly brighter

HOLD
THE FUCK
UP

THIS IS THE MOST GROUNDBREAKING DISCOVERY OF
ALL
FUCKING
TIME

burn-in

But niglet I already play 20yo games and I am very happy. native 4k with real high framerates is kino as fuck.

miniled IPS works as well.

I would not buy a miniled monitor where the base contrast is 1:1000.
Imagine the fucking blooming when you have a bright highlight against a dark background.

Hey, it looks exactly the same on my HDR / SDR screen setup

Really, not a problem if it has enough zones. VAs do with 500-something zones what IPS needs more than 1k to do, but the end result is largely the same. When viewing content, of course. Blooming is very noticeable in desktop use but that is true for any miniled.

You can download a color profile to use in HDR that fixes this.

github.com/dylanraga/win11hdr-srgb-to-gamma2.2-icm

But windows will also apply the profile on real HDR content causing blacks to get crushed there.

If you're autistic you can have it on when browsing shit and turn it off when HDR gaming or movie watching, but then HDR gaming is a clusterfuck where it's a 50/50 chance whether the game will compensate for the sRGB fuckups of windows.

srgb_vs_g22.png - 994x1034, 58.5K

SDR

all the best graphical techniques in the world and the brightest we can make a scene is Microsoft Word document white

Yes, actually.

That's me. I wanna introduce the world to scene-referred painting and I intend to use my bad art to advertise it so that others will draw horrifying HDR porn that both sears my retinas with the weight of God's shame on my soul, while also literally searing my retinas in a physical manner.
files.catbox.moe/202wjb.jpg
files.catbox.moe/83j7qj.jpg
files.catbox.moe/ezl67o.jpg
files.catbox.moe/vojcck.jpg
files.catbox.moe/e9ugw9.jpg
files.catbox.moe/9qe6hp.jpg
files.catbox.moe/4xyq0m.jpg [R18]

I have an OLED monitor that can do HDR properly, and I do use it, but it's fiddly, and the industry is still a fucking mess when it comes to deciding on standards.

Far enough out into the future, enough HDR displays will be in the hands of the consumers that they can fucking settle on one standard to use and it will be the standard, but as of right now it's pretty cumbersome to use even after you buy a proper display.

xbox 360 era games must look like the ultimate ludo on the right monitor

Do you realize how much of a graphicswhore you have to be to care about graphics after the 7th gen? Most people don't give a shit about marginal improvements

Both are HDR to me on Vivaldi browser but they have different contrast and color.

Are you sure? Because I can see the bright light from the windows adjusting to the brightness of my monitor when I open the JPEG example.
It's missing from the webm example. Don't know how else to explain it.

I can see the HDR in this example

You can't actually. A JPG can't have HDR info.
Here's an exaggerated example where I maxed out the exposure on my phone.

file.jpg - 4000x1848, 710.17K

A JPG can't have HDR info.

This is a new jpeg format that iphones use, it can have HDR data. I don't know exactly how it works, but it's HDR.

Yeah, it's probably because Anon Babble doesn't actually support HDR within VP9 WEBMs and the fact it works at all is a happy accident. After all, they specifically warn about extraneous data when uploading HDR10 MP4 videos and block the upload.
As said, some browsers can render the HDR and tonemap to SDR, but other configurations just botch it completely and render the PQ data as sRGB data, making it look washed out. I don't really care about encoding my art in VP9 when I have MUCH more control when using "Ultra HDR" JPEG images. I only have a few animations, anyway, I can just catbox them. Like this one: files.catbox.moe/rbg4ey.webm [very NSFW]

That's where you're wrong, kiddo.
"Ultra HDR" JPG images, also known as Adobe Gein-mapped JPGs, or ISO 21496-1, use the metadata block of the JPEG file format to save HDR gainmap data, which, when read by compatible applications (such as Chromium browsers and Google Photos) will apply the gainmap and form an HDR image. As it sits on top of the time-tested JPG format we all know and love, it's fully backwards compatible, and all image viewers will be able to display the regular SDR image while discarding the HDR data.

You should just use the avif format for the animations.

Soulless

That's weird, it looks the same for me between chrome and firefox.
And the webm and avif files work as expected.
But I don't have an iphone and I don't know enough about this, so sure.

files.catbox.moe/rbg4ey.webm [very NSFW]

This is some niche of a niche of a niche shit. Keep it up

Does windows actually handle it well? It looks hit or miss on my tv. For instance, the last World Cup games looked awful on the fox app

Huh, didn't know you could do animations with avif. Though I suppose it makes sense considering the format is based off of AV1.

so sure.

I don't blame you, HDR is a fucking mess. My earliest "HDR-ready" work was all the way back in 2020, and I only recently started exporting them in actual HDR formats this past September

windows 11 handles it well enough for me to always have it ON

file.png - 1055x749, 193.67K

Is AutoHDR or RTX HDR any good? What about RTX Video HDR?

because real HDR doesn't exist in 95% of monitors on the market
you either need a miniled with plenty of dimming zones and high lumen capability, or an OLED; a shitty hdr400-branded monitor is a complete lie
hdr standards are also all over the fucking place, with windows being unusable garbage and plenty of games out there not having proper hdr configuration options

RTX HDR is pretty good but a bit more fiddly than AutoHDR and has a bigger impact on performance.
AutoHDR just works everywhere. RTX HDR works in some games and doesn't work in windowed games.
I just have AutoHDR enabled for everything and don't bother setting up RTX HDR anymore.

Oh yeah, and RTX Video HDR is really good. I always have it ON, and it doesn't need any additional tweaking, unlike the gaming implementation.

you need more salad doug

thank god IPS tech died and we don't have to think about 1:1000 contrast ratio displays ever again

Remember, the human eye can't see past srgb according to Anon Babble. Also: the switch is going to cause a MASSIVE uptick in "hdr is a scam" posts, because the inbuilt LCD screen is edge-lit, which makes it useless for HDR. switch 2 HDR support is intended for when it's docked, but lmao at switch users understanding this.

Fuck accuracy - I want to tonemap my srgb content all the way out of dci-p3 so HDR stays off until needed.

No and i don't care lol, give me good game design

Fun fact. Windows 11 supports HDR wallpapers in jxr format.

oh shit really? I gotta try this.

wallpapers

do niggas really

if you ain't playing a game on your computer you are wasting your life

I'll consider it, but in my experience, AVIF stills are extremely posterized on my phone, and that animated avif slows to a crawl. And my phone is a Galaxy S24, so it ain't no slouch.
They display okay on my computer, though.

nobody understands what it actually does and it requires specific hardware and software support

oooh yeah I love it when everyone's skin gets oversaturated to look like Trump

I've had real trouble with getting Steam screenshots to work with HDR, even though it's supposed to support them.

Never heard of Steam supporting HDR screenshots
You can use Nvidia's overlay to take .jxr screenshots

win+alt+prntscrn takes a screenshot with gamebar and it gives you an hdr jxr and a badly tonemapped sdr png

I think they started adding this with the OLED Steam Deck but it tends to suck on PC.

steam hdr.png - 1344x995, 118.45K

It's easily the biggest upgrade to graphics you can get.

Shill or retard? Call it.

Oh yeah that works too.

GameDVR supports HDR too.

Lack of standardization. With resolution, you change a setting and the exact same thing happens every time. With HDR, most people don't even know if their monitor does real HDR. Then you have different implementations per game that rely on devs setting it up correctly. Then you have to factor in users' monitor calibrations and whether games offer calibration tools. I have a nice OLED monitor and generally know what I'm doing, but there are still games where it looks terrible because devs implemented it incorrectly. If someone tried it once and saw that, I can definitely see them never bothering with it again. Anyways, just switching to a good high refresh rate OLED is the biggest, simplest jump you can make in general.

HDR ON
YAKUZA NIGHT LIFE ON
KINOE ON
MY DICK ON
EVERYTHING IS ON

avif

shame, jxr is a superior format

What's the deal with that? Is 10 really that bad? I don't want to "upgrade" to 11 just for HDR because I use 11 at work and it's fucking terrible. Is it just the auto HDR thing 11 has?

Hdr? All my games are hdr on playstation. Maybe stop playing on substandard pc hardware and get a real games machine.

Because it's a fake "comparison" taken with a dogshit digital camera and the image does not look like that to the human eye.

but there are still games where it looks terrible because devs implemented it incorrectly

Reminds me of GRIP combat racing. It supports HDR but basically nobody has ever made it work properly and the dev response was "we just used UE4 defaults and don't know what the fuck we are doing".

hdr is essentially broken on win10 and microsoft will never update it
get Windows 11 Enterprise, it's literally just Windows 10 but with all the latest updates

Does it reduce overhead, remove all telemetry and AI, and fix the shitty UI?

Nearly every game in the past 15 years is internally calculated in high dynamic range; it's only at the final step that it is tonemapped down to SDR or HDR. Mods like RenoDX take advantage of that and replace the final-step tonemapper with their own, making SDR-only games have perfect HDR or just fixing devs' shitty HDR implementations. Very recommended: if the game you're playing is on the supported list, you just gotta install reshade and then drag and drop the renodx file into the game folder.
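
For anyone curious what "replace the final step" actually means, here's a toy sketch of the idea (not RenoDX's real code - the constants are the standard SMPTE ST 2084 PQ values, everything else is illustrative):

```python
import numpy as np

def sdr_tonemap(scene_linear):
    """Reinhard tonemap + display gamma: what a stock SDR path does."""
    mapped = scene_linear / (1.0 + scene_linear)   # squash [0, inf) -> [0, 1)
    return mapped ** (1.0 / 2.2)

def pq_encode(nits):
    """SMPTE ST 2084 inverse EOTF: absolute nits -> HDR10 signal."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (np.clip(nits, 0, 10000) / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

def hdr_tonemap(scene_linear, paper_white=200.0, peak=1000.0):
    """Swapped-in final step: map scene light to nits, clip at the
    user's calibrated peak, PQ-encode for an HDR10 swapchain."""
    nits = np.minimum(scene_linear * paper_white, peak)
    return pq_encode(nits)

scene = np.array([0.01, 0.5, 1.0, 4.0, 20.0])  # linear scene values
print(sdr_tonemap(scene))  # highlights all crushed toward 1.0
print(hdr_tonemap(scene))  # highlights keep separation up to `peak`
```

Everything before that last step is identical in both paths, which is why these mods work on games that never shipped with HDR.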

It's not a reshade filter, it just uses the reshade engine to hook into the game.

you can debloat it with one simple script, and it actually doesn't break anything
christitus.com/windows-tool/
the UI stays shitty though

Console tard thinks it's the platform he plays on

RenoDX does some magical things for deus ex mankind divided when you get to night prague. Dat redlight district.

HDR is a meme

Ultrawide is peak gaming.

prey vision screen is peak gaming

based gacha enjoyer

OLED gaming monitors can.
you CAN afford a $2500 monitor (and do research for hours to make sure the specific one you're buying has "Readable Text" technology), right?

Need a new monitor.
What's the baseline for good HDR?

just buy multiple monitors

Like this?

file.jpg - 1848x4000, 281.24K

It's predator vision.

I have multiple monitors.

troonix vegans at it again

Good HDR is expensive mini-LED and OLED monitors.
If it's rated as HDR400 or higher (higher is better), you can consider that a decent HDR monitor.

Then go for HDR600, it's the most popular one, and it's not very expensive.

What does this mean? What's the difference between 400 and true black 400?
Is true black 400 better than 600?

Is true black 400 better than 600?

yes

you're not poor, are you?

212607.png - 1145x888, 592.64K

then why number smaller

'cuz the Trueblack standard is for oleds, the regular DisplayHDR standards are for lcds. There is more to HDR than straight up peak brightness - contrast matters, and that is an area where oleds reign supreme.
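
To put numbers on it, a back-of-envelope comparison. The black-level limits below are roughly the VESA DisplayHDR 1.1 corner-case requirements - check the spec for exact figures:

```python
# (peak nits, max black level in nits) -- approximate VESA requirements
tiers = {
    "DisplayHDR 400":           (400,  0.40),
    "DisplayHDR 600":           (600,  0.10),
    "DisplayHDR 1000":          (1000, 0.05),
    "DisplayHDR TrueBlack 400": (400,  0.0005),
}
for name, (peak, black) in tiers.items():
    print(f"{name}: {peak / black:>11,.0f}:1 contrast")
# TrueBlack 400 works out to 800,000:1 -- orders of magnitude past any
# lcd tier, which is the part the peak-brightness number doesn't show.
```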

Recently got a 27" 4K QD-OLED and it's great. 240 HZ, 36 bit color.

Funny coincidence. I just enabled it in windows again. After the first minute you realize how much better everything looks.
The main issue is that some games want you to set several parameters yourself instead of offering a ready HDR preset, and that taking screenshots is outrageously crippled. Video recording is fine but normal screenshots are so bad.
Gamebar at least saved an almost grey looking .png and a .jxr with all the colours, but you cannot open the .jxr in anything.
How is this shit still not fixed after all these years? The technology itself is great.

OLEDs can't into full screen brightness and will seriously ABL to near SDR brightness levels when dealing with bright scenes. So True Black is for them.
MiniLEDs can easily hit 1000 nits no problem but obviously suffer from MiniLED problems like blooming and discrete illumination zones.

This thing good?
AW3225QF

file.jpg - 1778x1000, 780.86K

kek

One of the top rated screens according to RTings and it is usually the cheapest of the 4k 240hz models and is one of the few with dolby vision support.

alianware

LMAAAAAAAOOOOOOOOOO

But are you just talking about auto HDR? I only use HDR for games that have proper HDR support implemented well. I don't care about auto.

I don't get it

Like come on man. Settle on a standard and fix this shit.

hdr jxr.jpg - 1148x476, 124.62K

this lmao
I have a great HDR monitor and GPU but I've had HDR off because it's so annoying
I ended up getting a laptop that has Windows 11 so it's not annoying to use and yeah it's actually good lmao

I tried Sekiro HDR the other day and almost threw up.

Alienware is a gamer brand aka retards pay premium for middling at best products.

vegan

You sound like a tendie, but no, I'm on W10 education.

The UI is kind of a deal breaker. I can't accurately express how much I hate it.

HDR10

OH NONONO AAAAAAHHHHHHHHHHH

HDR is not an on/off toggle and there's no proper standards set.
Different displays can range from good to practically unusable in terms of HDR. Plenty of people don't have proper HDR displays.
On PC, on Windows, it's not the best experience. You have quite a bit of setting up to do. And SDR-to-HDR conversion is not always the best thing. AutoHDR/RTX HDR don't always work well.
Don't get me wrong, the end result can be brutal under the right type of display and settings, but the autism needed to get some games to look right... is not pleasant given the amount of money you already pay for the technology. Should be practically handed to you rather than having to fuck around with RenoDX and Special-K and all this fucking nonsense.

It has HDR10+ retard

That's an encoding standard you retard.

alianware is a shitty brand and they don't manufacture their own panels
buy an AOC, they own an actual megafactory that ships panels to other brands

The problem is that HDR is so heavily variable depending on users’ displays that it becomes more of a nuisance than anything
It’s good when games automatically apply the system settings’ HDR like some first party Sony games, because otherwise you’re stuck in a calibration menu for ages

alienware

LMAOOOOOOOOOOOOOOOO

Neat, thanks

wouldn't buy it now. other monitors got firmware updates to fix their gamma curves at least a little, dell fucking fired their firmware team

hdr10+ is not a real hdr you fucking retards lmao

Only two companies make oled panels - LG and samsung.

people judging alienware on their reputation for shitty prebuilts 15-20 fucking years ago

Bunch of poorfags in this thread coping that they don't have the kino called HDR. Use a 4k HDR OLED on PC, it will blow your mind.

alienware

i have tears in my eyes lmao

It supports dolby vision, but I guess that isn't real HDR either, no?

Did you not hear the news? switch 2 is gonna have an HDR screen, i'm sure it will take off from there

If I'm on a ~$500 budget, what's the best HDR I can hope for?

stuck in a calibration menu for ages

wat?

own 1000 nit peak monitor

turn hdr on in game

set game to 1000 nit peak

wow, I'm a technical genius

Is buying a QDOLED + an IPS a good plan for working + gayman? the QDOLED can be a secondary screen while I work.

The switch 2 uses a normal ass LCD screen, the hdr is going to look like shit and everyone is going to assume that's what all hdr is.

dolby vision

hdr10++

Holy shit, how did these companies manage to confuse consumers so much?
Only HDR400 certification and up means that a monitor supports actual HDR.
Dolby Vision, HDR10, whatever, is not an HDR certification.

The inbuilt lcd of the switch 2 is going to upset a lot of people when they turn on HDR lol.

Gamebar at least saved an almost grey looking .png and a .jxr with all the colours, but you cannot open the .jxr in anything.

Color Management's hard, man. People don't truly know how absolutely bullshit even regular SDR color is, with things like Gamma curves and all that. Then you have HDR, and it's an entirely different encoding structure that's supposed to be universally compatible with all manner of HDR displays but also SDR but nothing works well because every popular file type is SDR only but we can hacksaw HDR data into them but then how does the computer know if it's HDR PQ data or not because what if it just assumes it's SDR and then it displays it as this washed out grey blob and
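
Here's the grey blob problem in numbers. Minimal sketch: standard PQ constants, and a naive viewer that assumes gamma 2.2 on a 100-nit SDR display:

```python
import numpy as np

def pq_signal(nits):
    """SMPTE ST 2084 inverse EOTF: nits -> [0, 1] signal."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (np.asarray(nits, float) / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

for nits in [1, 10, 100, 1000]:
    s = float(pq_signal(nits))
    shown = s ** 2.2 * 100   # wrongly decoded as gamma 2.2 SDR
    print(f"{nits:>5} nits -> signal {s:.2f} -> displayed as ~{shown:5.1f} nits")
# 10-nit shadows come out around 7 nits, 1000-nit highlights around
# 53 nits: everything collapses toward the middle. Flat grey blob.
```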

It's HDR10 on an IPS screen, aka fake HDR.

HDR400

real HDR

fair enough, but i mean, we're not talking about generating fake frames here. i don't mind fake HDR. all it appears to do is enhance contrast and color vibrance/saturation. if i can simulate it at even 50% effectiveness with just a ReShade, that's good enough for me.

Yes, most OLEDs are HDR400, and it's a real HDR.

Have you seen how colorless the switch 2 games are looking?
Practically requires hdr now.

Why would u want more? Ull see enough dust on your glasses without any HDR at normal brightness level.

most OLEDs are HDR400

No they aren't. TrueBlack 400 =/= HDR400. You can even read the spec sheet and see the abomination that is HDR400 for yourself.

Isn't HDR10 just good enough?

You do realize that most OLEDs are only rated HDR 400 because it's True Black 400.

If only you could read, oh well.

Honestly after watching monitors unboxed testing their OLED for burn in, I just use mine for production. I just make sure to do the panel protect when it wants, and I minimize any programs I'm not actively using so it's just my pitch black wallpaper on screen with the taskbar hidden.

Most oleds that people buy are HDR400

Yet another filter is considered an "upgrade in graphics"

OLED HDR400 is a real HDR, yes.

I'm not going to argue with (You) anymore anon, your idiocy might be contagious and I can't risk it.

Everyone realized that HDR is a meme because everyone has an HDR display on their phones these days and realized that they don't care about most of the use-cases but now have to deal with inevitable burn-in from AMOLED screens.

Lol you fuckin retard.

HDR is kino
OLED is God
anyone claiming otherwise is either poor, retarded, gay, or coping (or any combination of the 4)
still most games implement HDR like shit, but that's the devs fault. when it's done properly it's amazing

It is okay to admit you are wrong, you know.

HDR is not just about how bright your monitor is
most people don't want to burn their eyes out with constant 1000 nits blasting their retina

Which games even have good HDR? The only decent one I've played was Doom Eternal, but I hate DE so I stopped playing.

yeah but any other type of monitor with HDR400 can only display 400 nits maximum at any time. OLEDs can go way higher than that on small highlights. So they're both better and worse than the HDR400 spec.
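
A toy model makes the tradeoff obvious. The numbers below are made up but typical-looking (reviews publish real measurements at 1/2/10/25/100% windows):

```python
# illustrative OLED ABL curve -- not any specific panel's measurements
oled_peak = {1: 1000, 2: 1000, 10: 650, 25: 450, 50: 330, 100: 250}

for window, nits in oled_peak.items():
    lcd = 400  # an HDR400 lcd holds roughly flat brightness at any window
    print(f"{window:>3}% window: OLED ~{nits} nits vs HDR400 LCD ~{lcd} nits")
# Small highlights: OLED wins big. Full-screen brightness: the lcd wins.
# That's why the two certifications can't be compared by number alone.
```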

I am guessing that the LCD uses an edge-lit form of HDR, which is complete dogshit.

The First Descendant does it well. In fact I think the game was designed around it as the game looks like utter shet with a ton of lost details in SDR mode.

still most games implement HDR like shit, but that's the devs fault. when it's done properly it's amazing

Yet you will not accept this argument for ray tracing.

You're right. That is why FALD is just a marketing scam and not needed for HDR.

OLED monitor

no thanks, last thing I want are browser tabs burned into my monitor

HDR basically lets you see way more colors on your screen. If you watch this video on a regular display, right at the beginning you won't even be able to read the white letters on the pink sign. On normal displays the sign looks kinda washed out white, but the reflection on the ground shows up as pink. That shit doesn't happen with HDR because the display actually has access to more colors to show you. Anyone calling it a meme or whatever is coping or just lying.

youtu.be/SiryvrStb8E?si=UzGkVkOnwerNCzCk
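
The sign effect is just channel clipping, which you can show in a few lines. The values here are made up; the mechanism is the point:

```python
import numpy as np

pink_sign = np.array([3.0, 0.6, 1.8])   # linear RGB of a bright neon sign

# SDR: anything above 1.0 is gone; the ratio between channels (the hue)
# gets destroyed, so the sign turns washed-out pastel/white
print(np.clip(pink_sign, 0.0, 1.0))       # [1.  0.6 1. ]

# HDR: e.g. a 400-nit ceiling over a 100-nit paper white = 4x headroom,
# so the channels keep their ratio and the sign stays saturated pink
headroom = 4.0
print(np.clip(pink_sign, 0.0, headroom))  # [3.  0.6 1.8]

# The dim reflection on the ground never exceeded 1.0 in the first
# place, which is why it stayed pink even in SDR while the sign blew out.
```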

so HDR is just fancy marketing for 10bit color?

OLEDs are disgustingly dim in HDR

My OLED monitor is so bright in HDR that it sometimes hurts my eyes. Do you have cataracts or glaucoma or something?

Anything not Korean or Chinese or made by shitty western/jp devs?

It looks gooder

Why

Because light brighter

I am so sick of snake oil being sold.

10bit colour is part of it but not the whole thing. HDR is complicated.
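
Quick sketch of why the transfer function matters as much as the bit depth. The PQ constants are the standard ones; the comparison point is illustrative:

```python
import numpy as np

def pq_to_nits(signal):
    """SMPTE ST 2084 EOTF: [0, 1] signal -> nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    p = np.asarray(signal, float) ** (1 / m2)
    return 10000.0 * (np.maximum(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

# 8-bit SDR, gamma 2.2 on a 100-nit display: step between codes 20 and 21
sdr_step = ((21 / 255) ** 2.2 - (20 / 255) ** 2.2) * 100  # ~0.04 nits at ~0.4 nits

# 10-bit PQ at the same shadow level (~0.4 nits is PQ signal ~0.107)
code = round(0.107 * 1023)
hdr_step = pq_to_nits((code + 1) / 1023) - pq_to_nits(code / 1023)  # ~0.009 nits

print(f"8-bit gamma step: {sdr_step:.4f} nits")
print(f"10-bit PQ step:   {hdr_step:.4f} nits")
# ~5x finer shadow steps despite covering 100x the luminance range.
# The same 10 bits spread linearly over 0-10000 nits would give ~10-nit
# steps and band horribly; PQ packs codes where the eye is sensitive.
```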

I'm tired of tech illiterate third worlders shitting up the board.

can't play in borderless windowed if I turn HDR on

A deal breaker desu.

What game are you trying to play that does this?

HDR is just so much easier on a console and TV

More like old man who can barely see a difference. I miss actual bumps in performance instead of the bullshit we have today

I like the little side street part at 27 minutes.

I don't get it. If it's HDR10+, how do I know if it's HDR400 or 600, etc.?

SDR bros ... not like this.

Because, like all modern things, it's a gimmick. We're still using 1080p 60 HZ and we like that. Let me guess, you need more? Nothing in the gaming hobby requires HDR. It's just like 4K, no one can tell the difference.

Just get an HDR capable OLED and it will do everything.

Those are technically two different things.
HDR10+ (and dolby vision) are encoding specs that support dynamic metadata.
DisplayHDR400, DisplayHDR1000 and DisplayHDR TrueBlack400 are display standards of the physical screen.

it did years ago. I refuse to play vidya that don’t support proper HDR.
t. LG OLED CHAD

Bait or actual third world child?

woled

Enjoy your colours washing out anytime you actually want the screen to get bright.

We're still using 1080p 60 HZ and we like that.

Speak for yourself, I've been on 2K 120Hz for nearly 10 years and recently moved up to 4K 120Hz. I have a 4090 and I make damn sure to get good use out of it.

It's just like 4K, no one can tell the difference.

This is pure cope. See a doctor about your cataracts, the world will look much more vibrant when you get those removed.

werks on my C1

White subpixel says hai, welcome to volumetric collapse desu ne

If you are chasing meme tech, you are a "jeet". Most of the PC gaming world is happy with 1080p SDR.

You cannot tell the difference. You just think you can because otherwise you'll get buyer's remorse.

Just eat shit bro - after all 10 billion flies are happy with it so why aren't you?

Sorry I can’t read your text over my lcd clouding

I upgraded only after experiencing the difference and realizing how big of a difference it makes. Not everyone is blind and tasteless.

my issue is the text fringing. i bought a G9 OLED and returned it because the text unironically gave me a headache with the strongass fringing.

chasing meme tech

I bought a 360hz OLED. The colors, responsiveness and motion clarity in games are amazing. 1080p 60hz on an LCD is the absolute bare minimum, not something you should aspire to.

HDR is kino

OLED is God

anyone claiming otherwise is either poor, retarded, gay, or coping (or any combination of the 4)

absolutely based and truthful
poorfags get out

I got one of the high end MSI monitors and the text seems fine to me. No headaches even after a full day of production work.

Why is hdr in photos the exact opposite of hdr in video games?

hdr_tutorial.jpg - 450x274, 41.31K

HDR was great on my old TV because it made the image much brighter and I could crank the colour saturation up higher without it becoming over saturated. But with my current TV it gets brighter in SDR for some reason so I have HDR disabled.

G9 was WOLED, to be fair. But it did traumatize me a bit; it was supposed to be the highest end monitor in the market.

This, I have a non-HDR monitor and it's often too bright, especially when you get dark scenes with sudden flashes of light. I don't see why I'd want it brighter.

I assume it's just slopcattle getting dazzled by the shiny lights, same reason modern movies usually have a bunch of bright flashing VFX. I think it causes some sort of deer in headlights effect for retards.

only companies and studios can afford this shit for production

wonders why consumers complain that their end product looks like shit on an average monitor

I don't really care about fps above 60 either.

Ah, mine is QDOLED.

Good to know. Maybe I'll just give it a go and return it again if my eyes meme on me.

Why hasn't this taken off in gaming?

I think 95% of games that I have played since 2021-2022 have an HDR option by default. you can also force it with RTX HDR from the Nvidia profile inspector if you want.
Since the Switch 2 will have HDR by default, I bet this will become the standard for games very very soon.

HDR isnt a new technology, have you been living under a rock?

It is new to nintendo and that is all that matters.

Why wouldn't HDR work in an image? It's an image already.

Ah, just like open world games were invented by BotW.

It's the opposite for me actually. HDR looks right.

Why can't you just put in your monitor's colour gamut and get correct colours for all content?

That means your monitor doesn't support HDR.

can't afford good tech

calls everything slop

Peak Anon Babble right there.

That's what ICC color profiles do, but you need to find the correct one for your monitor, and windows won't apply it to all of windows; it's only used in color managed programs, which luckily includes most browsers.

Windows 11 24H2 also added Auto Color Management, which I think gets enabled automatically, so this might not be as big of an issue as it has been in the past 20 years.
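
For reference, a matrix/TRC profile transform boils down to something like this. The matrices are the published sRGB and Display P3 (D65) primaries, rounded; it's a sketch, not a full ICC engine (no LUT profiles, no chromatic adaptation beyond the shared D65 white point):

```python
import numpy as np

SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
XYZ_TO_P3 = np.array([[ 2.4935, -0.9314, -0.4027],
                      [-0.8295,  1.7627,  0.0236],
                      [ 0.0358, -0.0762,  0.9569]])

def srgb_decode(v):   # sRGB signal -> linear light
    v = np.asarray(v, float)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def srgb_encode(v):   # linear light -> display signal
    v = np.clip(v, 0, 1)
    return np.where(v <= 0.0031308, v * 12.92, 1.055 * v ** (1 / 2.4) - 0.055)

def srgb_to_p3(rgb):
    """Linearize, route through XYZ, re-encode for the P3 panel."""
    return srgb_encode(XYZ_TO_P3 @ SRGB_TO_XYZ @ srgb_decode(rgb))

print(srgb_to_p3([1.0, 0.0, 0.0]))
# -> roughly [0.92, 0.20, 0.14]: sRGB red sits inside P3, so a managed
# pipeline backs the channels off instead of slamming the panel's much
# redder primary to 100% -- which is exactly the oversaturation you see
# on an unmanaged wide-gamut monitor.
```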

I thought this video seemed to tackle it well enough in 5 minutes
youtube.com/watch?v=frBNnNWEbyk

Haven't seen it in months but in a nutshell the problem was that most games don't implement it well.

My eyes are good and I'm sensitive to this stuff so I'm pretty confident that it's fine. Also mine is one of the 1440p high refresh rate models.

HDR isn't about brightness. It's about contrast

sdr picture looks like the left if I turn my sdr monitor brightness up

I had this app.....but now it says I need to update, fucking hell I got win 10 LTSC and it's updated, what is wrong with them?

No it doesnt. Turning up the brightness makes the darker parts of the image too bright.

Poorfags/third worlders

It's everywhere in console games. If you're not looking at poverty TVs, it's not hard to find a decent HDR TV these days either. Seriously, any midrange TCL or Hisense 120Hz native miniLED from last year will get the job done well enough in 95% of all scenes as soon as you turn the lights in your room on (like you should for any display, to avoid eye strain). PC gaymers get shafted because, ironically, they're the easiest to dupe when it comes to displays. So they get sold monitors that are 2-3 years behind TVs in every way outside of refresh rates + they get stupidly overcharged. But also, Windows is kind of a disaster that's needlessly complicated to get HDR working properly.

Your vision or your monitor sucks, they're very different.

muh brightly lit room

Enjoy washing everything out.

TVs and Monitors have had contrast sliders for 20+ years anon.

You have no idea what you're talking about.

Ive had an HDR monitor on my PC for years. Its old tech. Its just typical poorfag cope from people who buy the cheapest office monitors, or offbrand chink trash.
PC also has multiple auto HDR modes you can apply to older games. Consoles cant do that.

HDR is a movie specification and assumes content will be consumed in a dark room.

low quality bait, try harder

You know you can hook up a PC to a TV screen too?

HDR has existed for years, the problem is that quality HDR displays that can maintain brightness when they need to haven't been easily obtainable until a few years ago. Now, basically anyone can afford one if they really want to. But the monitors are still living in the past, as TVs have heat dissipation and light output advantages. MiniLED TVs have more real estate for more LEDs without needing to shrink them, making density easier.

You really do not understand what you're talking about.

What's ironic is that OLED isn't even bright enough to actually do HDR. It's also sad that this is the garbage that the industry is pushing because they can't develop better TVs.

You think it's amazing because you stopped using your old burnt out TV and got a brand new one, retard.

HDR was literally designed for poorfags to buy a new TV. If you actually had money you would skip the shitty smart TV and just get a 4k laser projector.

Yes, and most people buy a monitor anyway, because "That's what you're supposed to do!" and "Muh Framerates!" Never mind the fact that aiming for over 120fps is still difficult and exponentially more expensive, to the point it's genuinely not worth it due to diminishing returns. And then you lose a bit anyway if you're going to go with frame generation and upscaling. Display technology in a broad sense, is crap.

HDR can't work in handheld mode because the screen is an LCD. This is how stupid the whole HDR faggotry scene is. Did they even settle on a proper HDR format? Last I remember they had HLG, HDR10+, HDR10, Dolby Vision, and some others.

You're an idiot. External light sources wash out screens. Having low ambient light is good if you're working and reading, but for movies and games you want a dark room to minimize light actually hitting your screen. You're the one who doesn't know what you're talking about. Now spout some nonsense about dark rooms hurting your eyes because they need to readjust when you look away from the screen (something you aren't doing) because you want to pretend you understand biology.

So, hdr is a new buzz word and excuse for devs to make unfun games

I play PC on a 55" TV without smart features. I don't think I can go back to some overpriced "gaymer" monitor since I don't give a shit about anything above 1080p or 60fps.

Remember when you visit a cinema they leave the lights on full bright when the movie starts.

Can someone explain why in some games the HDR calibration does literally nothing? Is it being overridden by Win11's HDR configuration?

'cuz it is the wild west of adoption. Some games adhere to the OS level calibration, others just say fuck it and offer their own calibration.

mfw RDR2 with HDR

muh feminism: the game

high dynamic meme

*yawn*

based zoomer

years and thousands of explanations later i still don't understand hdr

spidercum.png - 539x407, 348.01K

8 bit: u cant see trannys dick because its hidden in shadow
10 bit: u can subtly see trannys dick because theres more color data

I care about gameplay.
I think all graphics faggots should be lynched.
OP. I want to tie your feet to the back of a cybertruck and drive.

i wish kcd2 had HDR, man

White balance is the same in both

which means nothing in this discussion

It has on PC.