When are HDR monitors going to be a thing?

After watching my first UHD Blu-ray on a new 4K TV, I know this is something I need.

Other urls found in this thread:

newegg.com/Product/Product.aspx?Item=N82E16824236399

Just use your 4K TV as a monitor; some 4K Korean monitors support full 10-bit color as well.

i was thinking of buying one today, came here for some input. no one is talking about it...

hey dude what's the difference? i watched a UHD Blu-ray and didn't see a difference

He is talking about HDR, which you'll need a special monitor for.

>HDR
What. You mean IPS or AMOLED?

Don't expect the mouth breathing consume retards on Cred Forums to know what dynamic range is.

Hopefully never. We don't need another TV industry meme feature to impose more HDCP.

Yeah i have a 4k tv

I don't understand why you would need it, unless your camera is so shit that you need your monitor to make up for it. Isn't the lighting in movies good enough to not rely on this shit? Am I supposed to start recording videos with retard exposure, upload them to youtube, and then require all users to have an HDR monitor to get the full experience?

4k ≠ HDR

HDR monitors are already a thing, it's called every fucking monitor.

The whole idea is to get the experience of watching the movie as close to reality as possible (apparently, I'm not a fan of this idea). Dynamic range is one of the biggest shortcomings of current display technologies. The thing is that our brains intuitively adapt to that, so it's not as apparent as e.g. resolution.

t. retard

The "HDR" TVs they're going to shit out are just going to use software to add "HDR" effect to the video. All the TVs and monitors already display a high dynamic range if it is supplied in the video.

I think you don't know what dynamic range is and what "HDR" refers to. You are mixing up shit. Apparently this user was right after all.

Whenever it stops being a UltraHD Blu-ray exclusive feature.
It is pretty fucking mindblowing how good it is.

The best way to picture it is viewing a photo of a sunset on a regular monitor: it looks dull because the sun is only as bright as the monitor's white. On an HDR TV, the sun looks bright as fuck while the rest of the detail remains at a normal brightness. This makes a much bigger difference than the jump from FHD to QHD on a TV.
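To make the clipping concrete, here's a toy Python sketch with made-up luminance numbers (an 80-nit beach, a 250-nit sky, a 10,000-nit sun); anything above the panel's peak just clips to white:

scene_nits = {"sand": 80, "sky": 250, "sun": 10_000}

def clip_to_display(scene, peak_nits):
    # Anything brighter than the panel's peak just clips to peak white.
    return {name: min(nits, peak_nits) for name, nits in scene.items()}

print(clip_to_display(scene_nits, peak_nits=300))    # SDR-ish panel: sky and sun both land near white
print(clip_to_display(scene_nits, peak_nits=1_000))  # HDR panel: sun still clips, but it's 4x brighter than the sky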

I understand what dynamic range is because I deal with it in photography.

It's not something for you to worry about on the display side. Our displays do just fine.

This sounds like AMOLED compared to LCD, is that what HDR is?

Or you could, you know, make the filmmaker put effort into editing the scene so that detail is not lost.
The iPhone did this for you with pictures 5 years ago.

Basically it's just crazy contrast ratio coupled with more sophisticated brightness control information embedded in the video file.
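For the curious, the "brightness control information" in HDR10 is static metadata: the SMPTE ST 2086 mastering display color volume plus MaxCLL/MaxFALL. A rough Python sketch of what that payload holds; the values below are placeholders, not from any real disc:

from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    """Rough sketch of HDR10 static metadata (ST 2086 + content light levels)."""
    red_primary: tuple[float, float]      # CIE xy chromaticity of the mastering display primaries
    green_primary: tuple[float, float]
    blue_primary: tuple[float, float]
    white_point: tuple[float, float]
    max_mastering_luminance: float        # nits
    min_mastering_luminance: float        # nits
    max_cll: int                          # brightest single pixel in the stream, nits
    max_fall: int                         # brightest frame-average light level, nits

# Placeholder values for a typical P3-mastered title (illustrative only).
example = HDR10StaticMetadata(
    red_primary=(0.680, 0.320), green_primary=(0.265, 0.690),
    blue_primary=(0.150, 0.060), white_point=(0.3127, 0.3290),
    max_mastering_luminance=1000.0, min_mastering_luminance=0.005,
    max_cll=1000, max_fall=400,
)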

The official HDR spec for LCDs requires a black level below 0.05 nits and a peak brightness of at least 1000 nits. No IPS LCD meets those requirements, so all HDR LCD TVs use VA-type panels.
For OLEDs, black levels are effectively zero, so peak brightness only has to reach at least 540 nits.
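If you want to sanity-check a panel against those two tiers, here's a minimal Python sketch. The LCD numbers are the ones quoted above; the OLED black-level threshold (0.0005 nits) is the published UHD Alliance figure rather than literally zero:

def meets_hdr_spec(peak_nits: float, black_nits: float, panel: str) -> bool:
    # Thresholds per the UHD Alliance "Premium" tiers described above.
    if panel == "lcd":
        return peak_nits >= 1000 and black_nits <= 0.05
    if panel == "oled":
        return peak_nits >= 540 and black_nits <= 0.0005
    raise ValueError(f"unknown panel type: {panel}")

print(meets_hdr_spec(1000, 0.04, "lcd"))   # True: VA LCD with decent local dimming
print(meets_hdr_spec(1000, 0.10, "lcd"))   # False: typical IPS black level is too high
print(meets_hdr_spec(600, 0.0004, "oled")) # True: OLED tier trades peak brightness for blacks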

People who buy 4K TVs are idiots who love to throw away money. 4K is a half-baked meme like LaserDisc and Betamax. The REAL deal is 8K Super Hi-Vision. That's what I'm waiting for.

Why should people buy a screen with four times the pixels of a screen whose pixel density is already higher than it should be at any realistic size?

Because 8K looks FAR better than 4K

How can more pixels look better when I can't even see those pixels?
Do you buy 4k phones too?

>calling monitors with a high bit depth "HDR"

Using this picture as an example, it should look about the same, except with the sun being 10x brighter.

You sound like the 1080p tards who said "you cant see more detail than 1080p" and "4k is only useful for 80 inch tvs." Meh.

I don't think 1080p is enough even for 23-inch monitors. But 4K is enough for 50-inch TVs even at close range.

You make it very clear that you DON'T understand what it is. Believe me, our displays could do a LOT better. Compared to reality, display resolutions these days seem okay, but the dynamic range is a joke.

So like this?

That's not true. The sun should look exactly the same, but you should be able to see detail on the sand

So like this?

You wouldn't get a reasonable opinion on this board, where 4K is considered a meme because of the high price. I own a high-end 12-bit quantum dot Sammy with all the memes, and I agree it's a clear upgrade.

To answer your questions: HDR rendering (HDRR) is going to be released in games, but panel failure rates are still so high that Amazon at one point even temporarily banned their sale because of the sheer number of returns from consumers playing the panel lotto.

So that definitely enters the equation: market adoption at the consumer TV level, where the high-end features still need to trickle down.

Pushing further, 4K and HDR will only be accepted when bandwidth and the corresponding content storage can reasonably be supported by households without it costing an arm in gigabit streaming and petabytes of storage.

In terms of the gaming scene, 4K 120 Hz can only be implemented once 8K 60 Hz is democratized, and you'll always have waitfags, but one thing is for sure: you gotta have fun. Rockin' 4K 60 fps with Dolby rocks.

movie making

Both are true. The whole picture gets brighter, the dark details are not crushed into the low values, and the sun gets glaringly bright. In short, it's closer to reality. You don't need to compress the dynamic range beforehand to fit it into the tiny range of a display; the eye/brain manages that. It will provide a much more immersive visual experience.

Aren't Samsung's "Quantum Dot" TVs just LCD TVs with backlight clusters controlled by Quantum dots? I still preferred the picture from a Sony Bravia and LG OLED compared to Samsung's new display.

The whole point of HDR is that you can't reproduce it on a regular display. You are just lacking the brightness.

The current way around it is to compress the dynamic range, so that dark regions are shifted to higher values overall. It kind of mimics what the eyes and brain do to cover the enormous dynamic range we are able to see (we can see just fine on a bright summer's day, but we can also resolve things at night when only a few thousand photons are reaching the retina at all, which is really impressive).
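That compression step is basically tone mapping. A minimal Python sketch using the classic global Reinhard operator (my choice of operator for illustration, not something anyone in the thread specified):

def reinhard_tonemap(luminance_nits: float, white_nits: float = 300.0) -> float:
    # Global Reinhard: L / (1 + L), with L normalized to the display's white level.
    # Maps an unbounded scene luminance into [0, 1) for an SDR display.
    l = luminance_nits / white_nits
    return l / (1.0 + l)

# Shadows get lifted relative to highlights: a 10,000:1 scene ratio
# collapses to roughly 290:1 in the output.
for nits in (1, 100, 1_000, 10_000):
    print(nits, round(reinhard_tonemap(nits), 3))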

Sounds like you're explaining HDR photography. HDR for TVs is a bit different, the contrast is just huge without looking unnatural.
It's something you just have to see in person because there's not a single computer/phone display, web browser or picture viewer that could display this HDR content at the moment.

>the whole picture gets brighter!
We already have a brightness setting.

You're talking about dynamic range but you're describing exposure.

Quantum dot is just a layer applied over the whole panel, with the corresponding nanotech to allow higher nuance in the color gamut; it's a 12-bit panel even if barely any content is encoded in 10-bit.
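For scale (plain arithmetic, not from the thread): bit depth just sets how many code values each channel gets, which is what the 8/10/12-bit distinction is about.

# Code values per channel at each bit depth: 2 ** bits.
for bits in (8, 10, 12):
    print(f"{bits}-bit: {2 ** bits} levels per channel, {(2 ** bits) ** 3:,} total colors")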

When you're running the native panel, basically you still get some saturation and you can fine-tune it any way you want.

I think you're referring to edge-lit local dimming. The high-end Samsungs have zones that are independently lit, which allows HDR, and the top-of-the-line series have FALD (full-array local dimming), where backlight zones behind the whole panel are independently controlled.

On the other hand, LG OLEDs push their own tech with organic pixels that allow deeper blacks, but it's such a new tech that latency doesn't always follow; in terms of picture quality, though, I'd say it's superior.

You gotta take into account the electricity consumption as well. At the moment I'd say there's nothing wrong with high-end local dimming on the Sammys, and the LGs still have potential to improve. Sony is good but nearly went bankrupt recently; all panels are produced by Samsung/LG.

Brightness is very limited so far. Dynamic range is the range of values you can output on a display. On current monitors the darkest value can be actual black (OLED), but the brightest one is not particularly bright. So that is the limiting factor of dynamic range on displays atm.
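To put rough numbers on that (illustrative figures, not measurements): dynamic range is just the ratio between the brightest and darkest level a panel can show, quoted as a contrast ratio or in stops (doublings).

import math

def dynamic_range(peak_nits: float, black_nits: float) -> tuple[float, float]:
    # Returns (contrast ratio, stops); stops = log2 of the ratio.
    ratio = peak_nits / black_nits
    return ratio, math.log2(ratio)

print(dynamic_range(300, 0.30))   # typical SDR IPS monitor: 1,000:1, ~10 stops
print(dynamic_range(1000, 0.05))  # HDR LCD at the spec floor: 20,000:1, ~14.3 stops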

No

The problem with your example is that we have this wonderful thing called a pupil which actually governs this.
When it is dark it gets wider, allowing more light to enter, and when it is bright it gets smaller so that not as much light enters.

This is why we aren't blinded during the day nor blind at night.

What high dynamic range really refers to is how we are able to resolve details in both dark and bright areas at the same time. With most cameras the dynamic range is fairly compressed, since it is difficult for imaging sensors to expose for a high range of lighting variation.

This is actually the best and most understandable explanation of what the HDR feature in displays is.

The main problem with most of our displays now is that even if they can reach the 1000-nit brightness level, the black level would be much higher than 0.05 nits.

It's the ability to combine dark blacks with rich bright colors.

>The problem with your example is that we have this wonderful thing called a pupil which actually governs this.
This is simply not correct, or only in part. While the pupil does adapt to different brightness levels, it is not the main reason for our incredibly high dynamic range. The dominant mechanism is embedded directly in the chemistry of our retinas.

yet we're still stuck with 23.976fps

When everyone gets a PS4. It's the only thing pushing HDR content at this point. PCfags are all talking shit out of their mouths and saying 10-bit is the same thing.

>implying it works the same way for videos
off yourself

I do want a nice 2K 144hz HDR monitor.

The current UltraSharps are dropping in price but they're still expensive.

I have this one in sight but it costs 1200 USD where I live. newegg.com/Product/Product.aspx?Item=N82E16824236399

It's a standard for luminance range and 10 bit color.
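Concretely, the luminance encoding behind HDR10 is the SMPTE ST 2084 "PQ" curve, which maps absolute luminance up to 10,000 nits into those 10-bit code values. A Python sketch from memory of the published formula; treat it as illustrative rather than a reference implementation:

def pq_encode(nits: float) -> float:
    # ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> signal in [0, 1].
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = min(max(nits, 0.0), 10_000.0) / 10_000.0
    return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

# Quantize to a full-range 10-bit code value (real video uses limited range, kept simple here).
for nits in (0.05, 100, 1_000, 10_000):
    print(nits, round(pq_encode(nits) * 1023))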

It's not so much about the displays as about the standards: the old ones target displays with a low brightness range and served more as a transition for content from CRT to LCD screens.

They have different standards, but both can be HDR. OLED will have better blacks, IPS will BURN YOUR RETINAS with the highlights.

Why isn't 4K actually called 2K and 8K called 4K?

Everyone says "1080", not "1920"; it makes more sense.

Digital movie projectors have higher dynamic range than broadcast monitors. HDR is moving video displays to approach digital cinema. Films are mastered (and their dynamic range reduced) when they come out on Blu-ray or TV.

t. Idiot

Only a digital cinema projector (with the xenon lamp running to standard, or laser illumination) can display HDR content (or a 4K HDR display, which are few and far between).

Not professional movie camera sensors like those on a RED or Alexa. Those capture 10+ stops on the sensor. But since no display so far can show all that information, lighting on set and grading in post are done to bring the dynamic range to an acceptable range that also brings an artistic effect. The dynamic range is bigger on a movie theatre than in a non-HDR video display.