Where did this HDR thing come from all of a sudden? Does it mean the contrast/black-level problems of LCDs are solved? Is it actually outputting greater brightness and dynamic range or is it just slightly jacked up contrast and saturation or something?

Just sounds like software marketing lingo to me.
Waiting for someone who knows details

In this case, HDR refers to using a different encoding with 10 or 12 bits per sample instead of the usual 8. The color space is also wider than the traditional Rec. 709 RGB, and the luminance range is expanded, so you get deeper blacks and brighter highlights, for instance.
That's for the protocol changes. For the TVs themselves, the panel technology is different to accommodate the change in color space and brightness.
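
A minimal sketch of what the extra bits buy you, assuming a plain linear 0.0-1.0 encoding purely for illustration (HDR10 really uses the non-linear PQ curve, but the point about finer steps is the same):

# Quantization step size at different bit depths, assuming a linear 0.0-1.0
# signal purely for illustration (HDR10 actually uses the non-linear PQ curve).
def step_size(bits: int) -> float:
    """Smallest representable difference between two code values."""
    return 1.0 / (2 ** bits - 1)

for bits in (8, 10, 12):
    print(f"{bits}-bit: {2 ** bits} levels, smallest step = {step_size(bits):.6f}")
# 8-bit: 256 levels, smallest step = 0.003922
# 10-bit: 1024 levels, smallest step = 0.000978
# 12-bit: 4096 levels, smallest step = 0.000244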

Consumers can't visually tell the difference in Hz; plenty of them even use the SD signal when they have both the broadcast and the devices that can manage full HD.
This is just another buzzword that normies can use to make their poorer counterparts jealous.

>HDR

It's bass boost for the eyes.

I made this OC for the tards on Cred Forums

they still didn't get it

This is the closest I've seen anyone come. HDR is an attempt to give your monitor the same range of brightness as your eyes. When you look at the horizon, you can see the very bright sky and, at the same time, plenty of detail in darker spots. When you take a picture with any consumer camera, the issue everyone is familiar with is that you can expose for the bright sky or for the shadow detail, but not both at once. Monitors have the same problem. If you use HDR cameras with an HDR screen, the result is a wider dynamic range, and thus a better ability to resolve detail in both bright and dark areas on the same screen.
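
To put rough numbers on "range of brightness": dynamic range is usually counted in stops, i.e. doublings of brightness between the darkest and brightest level a device can handle. The figures below are illustrative assumptions, not measurements of any particular panel:

import math

def stops(peak_nits: float, black_nits: float) -> float:
    """Dynamic range in stops: doublings of brightness between black and peak."""
    return math.log2(peak_nits / black_nits)

# Made-up but plausible numbers, for illustration only:
sdr_panel = stops(peak_nits=300.0, black_nits=0.3)     # ~10 stops
hdr_panel = stops(peak_nits=1000.0, black_nits=0.05)   # ~14 stops
print(f"SDR-ish panel: {sdr_panel:.1f} stops, HDR-ish panel: {hdr_panel:.1f} stops")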

10 bit color

so... how can we tell if our TVs have HDR support?

>High dynamic range
>The range of colors is high, and very dynamic

Was it on the box? That's your answer.

Thanks guys. I should have been clearer in the OP: I meant HDR as a TV technology specifically. I've been into computer graphics/photography for a long time so I have a pretty good understanding of high dynamic range. I was just kind of blindsided because the first time I heard of it was when they announced the PS4 Pro. I didn't even realize there were TVs already on the market that supported some kind of HDR.

Now I'm wondering how, and how well, it actually works. How is HDR content captured and authored? I would imagine the only footage that is legitimately HDR would be that captured in RAW formats by higher-end cameras from RED, Sony, Canon, Panasonic, Blackmagic, etc., and of course games and CG. All other formats/codecs are too limited in bit depth to capture high dynamic range.

What about HDR on the PC side? Obviously you will need an HDR monitor (which I hear will be coming out in 2017), but what else needs to have explicit support for it? Games, applications, the operating system (just 10-bit support?), the graphics card? Let's say I take a photo outdoors with a nice camera. It has a sensor with good dynamic range. In the jpg, the sky is blown out because I exposed for the ground, but I know my RAW file has preserved all the details in the sky. Now if I have a fancy shmancy HDR monitor, what would it take for the monitor to properly display the full dynamic range of a RAW photo?
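
For what it's worth, on the TV side HDR10 encodes absolute light levels with the SMPTE ST 2084 "PQ" curve instead of a gamma, so the display end of the pipeline looks roughly like the sketch below. Treating a RAW pixel as an absolute luminance in nits is a simplification I'm assuming just for illustration:

# Sketch: map linear luminance (cd/m^2, "nits") to a 10-bit PQ code value, as
# HDR10 does per SMPTE ST 2084. Constants are from ST 2084; pretending a RAW
# photo pixel is an absolute luminance is an assumption for illustration only.
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_encode_10bit(nits: float) -> int:
    """Encode absolute luminance (0..10000 nits) as a 10-bit PQ code value."""
    y = max(0.0, min(nits, 10000.0)) / 10000.0
    e = ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2
    return round(e * 1023)

for nits in (0.005, 1, 100, 1000, 10000):
    print(f"{nits:>8} nits -> 10-bit code {pq_encode_10bit(nits)}")

Roughly speaking, every link in the chain (application, OS, GPU driver, cable, monitor) has to agree to carry that 10-bit signal through instead of clamping it back to 8-bit sRGB.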

Monitor technology is hard to appreciate without actually having seen it. Having lived at 1080p for ages, I had always thought 1440p was just a meme until I walked into a computer store and saw the massive amount of screen real estate.

>I meant HDR as a TV technology specifically. I've been into computer graphics/photography for a long time so I have a pretty good understanding of high dynamic range.

HDR "photography" is actually a hack called local microcontrast, which compresses multiple images (captured at different exposure) into a single one by looing at different parts of theimage and making them selectively brighter or darker, increasing overall contrast at the expense of accuracy. For example, with a HDR-modified image you'd no longer be abe to look at two separate pixels or areas of the screen, and compare their brightness to discern the ACTUAL difference in brightness when the photo was captured.

HDR screens instead are *actually capable of displaying a larger contrast range*. The dark parts are visually darker and the bright parts are visually brighter, because the screen (and video format) has additional contrast to work with. You no longer need local microcontrast tweaks to get the HDR effect because you can discern contrast in both the light and dark areas.
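
A toy example of that difference, purely illustrative: Reinhard's simple global operator stands in here for the fancier local-contrast tricks, and the scene luminances are made-up numbers:

# Why a tone-mapped "HDR photo" loses the real brightness relationship between
# pixels. Reinhard's global operator stands in for the fancier local methods;
# the scene luminances are made-up numbers.
def reinhard(l: float) -> float:
    """Compress relative scene luminance into 0..1 for an SDR screen."""
    return l / (1.0 + l)

sky, shadow = 50.0, 0.5                      # the sky is really 100x brighter
print(f"real ratio: {sky / shadow:.0f}x")
print(f"ratio after tone mapping: {reinhard(sky) / reinhard(shadow):.1f}x")  # ~2.9x
# An HDR signal on an HDR screen preserves far more of that ratio on-screen,
# so the compression trick isn't needed in the first place.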

My TV is from before the buzzword became a marketing factor.

How old is it?

2014, but it's a Sony

According to Wikipedia, it is very unlikely.

I was just about to pull the trigger on an MG279Q. Does this mean I need to wait for 1440p 144Hz FreeSync HDR panels, or will I be able to sleep without having that tech in my monitor?

is it seriously just a 10-bit display standard?

More or less, truer colors and a higher luminance range.

>HDR "photography" is actually a hack
It's a good way to mimic HDR, considering it's for your average daily use and not professional photography.

HDR10 Media Profile is the full standard, which among other things specifies that the display uses the new wider color space and 10-bit depth.
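
For completeness, HDR10 also carries static metadata about the mastering display and content brightness (ST 2086 plus MaxCLL/MaxFALL). A sketch of what that amounts to; the field names are descriptive rather than the spec's exact syntax elements, and the numbers are just plausible examples:

from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    # Chromaticities (CIE xy) of the mastering display; Rec. 2020 primaries below
    red_primary: tuple
    green_primary: tuple
    blue_primary: tuple
    white_point: tuple
    max_mastering_nits: float   # peak luminance the content was graded against
    min_mastering_nits: float
    max_cll: float              # brightest single pixel in the stream (MaxCLL)
    max_fall: float             # brightest average frame in the stream (MaxFALL)

example = HDR10StaticMetadata(
    red_primary=(0.708, 0.292), green_primary=(0.170, 0.797),
    blue_primary=(0.131, 0.046), white_point=(0.3127, 0.3290),
    max_mastering_nits=1000.0, min_mastering_nits=0.0001,
    max_cll=1000.0, max_fall=400.0,
)
print(example)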

Can someone explain how LDR data translates to HDR?
For example, apparent gray was 128 in LDR. If you make a game HDR, will that gray still be 128, or will it now be 512?

Probably just multiplies each color by 4, in the hardware.

LDR is 2^8 values = 0-255 per channel
HDR is 2^10 or 2^12, so 0-1023 or 0-4095 per channel
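
To the question above: a plain multiply by 4 maps 128 to 512 as guessed, but it maps 255 to 1020 instead of 1023, so one common trick is bit replication, which does reach full scale. A sketch, assuming the transfer curve stays the same (a real SDR-to-HDR10 conversion would also remap through the PQ curve):

# Two ways to widen an 8-bit code value to 10 bits. This keeps the same
# transfer curve, which a real SDR-to-HDR10 conversion would not.
def naive_x4(v8: int) -> int:
    return v8 * 4                     # 255 -> 1020: never reaches full scale

def bit_replicate(v8: int) -> int:
    return (v8 << 2) | (v8 >> 6)      # repeat the top bits: 255 -> 1023

for v in (0, 128, 255):
    print(f"{v:>3} -> x4: {naive_x4(v):>4}, bit-replicated: {bit_replicate(v):>4}")
# 0 -> 0 / 0, 128 -> 512 / 514, 255 -> 1020 / 1023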