High Dynamic Range, explained: There’s a reason to finally get a new TV
By Sam Machkovech from Ars Technica
Ever since the HDTV standard emerged in the mid-’00s, screen makers have struggled to come up with new standards that feel anywhere near as impressive. That’s been a tough sell, as no baseline image standard has yet delivered a quality jump to rival the leap from CRT sets to crisp 1080p panels.
3D content came and went, with its unpopularity owing to a few factors (aversion to glasses, hard-to-find content). The higher-res 4K standard is holding up a little better, but its jump in quality just doesn’t move the needle for average viewers—and certainly not those sticking to modestly sized screens.
But there’s another standard that you may have heard about—high dynamic range, or HDR. It’s a weird one. HDTV, 3D, and 4K have all been easy to quickly and accurately describe for newcomers (“more pixels,” “one image per eye,” etc.), but HDR’s different. Ask an average TV salesperson what HDR is, and you’ll usually get a vague response with adjectives like “brighter” and “more colorful.” Brighter and more colorful than what, exactly?
Yet HDR may very well be the most impactful addition to modern TV sets since the 1080p jump. Images are brighter and more colorful, yes, and in ways that are unmistakable even to the untrained eye. Content and hardware providers all know it, and they’ve all begun cranking out a wow-worthy HDR ecosystem. The HDR difference is here, and with this much gear and content on the market, it’s officially affordable (though certainly not bargain-bin priced yet).
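To put “brighter” in concrete terms: HDR10 content is encoded with the SMPTE ST 2084 “PQ” transfer function, which maps the video signal to absolute luminance up to 10,000 nits, while standard dynamic range is mastered around a roughly 100-nit reference. The minimal Python sketch below compares what the same signal level works out to on each curve; the gamma-2.4, 100-nit SDR model is a rough assumption for illustration, not a description of any particular display.

    # SMPTE ST 2084 ("PQ") EOTF constants -- the transfer curve used by HDR10.
    M1 = 2610 / 16384          # 0.1593017578125
    M2 = 2523 / 4096 * 128     # 78.84375
    C1 = 3424 / 4096           # 0.8359375
    C2 = 2413 / 4096 * 32     # 18.8515625
    C3 = 2392 / 4096 * 32     # 18.6875

    def pq_to_nits(signal: float) -> float:
        """Decode a normalized PQ code value (0.0-1.0) to absolute luminance in nits."""
        n = signal ** (1.0 / M2)
        return 10000.0 * (max(n - C1, 0.0) / (C2 - C3 * n)) ** (1.0 / M1)

    def sdr_to_nits(signal: float, peak: float = 100.0, gamma: float = 2.4) -> float:
        """Rough SDR comparison: a plain gamma curve scaled to a ~100-nit reference display."""
        return peak * signal ** gamma

    for s in (0.25, 0.50, 0.75, 1.00):
        print(f"signal {s:.2f}: SDR ~{sdr_to_nits(s):7.1f} nits | HDR (PQ) ~{pq_to_nits(s):7.1f} nits")

At 75 percent signal, the SDR curve sits around 50 nits while the PQ curve is already near 1,000 nits, which is the kind of headroom that makes HDR highlights pop.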
If HDR still has you (or your local retailer) stumped, fear not. Today, we’re breaking down the basics of high dynamic range screens: what exactly differentiates them, how good they are, and whether now is the time to make the HDR leap. And as a bonus, we’ll answer a bunch of questions about various screens and compatible content along the way.