I have to admit, I’m a fan of high-resolution imagery. I was an early adopter of HD at home and haven’t watched an SD channel in years. I’ve also seen Super Hi-Vision demonstrated in big theaters at trade shows for many years, and it is truly breathtaking. Last week at the IBC, there was a lot of talk about Ultra-High Definition TV (UHDTV) as a broadcast standard. For a summary of what UHDTV is, feel free to check out my previous post.
But after looking at UHDTV on a smaller screen, I couldn’t help but wonder: who actually needs Ultra-High Definition TV? Answering this question took a bit longer than I had hoped, but at this point I will just say it…
UHDTV is overkill.
This is not to say that UHDTV has no merits, nor that today’s HD can’t be improved upon. But if I were in charge, I would focus the industry on doing HD right, not on adopting UHDTV. High-quality 1080p HD running at 60 frames per second would be a joy to watch compared to the highly compressed MPEG-2 garbage my cable provider currently charges me a premium for. The difference between today’s HD experience and “HD done right” would be easy for anyone to appreciate. But the difference between great HD and UHDTV would not be easy to see, if you could even perceive it at all!
In terms of spatial resolution, I’ve done the math and I encourage you to do the same. Click here to see my analysis in gory detail. I could go on about the limits of human visual perception, but suffice it to say that regular old 1080p HD has enough pixels that you can’t see them at a normal viewing distance. Yes, this means that if Apple were trying to sell you a TV (which may happen soon), your existing 1080p HD TV would qualify as a retina display at the viewing distance recommended by SMPTE. So unless you are one of those people who like to sit uncomfortably close to the screen, you won’t even be able to tell the difference in spatial resolution between 1080p and UHDTV.
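If you want to sanity-check the reasoning without the gory detail, here is a quick back-of-the-envelope sketch in Python. It assumes normal visual acuity resolves about one arcminute (roughly 60 pixels per degree) and that you sit where a 16:9 screen fills a 30-degree horizontal viewing angle (the figure commonly associated with the SMPTE recommendation, roughly 3.2 picture heights); the 50-inch screen size is just an example.

```python
import math

# Back-of-the-envelope "retina" check. Assumptions (mine, for illustration):
# normal acuity resolves ~1 arcminute (~60 pixels per degree), and the viewer
# sits where a 16:9 screen fills a 30-degree horizontal viewing angle
# (roughly 3.2 picture heights). The 50-inch screen size is just an example.

ACUITY_PPD = 60.0  # ~1 arcminute per pixel

def angular_pixel_density(diag_in, rows, distance_in):
    """Pixels per degree of visual angle for a 16:9 screen viewed at distance_in."""
    height_in = diag_in * 9 / math.hypot(16, 9)
    pixel_in = height_in / rows  # size of one pixel, in inches
    pixel_deg = math.degrees(2 * math.atan(pixel_in / (2 * distance_in)))
    return 1 / pixel_deg

diag = 50.0                                          # 50-inch 16:9 screen
width = diag * 16 / math.hypot(16, 9)                # ~43.6 inches wide
distance = width / (2 * math.tan(math.radians(15)))  # 30-degree viewing angle, ~81 inches

for label, rows in [("1080p HD", 1080), ("4K UHDTV", 2160), ("8K UHDTV", 4320)]:
    ppd = angular_pixel_density(diag, rows, distance)
    verdict = "already past" if ppd >= ACUITY_PPD else "short of"
    print(f"{label}: {ppd:.0f} px/deg ({verdict} the ~60 px/deg acuity limit)")
```

By this rough math, 1080p already lands just past the acuity limit at that distance, so the extra pixels of 4K and 8K mostly go to waste unless you sit much closer or buy a much bigger screen.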
UHDTV faces other challenges. Although UHDTV televisions are now being released by major manufacturers, there is currently no UHDTV content available for consumers to watch. There are only four UHDTV cameras in the world today. And until more efficient codecs like HEVC are developed and implemented, UHDTV may actually look worse than HD. If you’ve seen what HD looks like on cable, imagine what the compression artifacts would look like if the codec were trying to describe 16 times more pixels!
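To put “16 times more pixels” in concrete terms, here is a rough calculation of the uncompressed data rates involved. The 60 fps, 10-bit, 4:2:0 parameters are my own illustrative assumptions, not anything a broadcast standard mandates.

```python
# Rough uncompressed video data rates. Assumptions (illustrative only): 60 fps,
# 10-bit samples, 4:2:0 chroma subsampling (1.5 samples per pixel).
def raw_gbps(width, height, fps=60, bit_depth=10, samples_per_pixel=1.5):
    return width * height * fps * bit_depth * samples_per_pixel / 1e9

for name, w, h in [("1080p HD", 1920, 1080),
                   ("4K UHDTV", 3840, 2160),
                   ("8K UHDTV", 7680, 4320)]:
    print(f"{name}: {w * h / 1e6:.1f} Mpixels/frame, ~{raw_gbps(w, h):.1f} Gbps uncompressed")
```

The channel all those bits have to squeeze through isn’t getting 16 times bigger, so without a dramatically better codec, something has to give.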
Beyond the technology challenges, UHDTV will also face adoption challenges. When consumers are given a choice between convenience and quality, they almost always choose convenience. Ten years ago, consumers could have adopted high-resolution audio (remember SACD and DVD-A?). Instead, they chose lower-quality MP3s because of the convenience. Likewise, although pro audio gear now supports 24-bit audio production at a 192 kHz sampling rate, no one I know bothers with it. Audio has long since passed the “good enough” threshold, and this same phenomenon is now playing out for video. To me, 8K is to video what 192 kHz is to audio: one step too far. Most consumers are happy with HD broadcast in its current, crappy state. And HD broadcast is already under threat from over-the-top content, most of which is streamed at resolutions lower than HD.
There’s no doubt that high-resolution cameras are a boon to content creators, giving them more flexibility to tell the best possible visual story. I will also say that 4K makes sense for theaters with very large screens. But as a broadcast standard for the home, UHDTV just doesn’t add enough value for consumers over and above HD, especially when HD is done right. I know it would not be sexy to say, “after ten years, we’ve finally figured out how to do HD!” but that would be an advancement worth paying for!