During the 2012 holiday season, I began seeing large-screen 4K TVs in retail displays (typically in a high-end theater room). The first one that I could inspect closely was at a Sony store in a factory outlet mall in Winthrop MA. That was on Black Friday. Just a month later, I saw several displays with more compelling content at ABT, the mega-super-retailer with just one location in Glenview IL.
If 4K catches fire, the sourcing of high-resolution content is not in doubt. 4K has been a production and archival standard for Hollywood studios since shortly after the advent of digital content creation. And, of course, studios can always transfer directly from their vast warehouses of legacy films. (At about 2000 lpi, the 35 or 70mm film used in the making of Hollywood movies for the past 75 years has a theoretical resolution roughly halfway between HDTV and 4K, depending in large part on lighting conditions. Digital IMAX is arguably the pinnacle of mainstream theater technology. It is projected at roughly 4K x 2K, or about 8 million pixels.)
But is home theater 4K TV relevant?
In the 1990s, I was briefly co-chair of the National Coalition for HDTV Research & Policy. The path to HDTV standards was torturous, spanning display technology, broadcast standards, and the requisite PC convergence.
I am a resolution junkie. For entertainment, I crave a big, beautiful theater experience. For PC work, I want a desktop with many open windows or pages—resplendent with microscopic detail. I want lines and characters that pop out with enhanced acutance. In the 90s and early 2000s, my friends were satisfied with VGA (640×480) or SVGA (800×600). I demanded XGA (1024×768). When laptops shifted to widescreen, I held out for WUXGA (1920×1200). Now, I have a 1080p notebook. It is the convergence standard. But it is not the ultimate consumer display. In fact, I crave the newest Samsung Book 9 Plus, which offers 3,200 x 1,800 pixels packed into a 13.3-inch display. That’s almost 6 megapixels!
The NTSC standard lasted more than 50 years. It took two decades to make the market transition to HDTV. Today, 1080p is the de facto standard for both PC and TV displays, although most HD TV content is transmitted at a still respectable 720p. But do we want or need another standard with four times the pixels?
As a resolution junkie, I can firmly answer the question: Nah… It is simply not worth it, even if the technology cost rapidly drops to par.
Watching TV is very different from viewing PC page content, which tends to be filled with text but is mostly static. With video, motion over time creates a rich experience. In fact, the “psychological bandwidth” of TV viewing is a product of pixels and frame rate. In my opinion, with HD—especially at 1080p—the human mind is maxed out. At this point, auditory and tactile input become more important than attempts to increase resolution beyond 1080p.
At whatever distance you find comfortable (say 2.5 feet from a 24″ display, 9 feet from a 50″ display, or 15 feet in a home theater with a 110-inch screen), adding resolution to a moving image beyond 1080p is detectable only when you get so close to the screen that you are no longer enjoying the experience. For this reason, HDTVs under 20″ don’t even bother to support 1080p unless the display is also intended to accommodate connection to a PC.
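For readers who want to check the arithmetic, here is a quick back-of-the-envelope sketch (my own illustration, not drawn from any standard). It assumes roughly one arcminute of visual acuity for normal 20/20 vision and asks: at what distance does a single pixel shrink below what the eye can resolve? Plugging in the screen sizes above shows why extra pixels go unseen at normal viewing distances.

```python
import math

def max_useful_distance_ft(diagonal_in, horiz_px, vert_px, acuity_arcmin=1.0):
    """Farthest viewing distance (in feet) at which adjacent pixels can still
    be distinguished, assuming ~1 arcminute of visual acuity (20/20 vision)."""
    aspect = horiz_px / vert_px
    width_in = diagonal_in * aspect / math.sqrt(1 + aspect ** 2)  # screen width
    pixel_pitch_in = width_in / horiz_px                          # one pixel
    acuity_rad = math.radians(acuity_arcmin / 60.0)
    # A pixel subtends one arcminute at the limiting distance:
    # pitch = distance * tan(acuity)  =>  distance = pitch / tan(acuity)
    return pixel_pitch_in / math.tan(acuity_rad) / 12.0          # inches -> feet

for diag in (24, 50, 110):
    d_1080 = max_useful_distance_ft(diag, 1920, 1080)
    d_4k = max_useful_distance_ft(diag, 3840, 2160)
    print(f'{diag}": 1080p pixels vanish beyond ~{d_1080:.1f} ft; '
          f'4K adds visible detail only inside ~{d_4k:.1f} ft')
```

For a 50-inch screen, the 1080p pixel grid already disappears at roughly six to seven feet; a 4K grid pays off only within about three feet of the glass.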
In my opinion, taking films beyond 1080p adds nothing to the experience (or at least, a severely diminished return), and yet it adds tremendously to the cost of storage and transmission.
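To put a rough number on that cost (my own illustrative figures, not data from any provider): even after compression, 4K streams typically run at several times the bitrate of 1080p. The snippet below estimates the storage footprint of a two-hour film at assumed bitrates of about 5 Mbps for 1080p and 20 Mbps for 4K; actual values vary with codec and service.

```python
# Back-of-the-envelope storage for a 2-hour film at assumed, illustrative bitrates.
HOURS = 2
for label, mbps in (("1080p", 5), ("4K", 20)):
    gigabytes = mbps * 3600 * HOURS / 8 / 1000  # megabits/s -> gigabytes
    print(f"{label} @ {mbps} Mbps: ~{gigabytes:.1f} GB")
```

By that rough math, a single 4K film consumes about four times the storage and bandwidth of its 1080p counterpart.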
Of course, in the end, industry standards are becoming marginalized. 4K will probably come upon us with or without a federally sanctioned standard, thanks to multi-sync monitors and the flexible nature of graphics cards and microcode. Today, resolution—like software—is extensible. Cable service providers can pump out movies at whatever resolution they like. The set-top box at the other end will decode and display films at the maximum resolution of a subscriber’s display. The role of government in mandating an encoding standard is diminished, because most viewers no longer tune in to public airwaves. FCC turf is generally restricted to broadcast standards.
Am I often reluctant to adopt bleeding-edge technology? Far from it! This opinion is brought to you by a committed resolution junkie. But I do have a few exceptions. Check out my companion piece on consumer 3D TV technology. Spoiler: both 4K and 3D are rare exceptions to my general tendency to push the proverbial envelope!
Ellery Davies is a privacy pundit and editor of AWildDuck. He is a frequent contributor to The Wall Street Journal. He is also a certified techno-geek with ties to CNet, Engadget & PC World.