During the 2012 holiday season, I began seeing large-screen 4K TVs in retail displays (typically in a high-end theater room). The first one that I could inspect closely was at a Sony store in a factory outlet mall in Winthrop, MA. That was on Black Friday. Just a month later, I saw several displays with more compelling content at ABT, the mega-super-retailer with just one location in Glenview, IL.
If 4K catches fire, the sourcing of high-resolution content is not in doubt. 4K has been a production and archival standard for Hollywood studios since shortly after the advent of digital content creation. And, of course, studios can always transfer directly from their vast warehouses of legacy films. (At about 2,000 lpi, the 35mm and 70mm film used in the making of Hollywood movies for the past 75 years has a theoretical resolution roughly halfway between HDTV and 4K, depending in large part on lighting conditions. Digital IMAX is arguably the pinnacle of mainstream theater technology; it is projected at 4K × 2K, about 8 million pixels.)
But is home theater 4K TV relevant?
In the 1990s, I was briefly co-chair of the National Coalition for HDTV Research & Policy. The path to HDTV standards was torturous, spanning display technology, broadcast standards, and the requisite PC convergence.
I am a resolution junkie. For entertainment, I crave a big, beautiful theater experience. For PC work, I want a desktop with many open windows or pages, resplendent with microscopic detail. I want lines and characters that pop out with enhanced acutance. In the 90s and early 2000s, my friends were satisfied with VGA (640×480) or SVGA (800×600). I demanded XGA (1024×768). When laptops shifted to widescreen, I held out for WUXGA (1920×1200). Now I have a 1080p notebook. It is the convergence standard. But it is not the ultimate consumer display. In fact, I crave the newest Samsung ATIV Book 9 Plus, which packs 3,200 x 1,800 pixels into a 13.3-inch display. That's almost 6 megapixels!
The NTSC standard lasted more than 50 years. It took two decades to make the market transition to HDTV. Today, 1080p is the de facto standard for both PC and TV displays, although most HDTV content is transmitted at a still-respectable 720p. But do we want or need another standard, one with four times as many pixels?
As a resolution junkie, I can firmly answer the question: Nah… It is simply not worth it, even if the technology cost rapidly drops to par.
Watching TV is very different from viewing PC page content, which tends to be filled with text but is mostly static. Over time, motion creates a rich experience. In fact, the “psychological bandwidth” of TV viewing is a product of pixels and frame rate. In my opinion, with HD (especially at 1080p) the human mind is maxed out. At this point, auditory and tactile input become more important than attempts to increase resolution beyond 1080p.
At whatever distance you find comfortable (say, 2.5 feet from a 24″ display, 9 feet from a 50″ display, or 15 feet in a home theater with a 110-inch screen), adding resolution to a moving image beyond 1080p is detectable only when you get so close to the screen that you are no longer enjoying the experience. For this reason, HDTVs under 20″ don’t even bother to support 1080 lines unless the display is also intended to accommodate connection to a PC.
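The geometry behind this claim is easy to check. Here is a minimal sketch, assuming 16:9 screens and the oft-quoted one arc-minute resolving limit of 20/20 vision; the function name and the threshold are my assumptions, not anything from a standard:

```python
import math

def pixel_arcmin(diagonal_in, distance_ft, horizontal_px, aspect=16/9):
    """Angle (in arc-minutes) subtended by one pixel of a 16:9 screen
    at the given viewing distance."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # screen width
    pitch_in = width_in / horizontal_px                      # one pixel's width
    radians = 2 * math.atan(pitch_in / (2 * distance_ft * 12))
    return math.degrees(radians) * 60

# The comfortable setups quoted above, all at 1080p (1920 pixels across):
for diag, dist in [(24, 2.5), (50, 9), (110, 15)]:
    a = pixel_arcmin(diag, dist, 1920)
    print(f'{diag}" at {dist} ft: {a:.2f} arc-min per pixel')
```

By this rough model, the 50″ and 110″ setups come out under one arc-minute per 1080p pixel, so finer pixels would go unseen; only the desktop case (2.5 feet from a 24″ panel) lands above the threshold, which squares with the desire for more resolution in PC work but not in the living room.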
In my opinion, taking films beyond 1080p adds nothing to the experience (or at best, yields severely diminished returns), and yet it adds tremendously to the cost of storage and transmission.
Of course, in the end, industry standards are becoming marginalized. 4K will probably come upon us with or without a federally sanctioned standard, thanks to multi-sync monitors and the flexible nature of graphics cards and microcode. Today, resolution, like software, is extensible. Cable service providers can pump out movies at whatever resolution they like; the set-top box at the other end will decode and display films at the maximum resolution of a subscriber’s display. The role of government in mandating an encoding standard is diminished, because most viewers no longer tune in to the public airwaves. FCC turf is generally restricted to broadcast standards.
Am I often reluctant to adopt bleeding edge technology? Far from it! This opinion is brought to you from a committed resolution junkie. But I do have a few exceptions. Check out my companion piece on consumer 3D TV technology. Spoiler: Both technologies are limited exceptions to my general tendency to push the proverbial envelope!
Ellery Davies is a privacy pundit and editor of AWildDuck. He is a frequent contributor to The Wall Street Journal. He is also a certified techno-geek with ties to CNet, Engadget & PC World.
I received comments and feedback on the above article in a GoogleTV discussion at LinkedIn, and I wish to clarify my position. In fact, 4 out of 5 readers disagree with my assertion that 4K is an unworthy technology enhancement. They raise valid issues, but just to clarify my opinion…
Like many others, I am a resolution fiend. I crave visual information and clarity, especially in documents, still photos, and desktop space. Even beyond my own visual acuity, I would like the ability to magnify an image or document indefinitely.
But it is my belief that with television and film, there is a limit to the visual experience that makes further resolution somewhat pointless. I do not claim a) that 4K is expensive, b) that it will ultimately fail to dominate the market, or c) that it has no value in the production and archival community. Rather, I am observing that it brings no significant consumer enrichment, and that with widespread deployment of 1080p, there are better fish to fry…
The reason that moving pictures differ from documents, photos, and desktops is that the human experience is a product of both resolution and motion (e.g., frame rate and change). In radio engineering and information processing, I liken it to the gain–bandwidth product (GBP), a well-understood concept. Together, 1080p at 120 fps saturates the visual senses. And at a comfortable viewing distance for entertainment and drama, it is indistinguishable from higher resolutions. (Again, we are talking about examples with motion, not still frames that were cut and enlarged, like the Avatar comparison shown in my original post.)
So, while I very much covet that new Samsung ATIV Book 9 Plus (it packs 3200×1800 pixels!), I will not spend $1 more to get a TV that goes beyond 1080p, unless models lacking the higher resolution also lack some other application or feature. That dollar would be better spent on a decent subwoofer, on adding a backlight (to reduce eye strain), or on blackout treatment for the room (which improves contrast by limiting light splashing back from the walls). In fact, thundering bass, contrast, and black level are critical factors in an immersive, exhilarating entertainment experience.
Ellery’s original post was qualitative. Carlton Bale added quantitative data based on the physiology of the human eye…
Terry Eh, a commenter on Carlton’s Blog, sums it up with a simple rule of thumb:
Incidentally, although there exist several slightly different standards, the new “Prosumer 4K” typically refers to 2160p: four full-HD screens arranged two across and two high.
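For concreteness, the arithmetic behind that 2×2 description can be checked in a couple of lines (a sketch, taking “Prosumer 4K” as UHD, 3840×2160):

```python
# UHD "4K" (2160p) versus full HD (1080p), per the 2x2 description above
full_hd = (1920, 1080)
uhd_4k = (3840, 2160)

hd_pixels = full_hd[0] * full_hd[1]
uhd_pixels = uhd_4k[0] * uhd_4k[1]

# Exactly twice the width and twice the height...
assert uhd_4k[0] == 2 * full_hd[0] and uhd_4k[1] == 2 * full_hd[1]
# ...so exactly four full-HD screens, two across and two high.
assert uhd_pixels == 4 * hd_pixels
print(f"{uhd_pixels:,} UHD pixels = 4 x {hd_pixels:,} HD pixels")
```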
OK, readers: it has been just over a year since I advised against buying a 4K television.
Retractions at AWildDuck are uncommon. I cannot recall completely reversing my opinion on another issue. But this is such a situation.
In the past, I supported Carlton Bale’s calculations. Based on the physiology of the eye and retina, it seemed reasonable that 4K only mattered when sitting ridiculously close to a television, too close to enjoy a theater experience. But this week, I have changed horses. With 50″ UHD TVs at just $399, all bets are off. Even if you rarely sit close to the television, I say “Go for it!” My comment about the cost trade-off appears below.
Separately, I am beginning to suspect that a demonstration of visual acuity is not the supreme test of resolution enjoyment. Let’s say that the average person cannot discern a pattern tighter than 0.3 arc-minutes. But acutance perception goes beyond visual perception as measured by a subtended angle. Acutance is the subjective perception of sharpness related to edge contrast. It is the reason that humans peering through a microscope can distinguish when two hairlines cross paths, even if the width of the converging hairs is considerably below the range of human perception. This may have to do with wave refraction and the way in which a Moiré pattern is interpreted on the retina and in the brain.
Similarly, it is the basis of the vernier scale printed on calipers. The user looks at the millimeter lines of two sliding measurement scales that oppose each other. One scale is slightly stretched, so the measurement units do not match up. The user reads the calipers by finding the pair of opposing lines that form one smooth, longer line. The technique yields highly precise and consistent readings, yet the lines themselves are too small to provide an accurate reading without the acutance trick.
Just as with edge enhancement in a photo (it makes the image “pop”), I am beginning to believe that 4K may provide tangible benefits beyond the classic observation of a tiny arc angle.
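To put the hypothetical 0.3 arc-minute figure in perspective, here is a back-of-the-envelope sketch (the threshold itself is the assumption under debate, and the function name is mine; a 16:9 panel is assumed):

```python
import math

def max_useful_distance_ft(diagonal_in, horizontal_px, threshold_arcmin, aspect=16/9):
    """Farthest viewing distance (in feet) at which a single pixel still
    subtends at least `threshold_arcmin` of visual angle."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # screen width
    pitch_in = width_in / horizontal_px                      # pixel pitch
    threshold_rad = math.radians(threshold_arcmin / 60)
    return pitch_in / threshold_rad / 12   # small-angle approximation

# A 50-inch UHD panel (3840 pixels across), per the retraction above:
print(f"{max_useful_distance_ft(50, 3840, 0.3):.1f} ft")
```

Under that tighter threshold, a 50″ UHD pixel remains at the edge of visibility out to roughly eleven feet, an entirely ordinary living-room distance, which is far more generous than the one arc-minute model allows.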
And then there is the issue of cost. Here too, I was wrong. This week, TigerDirect is selling a 50″ UHD (4K) TV with decent specs for just $399. Since this post will outlive the current market, let me point out that this is less than ¼ of the market price a year ago, and about ¼ the cost of a smaller 42″ HDTV just 3 or 4 years ago.
With the advent of cheap and ubiquitous Netflix dongles from Google, Roku, and Amazon, I don’t care about so-called Smart TV features. What I do care about is contrast, motion index, sound, and black level. If these things are on par with the major brands, then we have only one question to face: can the eye discern the tight-grain pixels of a 4K TV? As we discuss above (and as Carlton Bale has explained in detail), very high resolution is only discernible at a close distance…
…But this ridiculously low cost skews my past arguments. Even if you rarely sit close enough to enjoy the additional 6.8 million pixels, I say “Go for it!” At this price, all bets are off.
As far as I am concerned, enjoyment of a movie or TV show depends on the following, in this order: content, sound, and then visuals.
Let’s face it, these days media offerings suck. I don’t care how big your video is, how deep your resolution, or how many sound channels you have; a movie or TV show that sucks will still suck.
It has been proven that better sound is more important to how much you enjoy a program than video. So I’d rather spend money on properly decoded 7.1 than a bigger screen.
Referring to John Casey’s comment: all of these things (content, video quality, sound quality) are personal and subjective. Since they are qualitative components of “enjoyment,” it is unlikely that one could demonstrate the claim that “better sound is more important than video.”
1. The comment about content quality is a red herring. If the individuals visiting this page didn’t have at least some content they found enjoyable, they wouldn’t be here and they wouldn’t have built home theaters. No one says the content has to be from Hollywood or of this century. C’mon! Surely you acknowledge that there are some films or documentaries worth watching or sharing with friends!
2. Sound is certainly important to the theater experience. In fact, I tend to place it higher than moving beyond HD resolution. But to say that it is “more important” than video is somewhat meaningless. Is it more important even if the choice is between upgrading already-decent sound (lacking only a second subwoofer) vs. upgrading a 180p video monitor at 15 fps?
3. In the past few months, I revisited and retracted my own post, which asserted no need for 4K: Is 4K HDTV relevant? https://awildduck.com/?p=2755 (that change of opinion was acknowledged here at carltonbale.com and is also among the more recent comments on my own article).
Although I dispute that one can rank video, audio, and content relative to one another, I believe that two factors are often overlooked and, for many viewers, can lead to a greatly improved experience:
1. Thundering bass
2. Extreme blacks with super-wide dynamic range.
For a projection theater (as opposed to a screen lit from behind, such as LED or plasma), attaining #2 requires a projection screen with at least some gain (to avoid scattering) and a blackout box. This means painting the front half of the room a very flat black on all four surfaces. Without the black box, contrast and dynamic range will be shot, because unlike a television screen, a projection screen is white.