4K TV — Do We Care?
May 15th, 2015

After high definition TV took over, and most everyone who cared bought new sets, what could the TV makers do for an encore? If you think iPad sales suffer because people keep the ones they bought for several years, imagine the family TV. It's not uncommon for sets to last 10 years or more before there's serious deterioration in picture quality, or the set develops hardware issues. I recall giving a neighbor a 27-inch Sony TV, with a CRT, a year or two back because I had no need for it. The set was acquired in the early 1990s and had always worked perfectly.
With fewer customers in the market, TV makers had a harder time selling product. Prices and profits continued to slip as the industry searched for something to boost sales. Even smart TVs, with built-in apps, weren't enough. The 2009 movie blockbuster, Avatar, delivered on the promise of 3D movies, so it was natural that TV makers tried to get into the act. There were actually two systems, one active, one passive. The former featured glasses with decoding circuitry embedded; a foolish idea, since a pair sometimes cost over $100, and you needed one for every family member or friend who wanted to share the experience.
More realistically, passive sets used the same glasses they handed out to you free at the multiplex, and thus TV makers might include several with your new set, depending on where it stood in the product lineup. There was even a 3D version of Blu-ray, so you could spend another $10 or so on a disc to get a picture that seemed to pop right out at you.
You'll notice that fewer and fewer sets include 3D. At first they were costly, but even as prices came down to more affordable levels, sales didn't soar. People didn't care, although some movies, such as the latest Marvel blockbuster, "Avengers: Age of Ultron," are still presented in 3D in movie theaters.
So what's next, and does anyone care?
Well, if most people already have HD, and 3D didn't cut it for the masses, the TV makers decided that maybe a picture with four times as many pixels would help. Thus came Ultra HD, usually known as 4K. As with 3D, the first sets to come to market were costly, but the technology soon filtered down to lower-cost models. The 2015 VIZIO M-Series Ultra HD sets are available for less than $1,000 in small-to-medium screen sizes. You can expect the number of 4K sets in customers' hands to soar this year; market penetration is in the single digits now.
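Where does "four times" come from? A quick back-of-envelope sketch in Python, assuming the standard consumer resolutions of 3840×2160 for Ultra HD and 1920×1080 for HD:

```python
# Pixel counts for 1080p HD versus Ultra HD (4K), using the
# common consumer resolutions of 1920x1080 and 3840x2160.
hd_pixels = 1920 * 1080    # 2,073,600 pixels
uhd_pixels = 3840 * 2160   # 8,294,400 pixels

print(uhd_pixels / hd_pixels)  # 4.0 -- four times as many pixels
```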
But what about 4K content, and can you even see the difference?
Well, that depends. The rule of thumb is that you probably need a 55-inch or larger set to see much of a difference compared to 1080p HD at a normal viewing distance. One of my colleagues, John Martellaro of The Mac Observer, reports a big difference when watching his 65-inch 4K set at a distance of eight feet. When checking out 4K sets at the neighborhood Best Buy recently, I noticed that they often display still pictures rather than movies, since the difference in resolution is more visible. The better 4K sets also tout superior color quality.
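The viewing-distance rule of thumb falls out of visual acuity: a viewer with 20/20 vision resolves roughly one arcminute of detail, so the extra pixels only register when you sit close enough that 1080p's coarser grid would show. Here's a rough Python sketch of that calculation; the one-arcminute figure and the 16:9 geometry are my assumptions for illustration, not anything the TV makers publish:

```python
import math

def pixel_visibility_distance_ft(diagonal_in, horizontal_pixels, aspect=(16, 9)):
    """Distance (feet) inside which a 20/20 eye, resolving about
    one arcminute, can make out individual pixels on the screen."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)  # screen width in inches
    pixel_in = width_in / horizontal_pixels        # width of one pixel
    one_arcmin = math.radians(1 / 60)              # acuity limit, ~0.000291 rad
    return pixel_in / math.tan(one_arcmin) / 12    # inches -> feet

for size in (55, 65):
    print(size, round(pixel_visibility_distance_ft(size, 1920), 1))
# 55 -> ~7.2 ft, 65 -> ~8.5 ft: sit closer than this to a 1080p set
# of that size and 4K's extra pixels carry visible detail.
```

By that math, sitting eight feet from a 65-inch set is just inside the point where 1080p's pixel grid becomes visible, which squares with the difference reported above.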
Regardless, unlike the move to HD, the move to Ultra HD delivers a subtle improvement for most viewers, but not for everyone, and thus it's a harder sell. Worse, there's not a lot of content yet. Some TV makers compensate by offering a digital media player or drive preloaded with 4K movies. Such streaming services as Amazon Instant Video and Netflix are rolling out 4K content, but it's so heavily compressed to work at normal broadband speeds that the visible difference isn't all that great. Because of the higher bandwidth requirements, it's going to be a while before the cable and satellite companies take the plunge.
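To put rough numbers on the bandwidth problem: uncompressed 4K at 24 frames per second and 8 bits per color channel runs to several gigabits per second, while a 2015-era 4K stream has to fit in the teens of megabits. The ~16 Mbps figure below is my illustrative assumption, not a quoted service spec:

```python
# Back-of-envelope on why streamed 4K is so heavily compressed.
# Assumes 24 fps and 8 bits per RGB channel; the ~16 Mbps target
# is an illustrative guess at a 2015-era 4K stream, not a spec.
width, height = 3840, 2160
fps = 24
bits_per_pixel = 24                              # 8 bits x 3 channels

raw_bps = width * height * fps * bits_per_pixel  # ~4.78 Gbps raw
stream_bps = 16e6                                # assumed 4K stream rate

print(f"raw: {raw_bps / 1e9:.2f} Gbps")
print(f"compression needed: {raw_bps / stream_bps:.0f}:1")  # ~299:1
```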
But what about Blu-ray?
Well, there is now an Ultra HD Blu-ray standard, and players ought to be available in the coming months. Assuming the entertainment companies come onboard with a decent selection of movies, that may help jump-start the industry for the holiday season. Of course, it all depends on how much these players will cost and how well they upscale regular DVD and Blu-ray discs. That's also a key issue with 4K sets, since it'll be some years before there's enough 4K fare to make a difference.
You can also expect compression algorithms to continue to improve, with newer codecs such as HEVC (H.265), the one Ultra HD Blu-ray uses, promising similar quality at roughly half the bit rate of H.264. That might help with streaming 4K fare. There's also the open question, and the skepticism, about whether Apple will add 4K support to the next Apple TV.
I expect they will. One reason is that some existing Macs already output to 4K displays, and don't forget the amazing iMac with 5K Retina display. There are also published reports that the A8 chip in the iPhone 6 series supports 4K video, though the capability hasn't yet been exploited. That doesn't mean it won't be in the next iPhone refresh, but it won't do much good on a tiny screen, whereas it's tailor-made for the next Apple TV.
As for me, I hope to test a 4K TV set from one of the major manufacturers soon.
The difference between high definition (HD) and standard definition (SD) was quite noticeable. But I find that watching a 720p HD iTunes movie on a large-screen Panasonic HDTV looks just as good as it does on my iPad mini with Retina display.

Increasing the pixel count on a display is quite useful when you view it up close, as with an iPhone, iPad, MacBook, or iMac. Watching TV from farther away, I don't see the individual pixels, and the picture is quite clear. So increasing the pixel count to 4K is more about boosting sales.

I do prefer viewing HD content, but what matters is the quality of the content itself. Bad TV programs and movies are just as bad in HD as in SD, and there's plenty of bad content out there.