Video Technology Cage Match: Which will win?

August 19, 2015 12:14 pm

The recent rumblings over HDR vs. 4K remind me of standards wars that came before, and of the rules of technology adoption.

Back when digital audio was emerging, and people raced to upgrade their systems with the latest high-priced gear, I asked my friend where this was all going. Wasn't the current generation good enough? After all, our ears were not getting any better.

He very astutely noted that as long as you can hear the difference between a live and a recorded performance, or between an analog and a digital recording, there is room for improvement. Similarly, I wrote in my post about 4K video that watching even the most advanced TVs is still not the same as looking out your window at the real world.

Times have changed, but one thing is the same: the march of technology continues. Consumers want better experiences, vendors want to sell this year's latest and greatest model, media and entertainment businesses want to thrill audiences, and all need to place bets on the technology they think will deliver.

It is not just the sizzle or the specs that determine the winners. Recall that VHS beat Sony's Betamax, arguably the better technology, and that 3D TV never really took off despite heavy promotion. Winning is also about how consumers experience the technology, and about the dynamics of the competitive playing field.

For example, it is easy to fall in love with the roughly 4,000 horizontal pixels of UHD 4K. But, as I wrote in the above-mentioned post, resolution is not the only (or even the most important) spec. Truly getting the best experience requires the right screen size and viewing distance; otherwise, you won't see the difference between 4K and 1080p.

The following chart (from Rtings) illustrates the value of 4K depending on your screen size and viewing distance:

[Chart: 4K Ultra HD benefit by screen size and viewing distance]
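The logic behind charts like this can be sketched numerically. A common rule of thumb is that 20/20 vision resolves about 60 pixels per degree; the function and numbers below are my own illustration, not from the chart itself:

```python
import math

# Hypothetical sketch: how many pixels per degree does a viewer see?
# Assumes a 16:9 screen and ~60 pixels/degree as the acuity limit.
def pixels_per_degree(diagonal_in, distance_in, horizontal_px, aspect=16 / 9):
    width_in = diagonal_in * aspect / math.sqrt(1 + aspect ** 2)
    pixel_in = width_in / horizontal_px  # physical width of one pixel
    deg_per_pixel = math.degrees(2 * math.atan(pixel_in / (2 * distance_in)))
    return 1 / deg_per_pixel

# A 55-inch TV viewed from 9 feet (108 inches):
ppd_1080p = pixels_per_degree(55, 108, 1920)
ppd_4k = pixels_per_degree(55, 108, 3840)
# 1080p already exceeds ~60 pixels per degree at this distance, so the
# extra 4K pixels are too small for the eye to resolve -- you would
# need a bigger screen or a closer seat to see the difference.
```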

And there are other important factors, like HDR, HFR, and color gamut.

In this Videonet article, Matthew Goldman of Ericsson says pretty much the same thing:

[Goldman identifies] three other factors besides extra pixels that are equally immersive and do not require viewers to sit at the 'proper' viewing distance. These work best in combination: HDR, a wider colour gamut, and ten-bit sample precision, a bundle of features Goldman dubs 'HDR+'. Ideally, these would run on 1080p HD displays at 50 or 60 fps, instead of standard 1080i.
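Some illustrative arithmetic shows why the ten-bit point matters. Assume a display peaking at 1,000 nits (a hypothetical HDR target) and, purely for simplicity, linear quantization; real HDR systems use a perceptual transfer function such as PQ, so these step sizes are only indicative:

```python
# Brightness step between adjacent code values under (assumed) linear
# quantization. Coarse 8-bit steps produce visible banding in smooth
# gradients; 10-bit steps are roughly four times finer.
def step_nits(peak_nits, bit_depth):
    return peak_nits / (2 ** bit_depth - 1)

step_8bit = step_nits(1000, 8)    # ~3.9 nits per step
step_10bit = step_nits(1000, 10)  # ~0.98 nits per step
```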

Many video technologies start out in movie theaters, migrate to cable TV, and finally wind up online. That has not happened for 4K yet. While device manufacturers are promoting 4K, content producers and video technology vendors are not putting the same marketing muscle behind it: there are no obvious viewing criteria that consumers will notice, and little unique intellectual property or licensing benefit attached to 4K.

Dolby and Technicolor, on the other hand, do have an interest in promoting HDR, and have invested heavily in related technology and intellectual property. Amazon and Netflix have also started promoting 4K, and recently we hear more about HDR in those services.

Is HDR running on high-definition TV sets adequate, or will the mass market adopt 4K? Experts claim that the benefits of HDR are more visible to the human eye than the higher resolution of 4K. Given the incentive of influential companies such as Dolby and Technicolor to promote their own HDR IP, the market appears to be shifting gradually toward HDR.

Also, please see my post on Streaming Media about the Great UHD Debate.

Both technologies provide higher quality, which in turn demands that more bits be sent over the network. Either way, we are headed toward a very bandwidth-intensive future. Luckily, our AVA video acceleration technology is here to help.
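To put rough numbers on that bandwidth claim, here is a hypothetical back-of-the-envelope calculation of uncompressed bitrates; codecs reduce these dramatically, and the 4:2:0 and bit-depth figures are assumptions for illustration:

```python
# Uncompressed video bitrate: pixels x frame rate x bit depth x chroma.
# 4:2:0 chroma subsampling averages 1.5 samples per pixel.
def raw_gbps(width, height, fps, bit_depth, samples_per_pixel=1.5):
    return width * height * fps * bit_depth * samples_per_pixel / 1e9

hd_8bit = raw_gbps(1920, 1080, 60, 8)     # ~1.5 Gbps uncompressed
uhd_10bit = raw_gbps(3840, 2160, 60, 10)  # ~7.5 Gbps uncompressed
# Four times the pixels at 10-bit depth means 5x the raw bits -- and
# even after compression, a proportionally heavier network load.
```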