HDR10 VS DOLBY VISION: TV buying guide 2017

1. 4K: A MUST!
In 2017, if you're buying a TV, it has to be a UHD (4K) set, no doubt about it. 1080p is now a thing of the past; move on, it has happened. That TV you bought two years ago and didn't think you'd have to replace is now obsolete.
2. HDR Capabilities 
The next thing you have to consider is HDR. HDR is the new tech term TV manufacturers use to sell sets by confusing people. Personally, I think it's more measurable and practical than the 3D TV wave of 2013-2014 or the confusing contrast ratio numbers of old; with HDR there's actually a difference you can observe. HDR (High Dynamic Range) on a TV means you can distinguish better between different levels of lighting and colour.

Don't go out and buy a new TV just yet! We currently have two major competing HDR standards: HDR10, supported by many manufacturers, and Dolby Vision. There are two main differences between them.

The first is the number of bits used to represent colour, i.e. colour depth. HDR10, as its name suggests, uses 10 bits to represent colour. To put things into perspective, the more bits, the higher the colour resolution, where resolution here means the level of differentiation among individual colours per pixel. More bits let you see finer gradations of colour. For instance, a single bit gives you just 2 raised to the power of 1 = 2 levels, so a pixel is either fully black or fully white. Representing the same range with 10 bits gives you 2 raised to 10 = 1024 levels (more like 1024 shades of grey). HDR10 can thus represent each colour channel of a pixel with 1024 levels, while Dolby Vision uses 12 bits and can represent the same channel with 4096 levels.
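If you like seeing the arithmetic, here is a tiny Python sketch of the 2-to-the-power-of-bits maths from the paragraph above. It isn't tied to any HDR spec; it just shows how quickly the number of shades grows with bit depth:

```python
# Back-of-the-envelope arithmetic for colour bit depth.
# levels per channel = 2 ** bits; total colours = levels ** 3 (R, G, B).

for bits in (1, 8, 10, 12):
    levels_per_channel = 2 ** bits           # shades of each primary colour
    total_colours = levels_per_channel ** 3  # combinations across the 3 channels
    print(f"{bits:>2} bits: {levels_per_channel:>5} levels per channel, "
          f"{total_colours:,} possible colours per pixel")
```

Running it shows the jump from 1024 levels per channel at 10 bits (roughly a billion colours) to 4096 levels at 12 bits (tens of billions).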
The second major difference is how the two formats handle metadata. Metadata is simply data that describes other data, I know right? Both HDR10 and Dolby Vision carry metadata, but Dolby Vision's metadata is dynamic (it can change scene by scene, or even frame by frame), while HDR10's is static (it's defined once and applies from the first frame to the last). This means Dolby's implementation can track changes in lighting and colour throughout the content better than HDR10 can. Well, I am no eye doctor or scientist, but theoretically Dolby Vision's implementation is better. Practically, I don't think anyone would be able to tell the two apart.
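To make the static vs dynamic idea concrete, here is a rough Python sketch. The class and field names are invented purely for illustration (the real HDR10 and Dolby Vision specs define their own fields), but the shape of the data captures the difference: one set of tone-mapping hints for the whole film versus a fresh set for every scene.

```python
# Illustrative only: these names are made up, not the actual spec fields.
from dataclasses import dataclass
from typing import List

@dataclass
class ToneMappingHints:
    max_brightness_nits: float  # how bright this content gets
    avg_brightness_nits: float  # how bright it is on average

# HDR10-style static metadata: one set of hints describes the whole movie.
static_metadata = ToneMappingHints(max_brightness_nits=1000.0,
                                   avg_brightness_nits=180.0)

# Dolby Vision-style dynamic metadata: hints can change per scene (or frame),
# so a dark cave and a bright beach each get their own values.
dynamic_metadata: List[ToneMappingHints] = [
    ToneMappingHints(max_brightness_nits=120.0, avg_brightness_nits=15.0),    # cave scene
    ToneMappingHints(max_brightness_nits=1000.0, avg_brightness_nits=350.0),  # beach scene
]
```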
Personally, if I were to buy a TV set, I would lean more towards Dolby's implementation because it's technically superior. However, I see no major problem with HDR10 either, because HDR10 is currently supported by all the big TV manufacturers, i.e. Sony, Samsung and LG. Moreover, I am quite sure that in two years we will have something like HDR12, which will use 12 bits to represent colour in addition to some other improvements.


There are other manufacturers trying to push yet more competing HDR standards (sigh). Hopefully, we end up with a standard that serves the consumer and performs exceptionally well.
The HDR standard is still evolving. Many manufacturers now support HDR10 (I suspect simply because implementing Dolby Vision means paying royalties). This means most of the TVs you can buy at the moment will have HDR10, and a few will support both. If you can get one that supports both formats, so much the better.
