Anirudh Regidi | Apr 18, 2016 19:01:20 IST
We recently got a chance to experience Dolby Vision, a new display standard that aims to raise the bar for visual fidelity across the industry. The technology can best be described as a form of post-processing that retains more colour and brightness information than currently available compression technology does.
To give you an analogy: WAV is an uncompressed audio format. The files are massive, and there's little chance your speakers or your ears can appreciate all the fine detail they contain. So we compress the audio to a format like MP3, which uses psychoacoustic models of how our ears work to cut out audio data we're unlikely to miss. The MP3 format isn't perfect, which is why other compression formats popped up, including AAC, Ogg Vorbis and the like.
Dolby Vision does the same, except with video. Codecs like H.264 (commonly packaged in .mp4 files) already exist and are very good, but Dolby claims that Dolby Vision adds a great deal more information to them. This information is tacked onto the existing stream as metadata and can be decoded by compatible hardware.
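Dolby's actual metadata format is proprietary, but the general idea of per-scene dynamic metadata riding alongside an ordinary encoded stream can be sketched roughly. The structure and field names below are purely illustrative assumptions, not Dolby's specification:

```python
# Illustrative sketch only: models the idea of per-scene luminance
# statistics (in nits) travelling as metadata alongside encoded video.

def scene_metadata(luminances):
    """Summarise a scene's luminance as min/max/average -- the kind of
    per-scene statistics dynamic-metadata schemes typically carry."""
    return {
        "min_nits": min(luminances),
        "max_nits": max(luminances),
        "avg_nits": sum(luminances) / len(luminances),
    }

# A bright outdoor scene followed by a dark interior scene:
scenes = [[120.0, 850.0, 400.0], [0.05, 4.0, 1.2]]
stream = [{"video": "<H.264/HEVC scene data>", "metadata": scene_metadata(s)}
          for s in scenes]

for packet in stream:
    print(packet["metadata"])
```

A compatible display could use such per-scene statistics to tone-map each scene individually instead of applying one static curve to the whole film.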
We got to experience Dolby Vision on a gorgeous 84-inch, 4K, OLED display with WRGB support from LG. We were told that the display alone retails for a little under Rs 6,00,000, and it's a testament to the quality of the display that we'd gladly pay that amount if we could.
Dolby demoed a handful of scenes on the display and, while we'll admit they were spectacular, especially the whites and the blacks, there was no way to tell whether Dolby Vision was actually improving image quality. We already know that blacks on OLED panels are pitch black, and the WRGB panel (the W indicates an additional white subpixel) means that whites will be truly white.
What we're saying is that, by default, video on that display should have been stunning anyway, and since we've never experienced such a high-quality display before, we don't have a baseline to compare Dolby Vision against.
Dolby assures us that there's a significant difference in image quality and we'll just have to take their word for it.
To help us, and you, get a better understanding of Dolby Vision and its potential, we interviewed Mike Chao, Regional Vice President, Asia-Pacific, Dolby Laboratories. Here's what he had to say.
Disclaimer: Some of the answers have been edited in the interest of brevity and are not the exact words of Mike Chao. However, every effort has been made to ensure that the substance of his responses is unaltered.
On a scale of 1-10, 10 being the actual scene that our eyes perceive, where would you rate the following:
- HD broadcast from, say, Tata Sky or its equivalent
- Dolby Vision certified broadcast
- Raw camera data
- Playback on Dolby Vision certified TV (such as the 4K OLED TV mentioned earlier)
- Standard Blu-ray playback on the same OLED TV.
This is really hard to do as perceived quality is highly subjective, but we are seeing reactions from consumers that are very clear and positive once HDR and wide color get added into the mix and less clear when there is merely a step up from HD to UHD. Compression is always a consideration but is very much orthogonal to the other aspects mentioned here. HEVC helps a lot to enable UHD at reasonable bit rates.
You mentioned that Dolby Vision metadata would add 20-30 percent more data to an HD stream. Are current content delivery networks and set-top boxes capable of handling this increase in bandwidth?
Yes, the increase stated is for the dual-layer profile of Dolby Vision that adds backward compatibility to standard dynamic range receivers. The enhancement layer and metadata take about 20% in addition to the backward compatible signal. Most deployed Dolby Vision services like Vudu and Netflix use this profile and delivery is not a problem. However, Dolby Vision also offers single-layer profiles for applications where backward compatibility is not required. These profiles are more bandwidth efficient.
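Chao's 20 percent figure is easy to put in concrete terms. A quick back-of-the-envelope calculation (the bitrates here are our own illustrative assumptions, not Dolby's numbers):

```python
def dolby_vision_bitrate(base_mbps, overhead=0.20):
    """Total bitrate for the dual-layer profile: the backward-compatible
    base layer plus roughly 20% for the enhancement layer and metadata."""
    return base_mbps * (1 + overhead)

# e.g. a hypothetical 15 Mbps UHD HEVC base stream:
print(dolby_vision_bitrate(15))  # 18.0 Mbps total
```

At that scale, the dual-layer overhead is comparable to a modest bump in streaming quality settings, which is why delivery over services like Netflix and Vudu isn't a problem.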
How much of an overhead will be added when processing raw camera data to support Dolby Vision standards?
If this question is about the color grading workflow, then the answer is none. RAW data is already used today for standard color grading. Dolby Vision merely replaces the grading display with one that is capable of displaying the higher dynamic range and wider color. The color grading tools are equipped with a plug-in that creates the dynamic metadata. That’s pretty much it.
Since Dolby Vision is claimed to be compatible with existing codecs, does this mean that Dolby Vision can be compatible with streaming services such as YouTube and Netflix and that anyone can view said content with a compatible display?
Yes, Dolby Vision is codec-agnostic. Today, it is integrated with H.264 and HEVC but others like VP9 or AVS+ are also possible if required in the future.
The documentation mentions that Dolby is working with creators of game engines to ensure compatibility with Dolby Vision. Which engines would these be? How does this impact the traditional game development cycle?
Gaming is a potential use case for Dolby Vision and we have shown technology demonstrations with Epic's Unreal Engine and with Amazon's Lumberyard recently. However, we don't have anything to announce at this point.
If we're talking about games, that means we're also talking about PC and console hardware as well as PC monitors. Can we expect to see Dolby Vision certified monitors in the near future? Are you working with any specific brands?
Please stay tuned for announcements in the future.
How do graphics cards (from Nvidia and AMD) figure in this picture? Will these cards need to integrate Dolby Vision chips or something similar or will the technology work with existing hardware?
Typically, hardware with this kind of capability can run Dolby Vision in software but there are no products announced today.
Most PC monitors are 6-bit panels, and the better ones are 8-bit panels. How much of an advantage will Dolby Vision offer, especially when the technology has been designed with 12-bit panels in mind? Are lower bit-depth panels to be completely excluded from Dolby Vision certification?
We strongly recommend having at least a 10-bit video pipeline. Others are possible but will likely not show the full potential of the technology.
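The jump between bit depths is bigger than the numbers suggest, because each extra bit doubles the number of shades a panel can reproduce per channel. A quick calculation for the panel types mentioned above:

```python
def shades_per_channel(bits):
    """Number of distinct levels a panel can show per colour channel."""
    return 2 ** bits

for bits in (6, 8, 10, 12):
    per_channel = shades_per_channel(bits)
    total = per_channel ** 3  # three channels: R, G, B
    print(f"{bits}-bit panel: {per_channel:>5} levels/channel, "
          f"{total:,} total colours")
```

A 6-bit panel manages 64 levels per channel against a 12-bit panel's 4,096, which is why banding in subtle gradients is so much more visible on cheap monitors.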
How much does Dolby Vision certification and technology add to the cost of a display unit (monitor or TV)?
In our work with TV OEMs, they have made it clear that the desire to build products with HDR and wide colour capability has certainly been triggered by Dolby's push in the direction of video fidelity. The actual Dolby Vision processing that ensures these products perform in the best way possible does not add significant cost. Yes, there is additional die area on the chipset to perform the Dolby Vision functions, and there is a licensing fee, but the total cost here is less than what I usually pay for a cup of espresso...
You mentioned Dolby Vision certified laser projectors for theatres. Can you elaborate on their capabilities and the technology behind them? When can we expect to see such projectors in India? Is there any chance that the technology can trickle down into consumer-grade projectors anytime soon?
We have co-developed the Dolby Vision laser projector used as part of the Dolby Cinema concept with Christie. The Dolby Vision laser projection system uses state-of-the-art optics and image processing to deliver high dynamic range with enhanced color technology. Dolby Vision laser projection delivers a contrast ratio that far exceeds that of any other image technology on the market today. By comparison, standard Digital Cinema Initiatives (DCI) projectors on average have a 2,000:1 contrast ratio compared to film negatives, which can deliver an 8,000:1 contrast ratio. When it comes to laser, other projectors on the market can achieve a 5,000-8,000:1 contrast ratio. A Dolby Vision laser projector exceeds 1,000,000:1 contrast ratio.
Quoting Don Shaw, Senior Director, Product Management, Christie, "Dolby Vision laser projection also provides a more detailed viewing experience with up to 14 foot lamberts (a measure of brightness) onscreen in 3D and up to 31 foot lamberts for 2D Dolby Vision content, far exceeding any ‘ultra-bright' industry standards, to all Dolby Cinema locations."
Consumer grade projectors with Dolby Vision have not been announced to date.
Editor's note: To put things in perspective, a standard monitor has a contrast ratio (not dynamic contrast ratio, mind you) of around 300:1 to 600:1. The iPad, with its exceptional display, only manages around 800:1. Any OLED screen can, however, claim a contrast ratio of infinity:1, because blacks on OLED are pure black. Bearing that in mind, a claimed contrast ratio of 1,000,000:1 is staggeringly good. If reports are to be believed, Dolby isn't exaggerating in the slightest.
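Another way to make sense of these ratios is to express them in photographic stops, where each stop doubles the range between the darkest and brightest level. Taking the base-2 log of the figures quoted in this article:

```python
import math

def stops(contrast_ratio):
    """Dynamic range in photographic stops (each stop doubles the ratio)."""
    return math.log2(contrast_ratio)

displays = {
    "standard monitor (600:1)": 600,
    "DCI projector (2,000:1)": 2000,
    "film negative (8,000:1)": 8000,
    "Dolby Vision laser projector (1,000,000:1)": 1_000_000,
}
for name, ratio in displays.items():
    print(f"{name}: ~{stops(ratio):.1f} stops")
```

On this scale, the Dolby Vision laser projector's roughly 20 stops sits about 7 stops beyond film negative, i.e. over a hundred times the range, rather than the 125x the raw ratios might naively suggest being "only" a couple of steps apart.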