Posted February 14 2007 - 12:34 AM
| However, there are ways around this problem when it occurs that minimise the use of filters while still allowing the full 9.6Mb/s data throughput of DVD-Audio, Haynes explained to High Fidelity Review reporter Sanjay Durani when he visited the company’s West Los Angeles facility last Thursday. One of the tools used by 5.1 Entertainment is the SurCode MLP encoder from Minnetonka Audio Software (see our recent news story), which lets the operator take some of the bit-pool from channels where the loss will be least noticed (the LFE, for example) and reassign it to channels where it is more needed (the front left and right pair). While Guthrie’s remarks may be accurate from one perspective, the problem is not seen as a real-world impediment to releasing high-quality, high-resolution music on DVD-Audio; certainly not by the people who work day in and day out with PCM, MLP and DVD-Audio at 5.1 Entertainment. Jeff Dean, who was recently named President of Silverline Records, feels that Guthrie may have been “…theorising potential problems rather than talking from real working experience.” |
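The bit-pool juggling described above is essentially a budgeting problem: the per-channel allocations must stay under the DVD-Audio stream ceiling while bits move from a less critical channel to more critical ones. A toy sketch (my own illustration, not SurCode's actual algorithm; all channel figures are hypothetical):

```python
# Toy bit-pool reallocation, NOT SurCode's algorithm: shift bitrate
# from a channel where loss is least audible (the LFE) to channels
# where it matters more (front L/R), keeping the total under the
# DVD-Audio stream ceiling.  All numbers are hypothetical.

DVD_AUDIO_MAX_KBPS = 9600  # 9.6Mb/s total throughput ceiling

def reallocate(budget, donor, recipients, kbps):
    """Move `kbps` from the donor channel, split evenly among recipients."""
    alloc = dict(budget)
    alloc[donor] -= kbps
    for ch in recipients:
        alloc[ch] += kbps // len(recipients)
    # The total never changes, so it stays under the ceiling.
    assert sum(alloc.values()) <= DVD_AUDIO_MAX_KBPS
    return alloc

budget = {"L": 1700, "R": 1700, "C": 1600, "LFE": 1200, "Ls": 1600, "Rs": 1600}
print(reallocate(budget, "LFE", ["L", "R"], 400))
# LFE drops to 800 kbps; L and R each gain 200 kbps
```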
| “Please don’t forget,” Bob Ludwig added, “…in the real world, time-is-money deadlines loom over everyone’s head. An MLP encode engineer might intentionally, on their own, choose to slightly compromise the integrity of the music on a difficult encode in order to meet a deadline, rather than taking the time to make the two or three attempts needed to do it with zero loss. In our own DVD-Audio authoring experience, when the front office asks the encode engineer why an MLP encode was not done ‘on time’, the answer is almost always that the engineer set the stereo down-mix coefficients too aggressively, causing a digital ‘over-level’ when the six channels are combined; more encodes fail for that reason than for any other.” |
| “All lossless compression systems, including those that compress 1-bit streams such as DST, are limited by the ‘entropy’ or ‘noise-floor’ of the incoming signal. This is not a problem with any particular technique; it is a fundamental aspect of information theory. If a signal contains a lot of wideband or high-frequency noise, then you end up with a bigger file. More compression would result if this noise were reduced or, much better, not introduced in the first place. In our MLP training information, we point out that the size of a compressed file can be adjusted by using (gentle) low-pass filtering or by selecting a word size to suit the project. This is useful background information to a certain type of producer, who may want to free up space on a disc for other assets or simply understand how the process works.
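The entropy bound described in that quote is easy to demonstrate with any general-purpose lossless compressor. A toy illustration using zlib (not MLP): full-scale 16-bit noise barely compresses, while the same signal with a reduced effective word size (low-order bits zeroed, as when a producer picks a smaller word size for the project) compresses noticeably better.

```python
import random
import zlib

# Toy demonstration (zlib, not MLP) that lossless compression is
# bounded by the entropy of the input signal.

random.seed(0)
N = 50_000
samples = [random.randint(-32768, 32767) for _ in range(N)]  # 16-bit noise
trimmed = [s & ~0xFF for s in samples]                       # low 8 bits zeroed

def compressed_size(signal):
    """Size in bytes after packing to little-endian 16-bit and deflating."""
    raw = b"".join(s.to_bytes(2, "little", signed=True) for s in signal)
    return len(zlib.compress(raw, 9))

print(compressed_size(samples))  # close to the raw 100,000 bytes:
                                 # wideband noise is near-incompressible
print(compressed_size(trimmed))  # smaller: less entropy per sample
```

The same principle explains the low-pass-filtering tip: removing high-frequency noise lowers the entropy the encoder must represent, so the compressed file shrinks.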
While all the people quoted are music producers proud of their work, I wouldn't put it past Hollywood studios to cut a few corners to meet deadlines and hit specific targets. In the long run, uncompressed PCM may be the way to go to avoid potential compromises.
Blah, typo'd the thread title... nice. Maybe a moderator can correct it.