Discussion in 'DVD' started by Michael Allred, Sep 11, 2005.
Okay, let's ask a different question.
Does anyone know the sustained data rate supported by Blu-ray or HD DVD?
If we ask for 5 full-range channels of uncompressed 48 kHz, 16-bit audio, we're talking a sustained 4.25 megabits/second for audio alone (I threw in some for the subwoofer(s)).
Are we realistically going to get a sustained 25 Mb/s for 2 hours? By my back-of-the-envelope calculations, we're looking at a user payload of 20 gigabytes for a 2-hour film with 5.1 audio and no extras or menus.
Or error correction, for that matter.
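A quick Python sketch of that back-of-the-envelope math (the channel count, sample rate, and 2-hour runtime are just the assumptions above; the raw stream figure is before error correction and filesystem overhead):

```python
# Back-of-the-envelope check of the audio bitrate and disc payload
# figures above. Channel count, rates, and runtime come from the post,
# not from any spec.

channels = 5            # full-range channels (LFE adds a bit more)
sample_rate = 48_000    # Hz
bits_per_sample = 16

# 5 full-range channels of uncompressed PCM
audio_bps = channels * sample_rate * bits_per_sample
print(f"5.0 PCM: {audio_bps / 1e6:.2f} Mb/s")   # 3.84 Mb/s; LFE pushes it past 4

# sustained 25 Mb/s over a 2-hour film, as raw stream bytes
runtime_s = 2 * 3600
payload_bytes = 25e6 * runtime_s / 8
print(f"2 h at 25 Mb/s: {payload_bytes / 1e9:.1f} GB")  # 22.5 GB raw
```

The ~20 GB user payload quoted above is consistent with that 22.5 GB raw stream once some room is set aside for overhead.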
Wait, I thought Dolby Digital Plus was a lossless format? Is this not the case?
I'm confused, because I don't see why another lossy codec (in addition to regular old Dolby Digital) would be chosen for HD DVD or Blu-ray, and I also don't see why two lossless codecs (Dolby Digital Plus and Dolby TrueHD) would be needed.
No. DD+ is an enhancement to Dolby Digital, mainly for broadcasting purposes, to allow either lower bitrates for 5.1 (like internet or TV broadcast) or higher ones (>640 kbps) for HD disc formats, with additional discrete channels beyond the 5.1 base.
Dolby TrueHD is the "plus" version of MLP lossless you could say.
DTS won't go out of business, but I see providers having these two Dolby options as mainstream, with DTS and DTS-HD as a niche, as it is now on DVD.
For Blu-ray Disc, yes. It's designed for 36 Mb/s-plus playback. Each layer is 25 GB: dual-layer is 50 GB, quad-layer is 100 GB.
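For what it's worth, here's what those numbers work out to in runtime per layer (decimal gigabytes assumed; the 25 GB and 36 Mb/s figures are the ones quoted above):

```python
# Rough runtime math for the layer sizes quoted above:
# 25 GB per Blu-ray layer, 36 Mb/s-plus playback.

layer_bytes = 25e9          # one layer, decimal GB
bitrate_bps = 36e6          # sustained playback rate
runtime_min = layer_bytes * 8 / bitrate_bps / 60
print(f"{runtime_min:.0f} min per 25 GB layer at 36 Mb/s")  # ~93 min
```

So even flat-out at 36 Mb/s, a single layer holds over an hour and a half, and a dual-layer disc easily covers a feature film.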
Another reason why Blu-ray is the choice for HD disc playback. Because of HD DVD's storage and bitrate limitations, it's likely that not only will the video be more compressed, but lossy audio will be the norm, not the exception. There just isn't room for a lossless track.
I just got my pre/pro. Now they are coming out with this? Unbelievable.
Whenever they invent a new, and more space-efficient, lossless compression codec, it may replace older less-efficient codecs one day. That's progress, and it's bound to happen now and then.
Well, since I won't be an early adopter of HD DVD or Blu-ray, this "progress" won't really affect me. By the time I'm ready to buy in (2 or 3 years down the road), I'll shell out the money for a receiver or pre/pro that decodes these new formats.
Um, so do we need new receivers for this?
I imagine so, and one with HDMI connectors at that.
Between this, WMV HD audio, and the other new Dolby and DTS formats, there are some major changes coming to receivers/decoders in the next year or two.
Not really. This has been predicted for the last 2 years and has been in discussion on these audio forums.
Enthusiasts shouldn't be surprised.
That is why I think DTS's backwards compatibility is an asset to them, and it should be the audio codec of choice for whichever HD disc format we get. In theory, all lossless codecs should sound the same when decoded, that is, if they are truly lossless and don't add their own sonic signature. There is something wrong if two lossless codecs sound different.
It's not "sounding different". It's features: such as dynamic range control, channels available, bit resolution, bass management, etc.
Any 5.1-ready receiver will be able to use this. The SPDIF outputs of the new HD disc formats will be for lossy codecs anyway (like DD+ and DTS), so I don't see your point about DTS being more "backwards compatible" as an asset. No receiver on the market has DTS-HD, nor Dolby Digital Plus.
HDMI or 6 ch analog will be the delivery mechanism for these formats, and more and more receivers are being introduced to take advantage of it. I'm upgrading this year to an Onkyo TX-SR803, which has both HDMI & 7.1 ch analog in.
Not necessarily, insofar as some sound differences between two lossless codecs may not be the result of "wrong" sonic signatures added by those codecs. Lossless-ness isn't the only characteristic involving accuracy that a given audio codec can have. Among other variables, different sampling rates and sampling sizes can impact the quality of the sound (e.g., although a given 8kHz digitization of a movie soundtrack may be lossless, it sure wouldn't sound very good, nor would it sound much like a lossless 44.1kHz digitization).
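The point above can be demonstrated with a toy example. Here zlib stands in for a generic lossless codec (it is not what TrueHD or any disc format actually uses): the round trip is bit-exact, but it can't restore detail that was discarded before encoding, e.g. by a lower sample rate:

```python
import zlib

# A lossless codec returns exactly the samples it was fed, but it
# cannot recover detail thrown away *before* encoding. zlib is just a
# convenient stand-in lossless compressor for this demonstration.

original = bytes(range(256)) * 100    # stand-in for a PCM sample stream
decimated = original[::6]             # crude 1/6 "downsample" (think 48 kHz -> 8 kHz)

round_trip = zlib.decompress(zlib.compress(decimated))
assert round_trip == decimated        # lossless: bit-exact round trip
assert round_trip != original         # but the pre-encoding loss stays lost
```

In other words, "lossless" only guarantees fidelity to the signal handed to the encoder, not to whatever existed upstream of it.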
You are still relying on the decoder in the DVD player you purchase, which is a negative for me. I'd rather have the decoder in the pre/pro or receiver. I mean, how many of us use the decoder in the DVD player we have now?
As long as Spielberg is connected with DTS, they will never go out of business...
Another point to be aware of is that DVD players' decoders do not do any type of room EQ, and most of the multi-channel inputs on receivers do not do any post-processing on the input signals. So if you use any type of equalization for listening preferences, you lose that if you rely on the player's decoder. I want the decoder in the same box as my receiver/pre-pro, not in the player.
A lot of current high end DVD players (certainly not the sub $199 jobbers) do the post processing on their audio outputs. One reason Dolby updated MLP was to make this easier to do at the source.
Supposedly all Blu-ray players will do a full audio option setup, at least the initial ones. That will also give electronics makers the opportunity to add these options to their 5.1/7.1 analog inputs (or HDMI).
Again, if both lossless codecs encode a signal at the same resolution and are decoded the same way, the audible result should be the same regardless of the source's native resolution. Good or bad, lossless audio should sound as it is.
I apologize if I missed it, but I didn't see anything in your original statement regarding identical encoding parameters.