
PCM 5.1 & lossless audio discussion - split thread from A Knight's Tale review

DaViD Boulet

Senior HTF Member
Joined
Feb 24, 1999
Messages
8,826
True, you're simply multiplying every sample by a fixed value. 24-bit DSP implies that resolution won't be lost (s/n) for minor level changes, but my ears (and those of several engineers I've spoken with) attest that, for whatever reason, a signal played back with dialog norm applied doesn't sound the same as one with the flag set to 0 (or with dialog norm defeated), even when the volume has been adjusted to compensate. So it's not a level thing... it's a data thing.
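To put the "multiply every sample" model in concrete terms, here's a minimal sketch in Python (the 4 dB figure is just an assumed dialnorm offset; the actual value varies from disc to disc):

```python
import numpy as np

def apply_dialnorm(samples, atten_db=4.0):
    """Attenuate 16-bit PCM by a fixed gain, the way a digital
    volume control (or a dialnorm offset) would, then requantize."""
    scale = 10 ** (-atten_db / 20.0)          # 4 dB down ~= x0.631
    return np.round(samples.astype(np.float64) * scale).astype(np.int16)

pcm = np.array([12000, -8192, 1, 32767], dtype=np.int16)
print(apply_dialnorm(pcm))    # every sample word comes back with a new value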

Perhaps there's more to the filtering than the simple model I've described? Or perhaps, as "clean" as level changing ought to be, the algorithm used in most DD decoding chips isn't optimal in practice?

However, where lossless encoding is concerned, one thing isn't arguable: if the LPCM datastream passed out of the chip to the DAC isn't a bit-for-bit copy of the original LPCM, then it's not really a "lossless" digital packing algorithm...

;)
 

RobertR

Senior HTF Member
Joined
Dec 19, 1998
Messages
10,675
Well, you lose a little, but it's like worrying about the drop of mustard you lost from your hot dog. Would you really taste the difference? :)
 

ChristopherDAC

Senior HTF Member
Joined
Feb 18, 2004
Messages
3,729
Real Name
AE5VI
Assume that the level change is 4 dB. That's not an integer number of bits; at roughly 6 dB per bit it works out to about two-thirds of a bit. The result? Well, you're more or less going to lose the whole LSB to dither noise, and the next-to-least significant bit will be strongly affected also. That's not much at 24 bits, but it can make an audible difference at 16.
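A quick numerical check of this (a sketch in Python, assuming a 4 dB attenuation and plain rounding in place of whatever dither scheme the real decoder chips use):

```python
import numpy as np

rng = np.random.default_rng(0)
original = rng.integers(-32768, 32768, size=100_000).astype(np.int16)

scale = 10 ** (-4.0 / 20.0)                    # 4 dB down, factor ~0.631
attenuated = np.round(original * scale).astype(np.int16)

# undo the gain in floating point: the sub-LSB detail never comes back
restored = attenuated / scale
print(f"peak error: {np.abs(original - restored).max():.2f} LSB")
```

The residual error is bounded by the requantization step, just under one LSB here, which is exactly the damage described above.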
I admit it, I normalise .WAV files which I record, but I don't kid myself. It's purely a convenience step, and if I want to do any actual work with them I go back to the backup version without the level change.
 

GregK

Screenwriter
Joined
Nov 22, 2000
Messages
1,056

While this may be true of the current Dolby Digital specs, be assured Dial Norm WAS defeatable in many first-generation Dolby Digital decoders and receivers. Roger Dressler commented on Usenet around that time that later models would not offer this feature, which of course was the case.
 

PeterTHX

Senior HTF Member
Joined
Dec 30, 2002
Messages
2,034
Couple of corrections:

Dolby Digital decoders are required by spec to support Dialog Norm. That doesn't mean the program has to use it (hence the "The Lion King" observation).

It's not Dolby's fault the program providers are doing this. Their decoders must work with a wide variety of sources, be it DVD, Satellite, Cable, XBOX, etc. You want releases that don't compromise the original source? Let the studios know.
 

DaViD Boulet

Senior HTF Member
Joined
Feb 24, 1999
Messages
8,826
PeterTHX,

excellent point. I'm not sure why the practice is so common... I think (and this is what I've been told by several recording engineers) the most common reason is that the folks mastering DD audio don't think twice about the default level setting. I'm sure if the default were 0 (off) we'd see far fewer DD soundtracks using dialog norm. In any case, you're right that the real issue lies with the parties doing the encoding.

There seems to be a "more is better" processing philosophy to most studio mastering these days. We see it with both video and audio. Also problematic are the many 5.1 mixes that get "dumbed down" to "improve" 2.0 mix-down quality in players... but don't get me started on that!

BTW, have any of you also noticed how many AC-3 soundtracks on laserdisc sound better than the corresponding soundtrack on the DVD? Guess what... the AC-3 encodings on LD rarely, if ever, used dialog norm. Ironic that even the lower-bit-rate AC-3 on laser tends to sound better than the (typical) DD on DVD!

Thanks for sharing that. I didn't know. Maybe at some point along the line Dolby firmed up the decoder specs? Though personally I think that a user-defeatable processing step like dialogue norm should always be allowed... similar to the "direct" or "pass through" option on many audiophile processors that bypasses EQ circuits etc.
 

GregK

Screenwriter
Joined
Nov 22, 2000
Messages
1,056
I don't think most of us are questioning the inclusion of Dial Norm in the DD spec, or its purposes, which have always been explained quite well by Dolby for those who ever bothered to really look. It is the ability to manually defeat dial norm that is the issue.

Should Dolby allow Dial Norm to be manually defeated again, then whenever Sony or whoever screws up their Dial Norm settings, or if I should decide I don't care for it at all... for *whatever* reason... the option would always be up to the individual. It doesn't harm the intended content, and out of the factory it could be defaulted to "on" for those who don't care. This is what Dolby should consider.
 

DaViD Boulet

Senior HTF Member
Joined
Feb 24, 1999
Messages
8,826
Exactly.

And since dialog norm is designed to "normalize" between several streamed programs sharing a continuous feed (like TV shows), a single-program item like a DVD movie title wouldn't incur any downside from manually defeating the feature on playback... listeners are used to making minor adjustments to the volume knob when switching from one DVD to another.
 

PeterTHX

Senior HTF Member
Joined
Dec 30, 2002
Messages
2,034

Somewhat, but I'm sure all of us have had the experience of menus BLASTING us out of the chair, or commentary tracks that are either too soft... or too loud (Starship Troopers, anyone?)... extras and such that are DD encoded yet not mixed by the original sound mixers, etc.

Look at the flak Warner got for the HD DVD menu sound levels.
 

DaViD Boulet

Senior HTF Member
Joined
Feb 24, 1999
Messages
8,826
Yes... menus never use dialogue norm, and that's the ONLY instance on a DVD that's ever warranted its use!

But why should a poorly mastered menu force you to compromise the *feature film*? Why not just have the techs at WB record a menu that's at the correct level to begin with (i.e., the real solution), or just use dialogue norm there?


Commentaries can also be recorded to level-match the feature film, or dialogue norm used only on the commentary track. That would work fine, since the film's greater dynamic range pushes its dialogue to a lower level anyway... it's the compressed, close-miked voice recording of the commentary that needs to be attenuated (not the film).

And naturally Dolby could just make the dialogue norm attenuation user-defeatable on playback so it would never matter... HT enthusiasts who want the purest sound could defeat it when they want to watch a movie high-end style, and leave it set as normal when they're in special-feature mode.
 

AndreGB

Stunt Coordinator
Joined
May 19, 2004
Messages
73
Orangeman, thank you for splitting the topics. ;)

David, have you ever considered being a teacher? Thank you for your explanation. :D Like I said, I've done some basic encodings on my own, but I have never used any processing (beyond what's required for lossy encoding) on my sources.

I understand now. Dial norm is basically what ReplayGain does to MP3. Some people were trying to make all MP3s play back at the same level, using a known "audiophile" reference level. The problem is that this reference level is sometimes too low, so it attenuates the MP3 too much. The attenuation itself is fine, but unfortunately Windows sounds are not attenuated along with it (and some Windows sounds are useful). And unfortunately ReplayGain doesn't restore the dynamic range killed by bad mastering. :frowning:
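Mechanically, a ReplayGain-aware player does the same kind of fixed multiply at decode time. A rough sketch (the -7.5 dB tag value here is made up for illustration):

```python
def replaygain_scale(gain_db: float) -> float:
    """Convert a ReplayGain tag value in dB to a linear sample multiplier."""
    return 10 ** (gain_db / 20.0)

scale = replaygain_scale(-7.5)         # hypothetical track-gain tag
sample = 20000                         # one 16-bit PCM sample
adjusted = max(-32768, min(32767, round(sample * scale)))  # clamp to 16-bit
print(f"{scale:.3f} -> {adjusted}")    # 0.422 -> 8434
```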

Now, I do think we should be able to turn off this feature on our decoders. I mean, why force it? Anyway, consider that you would be feeding the DD streams right into your receiver/processor. In that case it does make sense to leave it on in your decoder/processor (you might use it for your DVD, TV, etc). Now, why do studios use it on DVDs? David, have you talked to any other audio engineers and asked why they use dial norm on their DVDs? I'd like to know whether there is a good answer to that question too.

Now, can I take a guess at why dial norm, even though it is a simple operation on the PCM code, makes the sound worse? Because "digital amplification" is not like "analog amplification". I mean, the gain you apply using your receiver is not the same as making the PCM louder digitally. Your receiver's gain (the volume, that is) increases your amplifier's power output to your speakers. "Digital volume" is just a trick to make the sound louder; it is not capable of actually increasing the electrical energy output to your speakers. And that actual electrical energy (drawn from your power outlet) is what makes the sound stronger and bolder; it's what increases the sound-wave energy that makes your room shake. That's how dial norm ruins the experience: not because it changes the decoded PCM quality, but because it doesn't change the actual electrical energy applied by your amplifiers to the speakers.

Then I wonder: if you mess with your receiver/processor settings to change the channel gains a bit, will that make the dial norm effect less objectionable?

[EDIT] Poor English
 

AlexBC

Second Unit
Joined
May 1, 2003
Messages
259
Great insights Andre. I'd like to hear what others have to say about it.

The fact that DTHD still has dial norm as a default kind of proves my point: things usually aren't exactly what they're advertised as. There's always the fine print.

But if the defeat function becomes standard on all receivers or surround processors (or at least the high-end ones), then I guess we'll all be fine with it.

 

RobertR

Senior HTF Member
Joined
Dec 19, 1998
Messages
10,675
That makes no sense, Andre. It's the same as claiming that NO digital source can cause ANY amplifier to vary the electrical energy it sends to ANY speaker. That's obviously not true. The output of a D/A converter is a varying voltage, just like ANY analog source, and that voltage is determined by the digital numbers, which are always varying the volume regardless of the presence of dialnorm. The power amplifier stage sees NO difference between the output from an analog source and a digital one. Once things get to the amplifier, there is NO difference between "digital amplification" and "analog amplification". It's a meaningless distinction. Dialnorm doesn't change that process at ALL. And since you said there's no change in the PCM quality, it follows there's no change in the sound quality.
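To put numbers on that: attenuate in the digital domain, make it up with analog gain, and the waveform reaching the amplifier comes back identical except for requantization noise. A minimal sketch in Python, again assuming a 4 dB dialnorm offset and modeling the compensating analog gain as division by the same factor:

```python
import numpy as np

scale = 10 ** (-4.0 / 20.0)            # assumed 4 dB dialnorm attenuation

t = np.arange(48_000) / 48_000.0
signal = (20000 * np.sin(2 * np.pi * 1000 * t)).astype(np.int16)  # 1 kHz tone

path_a = signal.astype(np.float64)                 # no dialnorm, amp gain G
path_b = np.round(signal * scale) / scale          # dialnorm, amp gain G/scale

# the two "speaker" signals differ only by sub-LSB requantization noise
print(f"max difference: {np.abs(path_a - path_b).max():.2f} LSB")
```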
 

DaViD Boulet

Senior HTF Member
Joined
Feb 24, 1999
Messages
8,826
RobertR is correct about voltage etc. The way a digital volume control (like dialog norm) works is by recalculating the data, so the volume change is heard directly at the output of the D/A converter (as voltage).

But this comment builds one error on another:


Because the LPCM data *does* change. That's how "digital volume control" works... it recalculates each data point with a new value. Any non-zero change will cause every single sample word to take on a different value than it previously had. Aside from the question of how much degradation to the final sound occurs from data recalculation of this kind, as ChristopherDAC explained, dropping about 6 dB in level constitutes the loss of an entire bit of resolution at the 16-bit level... the native resolution of most DD encodings. Most tracks with dialog norm seem to drop between 3 and 6 dB (roughly half a bit to a full bit at the 16-bit level... dropping toward 15 effective bits of resolution). This doesn't change even if 24-bit filters are used (but it would change if the source data were 20- or 24-bit resolution to begin with).
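The arithmetic behind those bit counts, for anyone who wants to check (one bit of PCM resolution corresponds to about 6.02 dB):

```python
import math

def bits_lost(atten_db: float) -> float:
    """Effective bits of resolution consumed by a fixed attenuation."""
    return atten_db / (20 * math.log10(2))   # ~6.02 dB per bit

for db in (3, 4, 6):
    print(f"{db} dB -> {bits_lost(db):.2f} bits "
          f"(~{16 - bits_lost(db):.1f} effective bits from a 16-bit source)")
```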
 

PeterTHX

Senior HTF Member
Joined
Dec 30, 2002
Messages
2,034

Unfortunately, many studio masters are still 16/48. So any DTHD tracks would be limited likewise.
 

AndreGB

Stunt Coordinator
Joined
May 19, 2004
Messages
73
Wait, Alex: having dial norm is good for the reasons David posted before. Does "default" mean it's turned on at the encoder, then? Well, I hope not. That depends solely on Dolby.

Robert and David, you are right, but I probably didn't make myself clear, sorry. The DAC itself is not an amplifier, and that's the whole point I was trying to make. So even if you increase (or decrease) the perceived loudness digitally (i.e., before the DAC), you don't change the amount of energy the amplifier is sending to the speakers. I know PCM translates to voltages, but the "number of voltages" PCM can represent is defined by the number of bits devoted to each sample (16 bits for CDs, DVDs, etc). So digital amplification has a logical limit. You can make the PCM describe the highest-amplitude wave possible, but the energy that wave is going to have at the speaker is limited by the amplifier.

As a good example you can use your PC. If you have any speakers connected to it you can do this "test". Windows has a master volume control. Let's say you set it to maximum: Windows will recalculate every sound on your computer to be the loudest it can be before sending it to the DAC on your sound card, and you listen at a certain volume by setting the volume on your speakers. Now set Windows' master volume to 50%. You will have to increase the volume on your speakers to hear the same sound at the same level as before.

If digital amplification had any effect on the wave energy then the two situations I described above wouldn't be different. And they are. I can hear them both differently on my THX certified PC speakers (oh yeah - http://www.logitech.com/index.cfm/pr...CONTENTID=9586 :D).

I hope it is clearer now. ;)
 

RobertR

Senior HTF Member
Joined
Dec 19, 1998
Messages
10,675
So would LPCM, DTS-HD, and any other format, making talk of 24/96 etc. pretty meaningless.
 

DaViD Boulet

Senior HTF Member
Joined
Feb 24, 1999
Messages
8,826
Despite the noise floor of typical electronics, 20-bit and above tends to sound more natural to the ear than digital audio at the 16-bit level. Whether that's because it eases the task of D/A conversion, or because the smoother, more "sine-like" wave produced is perceived as more natural, is a common topic of discussion among audiophiles. But anyone who's spent any time with SACD or DVD-A knows that even on "mid-fi" gear the improvements are noticeable to an audiophile listener. I would agree that the improvement from 16/44.1 to 20/48 is probably more noticeable than the benefit of going from 20/48 to 24/96 etc.


I've heard different reports on this. Some engineers contend that they have many 16-bit masters, while others report doing almost all mixing/mastering at 20/48, with only the final down-res to 16 for consumer media.

Certainly as time marches on we'll get more and more 20/48 and above-res sources to encode!
 
