Bjoern Roy
In this thread, I raised the issue of bitrate fetishism. I couldn't help myself and spawned a full-fledged rant.
[rant on]
This post is directed at the recent trend in the community to blame everything on bitrate:
Title has too much EE? "Bitrate must have been too low."
Title too dark, bad shadow detail? "Higher bitrate, fewer extras, would have improved this."
Picture soft and filtered? "Why couldn't they have spread the transfer over 2 discs..."
Muted, pale colors, too dark, bad color balance? "Oh, if only this were a Superbit transfer."
Same in the audio domain: soundtrack has no ultra-low bass extension? "If only it had been full-bitrate DTS."
Give me a break.
Just read some of the recent threads and you will understand what I mean. After the Superbit titles, some people seem to think that higher bitrate alone is a cure for everything.
Everything else being equal, a higher video bitrate does mainly one thing: it gets rid of compression artefacts (blocking, mosquito noise, etc.). If the bitrate is very, very low, detail can suffer too, but not nearly as much as it already has through too much pre-filtering.
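To see why it's the filtering, not the bitrate, that governs detail, here is a toy sketch in Python (my own illustration, nothing to do with real MPEG-2 internals; all the names are invented). It treats a scanline as a 1D signal, counts how many frequency components an encoder would have to spend bits on, and shows that pre-filtering is what removes the expensive high-frequency detail:

```python
import math

def dft_magnitudes(signal):
    """Naive DFT; returns the magnitude of each frequency bin."""
    n = len(signal)
    mags = []
    for k in range(n):
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(-signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mags.append(math.hypot(re, im))
    return mags

def significant_bins(signal, threshold=1.0):
    """Frequency bins an encoder would actually have to spend bits on."""
    return sum(1 for m in dft_magnitudes(signal) if m > threshold)

def prefilter(signal, taps=5):
    """Crude low-pass: a circular moving average (the 'softening' filter)."""
    n = len(signal)
    half = taps // 2
    return [sum(signal[(i + j) % n] for j in range(-half, half + 1)) / taps
            for i in range(n)]

# A "sharp" scanline: hard edges, i.e. lots of high-frequency energy.
sharp = ([0.0] * 16 + [1.0] * 16) * 2
soft = prefilter(sharp)  # the pre-filtered, "studio-softened" version

print(significant_bins(sharp), significant_bins(soft))
```

The filtered line needs far fewer coefficients, which is exactly why a softened transfer survives 3.5Mbit: the expensive detail was thrown away before the encoder ever saw it.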
People seem to think the line of reasoning is this:
Bad transfer in whatever way (contrast, colors, soft, EE...) -> increase bitrate -> perfect
Eh, no!
Let's say a transfer has a 3.5Mbit average bitrate, and its characteristics are:
- Picture is a little soft, misses fine detail
- Shadow detail/delineation is not perfect
- Black level a bit too hot (10-15IRE instead of 7.5)
- Colors a bit dull and muted
- Too much EE
- Almost no compression artefacts, only a few in darker scenes
Sounds like the average Col/Tri transfer, right?
Now, say you want to make a more detailed version of this transfer. For that, you would have to pre-filter the picture to a lesser degree. The result is a sharper picture; the softness is gone, hooray. But now your picture shows lots of compression artefacts. Why? Because 3.5Mbit is not enough to tame the additional information/detail without artefacts. So in turn, you need to raise the bitrate until those compression artefacts are reduced to an acceptable level again. You might end up at a 7-8Mbit average.
So you don't raise the detail BY increasing the bitrate. It's the other way around: you increase detail (by filtering less), and then you NEED to increase the bitrate to get compression artefacts back to an acceptable level. If you simply increased the bitrate without changing anything else, the transfer would look the same.
Note that all the other characteristics of the 3.5Mbit transfer above (colors, shadows, EE) are NOT really affected by this process!
So now you have a 7-8Mbit transfer that has these characteristics:
- Picture now very detailed
- Shadow detail/delineation still not perfect
- Black level still too hot
- Colors still a bit dull and muted
- Still too much EE, although the different filtering might have changed its character (thinner halos, etc.)
- Still a few compression artefacts, probably even more than in the 3.5Mbit version
Air Force One fits this scenario perfectly, with the upper list describing the normal release and the lower list the Superbit release.
And the Superbit version does indeed have more compression artefacts in some scenes than the original release. If you understand what I wrote above, this shouldn't be a surprise.
On the normally filtered, soft non-SB release of a particular title, a 3.5Mbit average bitrate might have been enough to tame compression artefacts. When the SB version runs the detail at full throttle, a 10Mbit average bitrate might be necessary to avoid compression artefacts. If you can only spend 7.5Mbit, you will end up with a sharper picture, but more artefacts.
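That trade-off can be mimicked with a toy frequency-domain "encoder" in Python (again my own sketch, not how MPEG-2 actually works): a fixed "bit budget" keeps only the largest frequency coefficients, and the peak reconstruction error stands in for visible artefacts. At the same budget, the detailed signal shows more artefacts than the filtered one, and only a bigger budget tames it again:

```python
import cmath

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)).real / n
            for t in range(n)]

def encode(x, budget):
    """Keep only the `budget` largest coefficients -- a crude stand-in for
    spending a fixed number of bits on the frame."""
    X = dft(x)
    keep = set(sorted(range(len(X)), key=lambda k: -abs(X[k]))[:budget])
    return [X[k] if k in keep else 0 for k in range(len(X))]

def artefact_level(x, budget):
    """Peak reconstruction error = how visible the compression artefacts are."""
    rec = idft(encode(x, budget))
    return max(abs(a - b) for a, b in zip(x, rec))

def prefilter(x, taps=5):
    """The studio's softening filter: a circular moving average."""
    n, half = len(x), taps // 2
    return [sum(x[(i + j) % n] for j in range(-half, half + 1)) / taps
            for i in range(n)]

sharp = ([0.0] * 16 + [1.0] * 16) * 2   # detailed, unfiltered "scanline"
soft = prefilter(sharp)                 # the filtered non-SB version

# Same budget: the detailed version shows far worse artefacts.
print(artefact_level(sharp, 7), artefact_level(soft, 7))
# Raising the budget is what tames the detailed version again.
print(artefact_level(sharp, 17))
```

The numbers are arbitrary, but the shape of the result is the point: detail is only "free" if you pay for it in bitrate, and the soft version never needed the bits in the first place.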
Take a look at Harrison Ford's first speech in Air Force One to see a Superbit scene that is highly detailed but exhibits compression artefacts.
[rant off]
Regards
Bjoern