I’ve read your references to the problems encountered in producing Blu-Ray disks and your speculation that they are not able to reliably produce disks of more than 20GB.
Given your definitive statement, can you state assuredly that the release in question is on a 20GB disk?
Or is it just a less than satisfactory transfer, many of which we have seen over the years on all types of media?
Again, I did say it was possibly the old HD master syndrome. There are many factors in getting to such a lousy, soft transfer: possibly a lot of filtering along with bit starvation. One thing's for certain: the IP used was less than pristine. I don't think it had anything to do with being a Super35 film, as any number of other films shot that way have better, sharper transfers. Though the process does add too much grain.
I've seen brand new transfers from Sony and they too have an inordinate amount of gate weave, scratches, and dirt splotches. The worst of the lot of new titles I saw was Capote.
Also, Sony has in the past artificially lowered the quality of their "regular" DVDs in order for the Superbit versions to look even better. Could that be the case here too? Perhaps they have a deluxe version with a much better transfer waiting in the wings for BD-50... triple and quadruple dipping is the name of the game at Sony.
Could you please explain how applying a softening filter would decrease space requirements? It was my understanding that only compression, length, and resolution had an effect on size. I'd be interested to know if filtering can affect the size of the end product and how it works.
Edit:
I'd still appreciate a link to somewhere describing BR's "replication" problems and their use of 20 gigs; I still haven't been able to find one.
Thanks for the link Robert. Yet another reliable, anonymous, internet source. He states that “The first Sony titles are being limited to 20 Gig Single Layer. They can't get decent yields even on Single Layer without reducing the amount of data on the discs. By limiting the data on the discs to 20 gig or less, they prevent data from extending into the outer diameter of the disc which is the hardest part of a disc to read. This way, they improve their manufacturing yields just enough to get something out onto the store shelves. But 50% yields is still terrible. HD DVD gives manufacturers better than 95% yields. We like it to be closer to 97% or 98%, but 95% is pretty good for a first time around.”
Is this someone who knows, or someone who is quoting another source with no name and with no available background?
There is a whole thread devoted to replication discussion over on AVS.
The 20GB talk is still conjecture, as far as I know, although we should know soon just how full these first BD titles are.
Filtering does indeed aid in compression, by reducing the amount of fine detail that needs to be compressed and by enabling greater compression to be used. An image is made up of millions of pieces of information (pixels). If you soften or filter the image, you can cut the distinct detail down by several hundred thousand pixels (or whatever, just being general here). The fewer individual and distinct pixels you have at any given point, the more you can compress the file and the more space you free up.
Too much compression without a high enough bit rate, and the image breaks up into larger blocks (rather than really, really, really tiny blocks). That's why fast action (quick changes from frame to frame) on satellite or cable usually ends up breaking up.
Noise and film grain take up a lot of the bitstream when the encoding is done.
Pre-filtering and removing noise (i.e. softening) makes life easier for the encoder and requires a lot less space.
When too much softening and filtering is done, it affects the picture pretty clearly.
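The effect described above can be demonstrated with a toy sketch: compress a "grainy" row of pixel values and a softened copy of it, and compare the results. Everything here is illustrative, with zlib standing in for a real video codec's entropy coding; real encoders use far more sophisticated noise-reduction filters than a moving average.

```python
import random
import zlib

random.seed(0)
N = 4096

# A flat mid-gray "image row" with heavy film grain on top.
# The grain is random, so it is nearly incompressible.
noisy = [128 + random.randint(-40, 40) for _ in range(N)]

# A crude softening filter: a 9-tap moving average. Fine, random
# detail gets smoothed away, leaving much less to encode.
soft = []
for i in range(N):
    lo, hi = max(i - 4, 0), min(i + 5, N)
    soft.append(sum(noisy[lo:hi]) // (hi - lo))

raw_noisy = bytes(noisy)
raw_soft = bytes(soft)

# Uncompressed, both rows are exactly the same size...
print(len(raw_noisy), len(raw_soft))  # 4096 4096

# ...but the softened row compresses to far fewer bytes.
print(len(zlib.compress(raw_noisy)) > len(zlib.compress(raw_soft)))  # True
```

Note that the uncompressed sizes are identical; only after compression does filtering "save" space, which is why resolution and length alone don't tell the whole story.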
By the way - here is the link to the thread where the 20 Gig discussion took place. I had previously posted it somewhere, but fortunately I have it bookmarked these days.
This is the problem I have with Internet sources Rob. While I realize that you are just passing the information you have along, a person with no name (and who therefore can’t be checked as to his expertise) and with no title (engineer, VP, janitor or whatever?) who might work for Deluxe, is not really any kind of source.
The 20 gig limitation on BD is nothing more than speculation (and probably mostly FUD). Everyone can take a look at the source Rob_HD has provided and decide for themselves if this constitutes "fact" or rumor.
In any case, the first titles are released. It won't be long before someone can measure the file size of the data contained and determine if it exceeds the 20 gb space. Hopefully, the 20 gig rumor is nothing more than a rumor.
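For what it's worth, once a PC drive can read one of these discs, checking is trivial: just total up the sizes of the files on it. A rough sketch (the mount path is hypothetical; adjust for your system):

```python
import os

# Hypothetical mount point for the disc; adjust for your system.
DISC_ROOT = "/media/bluray"

def total_bytes(root):
    """Sum the sizes of every file under root."""
    total = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            total += os.path.getsize(os.path.join(dirpath, name))
    return total

if __name__ == "__main__":
    size = total_bytes(DISC_ROOT)
    print(f"{size / 10**9:.2f} GB used; exceeds 20 GB: {size > 20 * 10**9}")
```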
BTW, Rob_HD is correct about the common practice (especially with DVD) of filtering out high-frequency detail to ease compression. It's something that can be a problem for both MPEG2 and VC1 encoded HD titles as well, though because VC1 is more efficient it stands a better chance of leaving the source's natural detail intact.
Sony may have filtered the 5E, or they might have just had a pretty lame master. Either way, if the film-source is sharper and more detailed, then there's no excuse. Hopefully we'll get confirmation.
Oh, to clarify, no one who has seen both and compared side-by-side the HBO HD and BD has said the BD is softer than the HBO version. One individual said that he thought the HD HBO image was "sharp" but he had not even seen the BD image so no comparison could be drawn from his statement.
Peter,
Often the practice of Bobbing reduces the effective vertical resolution to only 1/2 the 1080... since it just "averages" between the odd/even line pairs. That's what was meant by the 540 comment... that means of deinterlacing reduces the effective vertical resolution below the 1080 limit whereas proper frame reconstruction provides the original full 1080 resolution.
It's one reason why older "line doublers" for 480i video would reduce scan-line visibility but were often criticized for softening the picture (reducing to only 240 effective lines via simple averaging).
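A rough sketch of the difference, using a synthetic "frame" whose lines alternate black and white (the finest vertical detail 1080 lines can carry). Weave reconstruction recovers the original frame; a simple bob from one field cannot:

```python
# A frame whose vertical detail alternates every line: even lines
# white (255), odd lines black (0). This is the finest vertical
# detail a 1080-line frame can carry.
HEIGHT = 1080
frame = [255 if y % 2 == 0 else 0 for y in range(HEIGHT)]

# Interlacing splits the frame into two 540-line fields.
top_field = frame[0::2]      # even lines (all 255)
bottom_field = frame[1::2]   # odd lines (all 0)

# "Weave": re-interleave both fields -> full 1080 lines, detail intact.
weave = [0] * HEIGHT
weave[0::2] = top_field
weave[1::2] = bottom_field

# "Bob": rebuild the frame from ONE field by repeating its lines.
bob = []
for line in top_field:
    bob.extend([line, line])  # each field line fills two frame lines

print(weave == frame)  # True: full vertical detail preserved
print(bob == frame)    # False: the alternating detail is gone entirely
```

The bobbed frame comes out as a uniform gray-less block of one value, i.e. the line-to-line detail has been averaged/repeated away, which is exactly the "540 effective lines" complaint.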
I'm having a very hard time with the "BR will be limited to 20 gigabytes" now.
Approaching the subject logically, I'm considering a disc. For there to be problems with yields, there must be some kind of "corruption" of the discs. It would be *extremely* unusual for said corruption to occur only on the outer rings of a disc, rather than throughout the entire disc, unless there was a problem with the machine, which would simply need to be readjusted or redesigned. Corruption that is uniform across the disc is understandable; it would be a problem with the materials used, and probably difficult to solve. Corruption limited only to the outer rings is more indicative of a manufacturing fault, something much easier to solve, as you'd just need to change whatever was influencing the problem in the outer rings.
The source in question makes his information that much more difficult to believe with his claim that their solution is to "eliminate the outer rings because they're the hardest to read." Unlikely at best; the only issue with the outer rings is that the player must speed up its rotation speed in order to retain the same transfer rate, due to the increasing diameter of the rings. They're not harder: because of the difference in diameter, the data is being read at the same speed, and seen at the same speed, as on the inner rings.
I'm calling shenanigans on it now. I could see it being conceivable up until the point where the claim was made that the outer rings are more difficult to read.
But this is not how it's done for BD, or even for 1080i output of HD DVD. The image is "upsampled" to an interlaced 60-field signal and then converted to a progressive signal the same way a DVD player turns 480i into 480p.
Just to clarify, Ryan: the disk always rotates at the same speed. This means that the data goes by more quickly under the laser on the outer edge than on the inner part of the disk. However, the bits are spaced further apart the further from the center of the disk the data resides.
This means that the laser always reads the data the same.
This has been true all the way back to (at least) 78 RPM phonograph records: even though the physical grooves were traveling more quickly at the edge of the record, the analog information there was spread out, so the mechanical pickup always read the information at a constant rate, no matter where on the record it sat.
The result? Your last sentence is entirely correct.
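A quick numerical sketch of the geometry described above (the numbers are made up; only the proportionality matters). With constant rotation and bit spacing growing in proportion to radius, the radius cancels out of the read rate, so inner and outer rings read at the same rate:

```python
import math

# Illustrative numbers only: a constant rotation speed, and a bit
# spacing assumed to grow in proportion to radius, per the post above.
REVS_PER_SEC = 10.0
SPACING_PER_MM = 1e-4   # bit spacing (mm) per mm of radius

def read_rate(radius_mm):
    """Bits per second passing the laser at a given radius."""
    linear_speed = 2 * math.pi * radius_mm * REVS_PER_SEC  # mm/s at that radius
    bit_spacing = SPACING_PER_MM * radius_mm               # mm between bits
    return linear_speed / bit_spacing                      # bits/s

# The radius cancels: same rate near the hub (25 mm) and at the rim (58 mm).
print(read_rate(25.0), read_rate(58.0))
```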