
4K UHD discs of movies shot on film are being created from the 2K digital intermediate?

JoshZ

Senior HTF Member
Joined
May 26, 2012
Messages
2,300
Location
Boston
Real Name
Joshua Zyber
Beautify, thanks for the info. I need to catch that on disc.
DEAKINS. ✊🏼✊🏼✊🏼
My Hero.

I wouldn't rush to lionize him. There was quite a bit of revisionism in the new DI for O Brother. Deakins completely changed the look of the color grading, and basically undid the (in his own words from the original production, as you can hear about 45 seconds into the video Oliver posted) "very dry and dusty" look that was the whole rationale for using digital color grading in the first place. By the time the Blu-ray was released, he decided he was over that, and gave all the colors a boost.
 

John Dirk

Premium
Ambassador
HW Reviewer
Senior HTF Member
Joined
May 7, 2000
Messages
6,746
Location
ATL
Real Name
JOHN
I propose a new rule for this thread. :)

"Anyone introducing an acronym must define it." Retroactively, I believe our latest offender is @Worth . What does DCP mean, please. :D
 

Josh Steinberg

Premium
Reviewer
Senior HTF Member
Joined
Jun 10, 2003
Messages
26,388
Real Name
Josh Steinberg
Yes.

Some of this race to add more pixels is being driven by the same hardware-manufacturing ethos that wants us all to buy new phones, computers, and TVs every year - manufacturers pushing changes that no one actually using the equipment is asking for.

Setting that aside, there’s the practical reality of how often it’s reasonable to ask everyone in the industry to overhaul the entire infrastructure. Before the transition to digital in the 2000s, the 35mm standard with sync sound held for nearly a hundred years. As long as they were properly maintained, like any other mechanical device, a projector from 1940 could play a movie from 1990.

Theaters (with the financial aid of studios) spent fortunes changing out those 35mm projectors for digital ones at the dawn of this century, and it probably took a decade to complete that rollout. The cost of those upgrades was high, and many theaters (especially independent ones) were forced out of business because the cost of the new equipment exceeded their profit margins - in the blink of an eye, they were pushed out of the business they had built. Now, about twenty years later, the industry is in the middle of another upgrade, swapping out 2K projectors illuminated by xenon bulbs for 4K projectors, many illuminated by laser rather than traditional bulbs. AMC and Regal, the major chains, are in the middle of this transition and expect to complete it by the end of this decade.

So in that context - setting aside specialty installations like theme parks - 8K is sort of crazy talk. Theaters haven't even finished switching to 4K projection; the idea of pushing them to suddenly go 8K isn't realistic. And the infrastructure to produce content at that resolution doesn't exist. With today's Hollywood productions compressing shooting and postproduction windows to keep franchise films coming on ever-tighter schedules, the limits on available computing power, artists, and technicians mean that even movies delivered as 4K DIs often still have their visual effects work done at 2K.

I think, with respect, that the OP is getting a little tripped up over the number of Ks, and sort of missing the forest for the trees. For the first hundred years of moviemaking, the form a finished movie would be delivered in was a 35mm negative. Movies that were made that way are now scanned in 4K when 4K UHD discs are made.

But for the past 20-30 years, the form a finished movie would be delivered in was (and in many cases, still is) a 2K DI. Every choice the filmmakers made - film stocks, lighting, makeup, how sets were built, which locations were picked, special effects - was made with the idea that the 2K DI was the final product. These movies were not intended to be viewed from their original camera negatives. The original camera negatives on these films cannot be considered reference points for how the movies are supposed to look. The OP laments that there is visual information on the negative that is not captured by the 2K DI - which is true - but that's irrelevant here because the filmmakers chose a production workflow where that information was never intended to be part of their finished movie.

The 2K DI is its own format, with its own set of characteristics, its own visual language. There is no point to holding it to a different standard than it was ever meant to be held to. Otherwise you’re sort of blaming an elephant for not being a giraffe - they’re not the same beast so how is it helpful to demand one behave like the other?

Or maybe use a food metaphor. A loaf of bread is made from many different ingredients. If the recipe calls for two cups of flour (flour being in this metaphor the camera negative), it doesn’t matter if the flour is being taken from a bag that holds four cups of flour because the recipe only calls for two. The amount of flour the bag holds isn’t relevant to how much the recipe calls for.

I think this is the point the OP is having difficulty accepting - that with modern movies created using the DI process, it is the finished DI that is the final product and reference point for how the movie is supposed to look.

And as you’ve said about whether standards should evolve past the 2K DI, they are and will continue to do so, at speeds that I don’t think anyone can truly predict. But the movies made as 2K DIs will always be 2K DIs, just the way movies made in black & white are black & white movies. We generally frown upon colorization as altering the filmmakers’ intentions. We should think about altering completed 2K DIs the same way.
 

Stephen_J_H

All Things Film Junkie
Senior HTF Member
Joined
Jul 30, 2003
Messages
7,898
Location
North of the 49th
Real Name
Stephen J. Hill
Interesting...

If I read the specs correctly, the JPEG 2000 compression format maxes out at 4K, 30FPS, so 8K DIs would require new standards.
Correct, but there's nothing earth-shattering about that. Nearly every new home format with an increase in resolution has required a newer, more efficient codec. MPEG-2 was fine for DVD, but could be fickle with BD, so the H.264 AVC and VC-1 codecs were developed. 4K UHD didn't even bother with AVC and all UHD discs use H.265 HEVC. They'll probably develop [forgive my wording and acronym use] an "H.266 UEVC" should 8K become a consumer format, and the DI/DCP format will be something like JPEG 3000.
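As a crude illustration of why the codec has to keep evolving alongside the pixel count, here's a quick back-of-envelope sketch (my own numbers, assuming 10-bit 4:2:0 at 24 fps for illustration only, not anything from the actual disc specs):

```python
# Rough, uncompressed-bitrate arithmetic to show why each jump in resolution
# has been paired with a more efficient codec. The 10-bit 4:2:0, 24 fps
# assumptions are illustrative, not taken from any disc specification.

BITS_PER_PIXEL = 10 * 1.5   # 10 bits/sample, 4:2:0 chroma = 1.5 samples/pixel
FPS = 24

formats = {
    "1080p Blu-ray": (1920, 1080),
    "4K UHD":        (3840, 2160),
    "8K":            (7680, 4320),
}

for name, (width, height) in formats.items():
    gbps = width * height * BITS_PER_PIXEL * FPS / 1e9
    print(f"{name:14s} ~{gbps:5.2f} Gbit/s before compression")
```

Each step roughly quadruples the raw payload, so the compression side has to get smarter just to keep bitrates and disc sizes manageable.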
 

Saul Pincus

Auditioning
Joined
Jul 20, 2023
Messages
14
Real Name
Saul Pincus
Though we're primarily concerned with the aesthetic value of imagery and sound on this forum, it's worth noting that the push to create digital intermediates in the first place – as well as digital acquisition and projection – was primarily a financial one. Lucas was concerned with the creative potential, for sure, but he was also keen on the savings. Once studios saw what was possible, they moved to standardize DIs and "evolve" the distribution chain.

You won't hear me making the case that I'd rather see Lawrence of Arabia in 2K – I'll still be happiest with a 70mm print. Same with 70mm blowups made photochemically from cut negative. But the 70mm prints of titles like Wonder Woman and even Tarantino's Once Upon A Time in Hollywood? Those came from digital intermediates. And they didn't look worth the effort to my eyes.

The point of the DI process in the first place was to make a lot of the hidden steps in post-production (in the finishing of a movie) less arduous, more predictable, more efficient, and more affordable. That's because there can be dozens of different formats on a given distributor's list, and each version, in every format, needs to look like the others. The digital realm makes consistency across these deliverables achievable in great numbers.

The downside is that consistent quality in projection still isn't guaranteed – but that's always been true.

The even greater downside, as some of my heroes on this forum know all too well, is the moving target of long-term storage of digital assets.
 
Last edited:

John Dirk

Premium
Ambassador
HW Reviewer
Senior HTF Member
Joined
May 7, 2000
Messages
6,746
Location
ATL
Real Name
JOHN
Again, for me anyway, this thread has taken on a new life as an instructional tool. Sincere thanks to all who have contributed thus far. Given what I've read, I will say this. You can't blame enthusiasts for wanting "bigger, faster, better," represented here in the form of increased resolution, etc. The hobby we all love is fueled by hardware manufacturers and studios who consistently lead us to believe it's a realistic expectation and something we should be investing in. When we examine the nuts and bolts only to find there's essentially nothing to it, I believe it's reasonable to feel misled.
 

Josh Steinberg

Premium
Reviewer
Senior HTF Member
Joined
Jun 10, 2003
Messages
26,388
Real Name
Josh Steinberg
When we examine the nuts and bolts only to find there's essentially nothing to it, I believe it's reasonable to feel misled.

I totally get this.

I know it’s ancient history at this point, but I remember having a lot of discussions on this forum a decade ago, when consumer 4K was being rolled out, where I had more or less taken the position that it was too soon to be putting out another consumer format - that given the way the industry was making movies at the time, the hardware specs were zooming past what the actual software being made could justify.

I do think it’s important to note that many 4K discs are coming from genuine 4K sources, when the actual master format can justify that. Anything that had a film negative as a final master source - basically all movies made before about 1998 - is coming on 4K disc at genuine 4K resolution. Anything that had a 4K DI as a final master source is coming to 4K disc at genuine 4K resolution.

The only instances where 2K DIs are being used in the creation of 4K discs are instances where the finished 2K DI is the master source. In those cases, that’s just how the movie was made. It is what it is. Those instances of putting 2K DIs onto a 4K disc are not completely without merit. The 1080p Blu-ray format specification has a more limited color gamut than the actual 2K DI, so even if the Blu-ray is capturing all of the resolution on the 2K DI, it’s not necessarily getting all of the other information available to the fullest extent. On the other hand, the 4K disc is capable of matching the color gamut of a 2K DI, allowing it to be a more transparent representation of the source material.
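To put a rough number on that gamut gap, here's a small sketch (my own illustration; the primaries are the published ones for each standard, but comparing triangle areas in xy space is admittedly a blunt instrument, since that space isn't perceptually uniform). A 2K DI for theatrical release is typically graded in DCI-P3, Blu-ray is limited to Rec.709, and UHD discs carry color in a Rec.2020 container:

```python
# Rough comparison of how much chromaticity area each standard's primaries
# enclose, using the shoelace formula on the CIE 1931 xy primaries.
# A crude proxy for "how much more color" the wider standards can describe.

def triangle_area(pts):
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

gamuts = {
    "Rec.709 (Blu-ray)":        [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3 (digital cinema)":  [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020 (UHD container)": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

reference = triangle_area(gamuts["Rec.709 (Blu-ray)"])
for name, primaries in gamuts.items():
    ratio = triangle_area(primaries) / reference
    print(f"{name:26s} ~{ratio:.2f}x the Rec.709 triangle")
```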

There has always been a tension between the hardware and software camps, as well as the priorities of individual consumers. It’s the age old question of, “Did you buy your stereo to play your records, or do you buy records to play your stereo?” In other words, is your priority about showing off the maximum extent of what your hardware can do at all times, or is your priority to have the hardware capability to be able to accurately play back whatever you put on it? Those two separate camps have always existed and at a certain point, there’s no way to completely reconcile those different approaches.

I don’t think you can necessarily say one approach is objectively right and one is objectively wrong. For our specific community here at HTF, I always look back to our mission statement for guidance, and that mission statement speaks towards wanting to view movies as close as possible to how they were originally presented in theaters, and as the filmmakers wish their work to be seen - and in the cases where those two things are not one and the same, our mission statement leans toward honoring the wishes of the filmmakers. (Which is why, for instance, I wasn’t up in arms with the recent 4K release of “American Graffiti” even though it doesn’t look a thing like a 1973 movie shot on film would have looked - it does, however, look exactly as its director wishes it to look on modern displays, and as the author of that work, I respect his right to have a say in that choice, even if it is one that will make some people unhappy. I sympathize with those unhappy viewers and completely acknowledge their point that the movie doesn’t look now as it once did, while respecting that for better or worse, it’s not up to us laypeople to make those decisions.)

Where I respectfully disagree with our OP, Wes, is in the idea that it is “disturbing” that 2K DI masters are being used for 4K discs when that is the highest quality element available that represents the intended final product the filmmakers meant to create.

It’s nearly always been the case historically that there’s more information on the camera negative than the audience was meant to see. For example, nearly every non-anamorphic widescreen movie shot on film (that is, most movies with 1.85:1 aspect ratios) was actually shot on a film negative that captured a 1.37:1 frame - an image technically taller than 1.85, with more information visible at the top and bottom of the frame than was seen in projection. It was industry standard practice to acquire the image on a taller frame and then matte it to the proper aspect ratio in projection. We watch these movies on disc now in 1.85:1, as they were meant to be seen. The fact that the filmmakers never intended the audience to see the additional information on the camera negative is, in and of itself, what makes that extra information irrelevant. That the extra information exists at all is a fluke of how those formats were created and standardized, not something that is being taken from the audience.
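To put a number on how much "extra" picture that matting hides, here's a quick sketch (my simplified arithmetic, under the assumption that the full negative width is used and only the height is trimmed):

```python
# How much of a 1.37:1 camera frame survives a 1.85:1 theatrical matte,
# assuming the full negative width is used and only the height is cropped.

camera_ar = 1.37   # aspect ratio of the exposed Academy frame
matted_ar = 1.85   # aspect ratio the audience was meant to see

visible_height = camera_ar / matted_ar
print(f"Visible in projection:  {visible_height:.0%} of the frame height")
print(f"Never meant to be seen: {1 - visible_height:.0%} (matted off top/bottom)")
```

By that arithmetic, roughly a quarter of the frame height on the negative was never intended to reach the screen.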

Or, to use a completely different kind of example, look at a series of very popular movies shot on film and completed as 2K DIs: the Harry Potter movies. Those movies all have a very striking, very unique visual palette that was only possible to achieve in the digital realm, a look that was very specifically designed and intended. If you were to go back and look at what the camera negative captured, it would look nothing like what the finished movies look like. It seems obvious that the look that was achieved using the 2K DI as the master format is the correct look for those movies. If we were given the ability to go back to the film negatives and recreate those movies using how the film negative looks as a guide, we would be undoing every creative choice the filmmakers made over a decade. Surely that cannot be the correct approach in that case.

These questions, debates, and any confusion arising from them were probably inevitable once the hardware manufacturers started creating equipment that exceeded the specifications of the environments in which these movies were originally made. Even a relatively straightforward scenario - a pre-digital era movie shot and completed on 35mm film, with 4K worth of information available on the camera negative that can be scanned at 4K resolution - is less simple than it might seem. For one thing, 35mm projection prints had about 720p worth of resolution (or less) thanks to generational loss from the process of making a film print from a negative, and the filmmakers were well aware of that and used it to their advantage as part of their process. Start scanning the original negative and suddenly you see things the filmmakers never imagined it would be possible for you to see, anything from the artificiality of makeup and wigs to a matte painting that looked like a real landscape on a film print but now looks like the painting it is. Seams from special effects, wires holding models in place, cables hoisting performers in the air, the presence of stunt doubles no longer appearing to be the lead actor they were doubling for. Technicians transferring older movies to 4K today are faced with all kinds of choices that were just inconceivable when the movie was made: what they should show, and what they should try to “massage” to hide, in order to retain the illusions the original filmmakers created.

There aren’t necessarily any easy answers.

But that’s why I think it’s important to always keep in mind that the original camera negative in and of itself is not the movie - the original camera negative is a vital part of the recipe that helps you create the movie.
 

Lord Dalek

Senior HTF Member
Joined
Apr 4, 2005
Messages
7,107
Real Name
Joel Henderson
If we're having that open matte discussion again...
[Four screenshots attached]
 

Wes Candela

Visual Storytelling Enthusiast
Premium
Joined
Nov 2, 2012
Messages
492
Location
New York, NY
Real Name
Wes Candela
I wouldn't rush to lionize him. There was quite a bit of revisionism in the new DI for O Brother. Deakins completely changed the look of the color grading, and basically undid the (in his own words from the original production, as you can hear about 45 seconds into the video Oliver posted) "very dry and dusty" look that was the whole rationale for using digital color grading in the first place. By the time the Blu-ray was released, he decided he was over that, and gave all the colors a boost.
I wouldn't rush to lionize him.
Oh... but I do. He's a photography godsend.
Maybe in this instance he ruined the original look of the film, but what he has created visually in the world of film has been above and beyond phenomenal,
IMO
(in my opinion).
 

Wes Candela

Visual Storytelling Enthusiast
Premium
Joined
Nov 2, 2012
Messages
492
Location
New York, NY
Real Name
Wes Candela
Perhaps Wes is disheartened because so many of the newer Warner Archive classics are advertised as being from a scan of the original camera negative. Of course, these films were made long before modern editing techniques, where the concept of a digital internegative didn't even exist as a step in the editing process.
Rich, you're nailing it - thanks.
Honestly, I'm just disheartened because I should have known better; digital editing means editing something digitally.

I just put two and two together and realized exactly what you just said: that 4K releases were being struck from 2K sources.
I think it's my own fault for not realizing this step in the process, but that is exactly why I am disheartened. Thanks for that - very well put.
 

Wes Candela

Visual Storytelling Enthusiast
Premium
Joined
Nov 2, 2012
Messages
492
Location
New York, NY
Real Name
Wes Candela
You also have to keep in mind that a 2K digital theatrical presentation is sharper than a projected 35mm release print. Films shot and completed on 35mm film are generally thought to have somewhere between 3-4K of real image detail - Kodak argues for 6K, but that's only for the newest stocks and it's in their best interest to estimate high.

At any rate, that only applies to the original camera negative, which no one ever sees or was meant to see. A print you'd see in a cinema is several generations removed from that, so you end up with something much closer to 2K. Factor in the mechanics of film projection and you're down to a perceivable resolution of something closer to 720p.

There's an argument to be made that older films really shouldn't be seen in 4K. The filmmakers knew that generation loss and projection would hide the seams in things like make-up, sets and stunt doubles. All of those things are much more obvious in newer 4K masters than they ever were in original release prints.
Yeah, I disagree with this statement only because I’ve had this discussion quite a few times.
I do not believe there is any definitive way to calculate how many pixels are in a 35mm image; that being said, sources say it’s approximately 20 million pixels’ worth of data.


2K was beautiful when that’s all we had, but we have more now - our televisions can display 4K and 8K resolution.

And the beautiful, beautiful, beautiful thing about celluloid is that the better the quality of the negative and the higher the quality of the scan, the more data we can retrieve from 35mm film.

Now, I believe we are in the infancy of the digital revolution. The same reason people are listening to records again is the same reason I am upset that 4K discs are being created from 2K scans.

We deserve the best technology can offer if we’re told that’s what we’re getting, but like everything, you need to read the fine print.

I did not, and so coming to the realization that some of the films over the last 15 years have been put on 4K disc from a 2K intermediate is disturbing to me,

simply because I believe we should have the most information possible. And it is misleading to say “hey, watch this movie in 4K”

when really what you’re doing, in some cases, is watching a 2K image blown up to 4K resolution.

And it is those cases that surprised me, until the other day when I came to realize this was happening.

Yes, you are seeing much more than the directors intended you to see, or ever thought you would.

But how many directors from the 1950s through the 1990s knew we would be watching 85-inch televisions at home?

There’s also the issue of color: because of the greater color gamut of 4K, I want to know we are seeing those colors,
and I want to know we’re seeing as much detail as possible, which is why I get so excited at the mention of

“scanned from the original camera negative.”

This is just me talking, but if 4K is the new standard, I would like to see true 4K images on the television I paid money for.

However, I digress.

Also, I don’t want to see a 2K film projected onto a movie theater screen.

I was just discussing this with my cousin, and he said he’d bet the reason walking out of an IMAX 70mm show is so breathtaking is that we’ve gotten used to seeing films in the theater at a capped resolution of 2K more than we realized.


But again, it’s just my opinion.

35mm film can yield up to 20 million pixels,

way beyond 2K.

Let’s go for it.
 
Last edited:

Josh Steinberg

Premium
Reviewer
Senior HTF Member
Joined
Jun 10, 2003
Messages
26,388
Real Name
Josh Steinberg
A well-shot 35mm camera negative exposed in optimal conditions with optimal lenses is something between 3K and 4K.

A 35mm projection print, as seen by audiences in commercial theaters, is much closer to 720p.
 

JoshZ

Senior HTF Member
Joined
May 26, 2012
Messages
2,300
Location
Boston
Real Name
Joshua Zyber
For one thing, 35mm projection prints had about 720p worth of resolution (or less) thanks to generational loss from the process of making a film print from a negative, and the filmmakers were well aware of that and used it to their advantage as part of their process. Start scanning the original negative and suddenly you see things the filmmakers never imagined it would be possible for you to see, anything from the artificiality of makeup and wigs to a matte painting that looked like a real landscape on a film print but now looks like the painting it is. Seams from special effects, wires holding models in place, cables hoisting performers in the air, the presence of stunt doubles no longer appearing to be the lead actor they were doubling for. Technicians transferring older movies to 4K today are faced with all kinds of choices that were just inconceivable when the movie was made: what they should show, and what they should try to “massage” to hide, in order to retain the illusions the original filmmakers created.

At an even more fundamental level is the texture and visibility of film grain. On countless UHD copies of 35mm movies, the main discernible difference between the 4K and Blu-ray editions is a coarser and more distracting emphasis of the grain texture, well beyond how it would have looked during projection of a theatrical print, and no doubt well beyond anything the filmmakers intended.
 

Josh Steinberg

Premium
Reviewer
Senior HTF Member
Joined
Jun 10, 2003
Messages
26,388
Real Name
Josh Steinberg
At an even more fundamental level is the texture and visibility of film grain. On countless UHD copies of 35mm movies, the main discernible difference between the 4K and Blu-ray editions is a coarser and more distracting emphasis of the grain texture, well beyond how it would have looked during projection of a theatrical print, and no doubt well beyond anything the filmmakers intended.

I find it helpful to think of home media releases of non-digital productions as “translations”. It may not be as extreme a disparity as translating from, say, French to English, but film and digital are different mediums and I believe it makes sense to appreciate them as such.

I don’t disagree with Wes about wanting high standards, I don’t disagree that movies completed in the analog realm as cut negative should be scanned in 4K for 4K releases. I only disagree (courteously) in the specific example of 2K DIs being redone because once the filmmakers on any given project made the choice to use the 2K DI as their master format, that became the final product, and my belief is that choice should be respected.
 
Last edited:

Worth

Senior HTF Member
Joined
Jul 17, 2009
Messages
5,258
Real Name
Nick Dobbs
Yeah, I disagree with this statement only because I’ve had this discussion quite a few times.
I do not believe there is any definitive way to calculate how many pixels are in a 35mm image; that being said, sources say it’s approximately 20 million pixels’ worth of data...

35mm film can yield up to 20 million pixels,

way beyond 2K.

Well, film doesn't have a fixed resolution, so you can scan it at whatever pixel count you want, but there's a point of diminishing returns. For 35mm motion picture film, that's generally considered to be between 3-4K. Kodak claims 6K for newer stocks.

https://www.kodak.com/content/products-brochures/Film/Capturing-Information-on-Film.pdf

A single frame of color film scanned at 4K by 3K resolution with 10-bit depth contains about 50 megabytes of data. However, there is actually a lot more information than that on each frame of 35mm film. We have conducted tests where we have scanned film at 6K by 4K resolution at 10-bit depth, resulting in about 100 megabytes of data, or twice as much image information.
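As a rough sanity check on those figures (a minimal sketch; the exact scan dimensions are my assumption, since the brochure only says "4K by 3K" and "6K by 4K"), the arithmetic lands close to Kodak's 50 MB and 100 MB numbers, and it also puts the "20 million pixels" figure in context - it falls between a 4K and a 6K scan:

```python
# Sanity check on the Kodak figures quoted above, assuming full-aperture scan
# sizes of 4096x3112 ("4K by 3K") and 6144x4668 ("6K by 4K"), three color
# channels at 10 bits each, no compression. The exact dimensions are a guess.

def scan_stats(width, height, channels=3, bits_per_channel=10):
    pixels = width * height
    megabytes = pixels * channels * bits_per_channel / 8 / 1e6
    return pixels / 1e6, megabytes

for label, (w, h) in {"4K scan": (4096, 3112), "6K scan": (6144, 4668)}.items():
    megapixels, megabytes = scan_stats(w, h)
    print(f"{label}: ~{megapixels:.0f} megapixels, ~{megabytes:.0f} MB per frame")
```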

At any rate, what's on the film and what an audience in a cinema would have seen are two very different things. This has actually been studied.

Here's the gist of it:

In the study, MTF measurements were used to determine the typical resolution of theatrical release prints and answer prints in normal operation, utilizing existing state-of-the-art 35mm film, processing, printing, and projection.

The prints were projected in six movie theaters in various countries, and a panel of experts made the assessments of the projected images using a well-defined formula. The results are as follows:

35mm RESOLUTION

Measurement                      Lines
Answer Print MTF                 1400
Release Print MTF                1000
Theater Highest Assessment        875
Theater Average Assessment        750
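To connect those figures back to the "720p" estimates earlier in the thread, here's a small comparison sketch (assuming, and this is my reading rather than anything stated in the excerpt, that "lines" means lines per picture height):

```python
# Relate the study's line counts to familiar digital formats, on the
# assumption that "lines" means lines per picture height.

film_measurements = {
    "Answer print MTF": 1400,
    "Release print MTF": 1000,
    "Theater highest assessment": 875,
    "Theater average assessment": 750,
}
digital_heights = {"720p": 720, "1080p": 1080, "4K UHD": 2160}

for name, lines in film_measurements.items():
    nearest = min(digital_heights, key=lambda fmt: abs(digital_heights[fmt] - lines))
    print(f"{name}: {lines} lines -> roughly {nearest} territory")
```

By that reading, what actually reached theater screens landed in 720p territory, even though the print itself measured closer to 1080p.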

If you're interested in reading the entire study, a PDF of it is available for download here:

 
Last edited:
