
Sony Press Release: Sony Announces "Mastered in 4K" Blu-ray Titles


47 replies to this topic

#21 of 48 OFFLINE   Michel_Hafner


    Supporting Actor



  • 842 posts
  • Join Date: Feb 28 2002

Posted January 09 2013 - 08:06 PM

Higher resolution would not, in itself, have an effect on banding. A greater color space would, which could be done with existing technology. Doug
The problem is the 8-bit limit of Blu-ray, not the size of the colour space (unless that is what you mean by it). With roughly 220 values between black and white/red/green/blue, you are quite limited. Either noise the picture up (dither) and lose detail, or live with banding on critical material. Solving this requires 10 bit or more, not higher spatial resolution, as you said.
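To put a number on that, here is a minimal Python sketch (numpy assumed; the 10% gradient and the one-code-value dither amplitude are purely illustrative assumptions, not anyone's real pipeline):

    import numpy as np

    # A subtle, noise-free gradient as a high-precision "master".
    width = 1920
    ramp = np.linspace(0.2, 0.3, width)  # assumed 10% luminance sweep

    def mean_band_width(codes):
        # Average run of identical adjacent code values, in pixels.
        changes = np.count_nonzero(np.diff(codes.astype(int)))
        return len(codes) / (changes + 1)

    # Straight 8-bit head-range (16-235) quantisation: only ~23 of the
    # ~220 code values cover this gradient, so each value paints a
    # roughly 85-pixel-wide stripe - visible banding.
    plain = np.round(16 + ramp * 219).astype(np.uint8)
    print("band width without dither: %.0f px" % mean_band_width(plain))

    # "Noising the picture up": one code value of random dither before
    # rounding breaks the stripes into grain, at the cost of noise.
    noise = np.random.uniform(-0.5, 0.5, width) / 219
    dithered = np.round(16 + (ramp + noise) * 219).astype(np.uint8)
    print("band width with dither: %.1f px" % mean_band_width(dithered))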

#22 of 48 OFFLINE   SilverWook


    Screenwriter



  • 1,718 posts
  • Join Date: Oct 11 2006

Posted January 10 2013 - 06:57 AM

Superbit titles did have higher-bitrate audio tracks, not to mention DTS, so they were not total marketing B.S., at least in that regard.

#23 of 48 OFFLINE   Persianimmortal


    Screenwriter



  • 1,120 posts
  • Join Date: May 22 2012
  • Real Name:Koroush Ghazi
  • LocationCanberra, Australia

Posted January 10 2013 - 07:12 AM

Basically, this is just a token step to make 4K hardware appear more palatable, mainly for marketing purposes. In practical terms, I doubt these "Mastered in 4K" titles will look any better than any current high-quality 1080p Blu-ray. The real jump will come when streaming or disc media allows high-bitrate, full-4K resolution, and that won't be for a while yet. Let's keep in mind that many people think Blu-ray is overkill as it is, and aren't overly concerned about the highest-quality imagery when DVD will do just fine for them. Selling 4K is going to take a lot more than touting the benefits of higher resolution, especially as huge screen sizes are necessary to see those benefits even with native 4K, much less this "Mastered in 4K" watered-down nonsense.

#24 of 48 OFFLINE   Doctorossi


    Supporting Actor



  • 841 posts
  • Join Date: May 23 2012

Posted January 10 2013 - 07:13 AM

Superbit titles did have higher-bitrate audio tracks, not to mention DTS, so they were not total marketing B.S., at least in that regard.
Superbit discs weren't B.S., in the sense that they did actually deliver, but they were a kick in the teeth because they were charging you a premium for the maximized exploitation of the medium that should've been standard with all DVDs.

#25 of 48 OFFLINE   Colin Jacobson


    Producer



  • 5,391 posts
  • Join Date: Apr 19 2000

Posted January 10 2013 - 08:22 AM

Originally Posted by Ejanss  Uh, ahem: In other words... it's Superbits. Complete with releasing a Spiderman movie first - it wouldn't be Sony if they didn't. (Wait, "superbits" in quotes?... You don't even remember what they were, do you?) And while I've already gotten all the Superbits/Fifth Element jokes out of my system over on another board, I think our friend here accidentally hit on Sony's big mistake: most people don't know what 4K is, but they're still so traumatized from 3D, they think it's something they have to buy one MORE set just to watch. And they don't want to, so they won't buy the movies for it. Me, I've already got 4K-mastered disks of Baraka and Chitty Chitty Bang Bang, and have to admit they look better than most of my Blu's, but not so much that I'd rush out and buy a $10,000 set for it, with 3D or no. (Well, it would have to be "with".) Like the above-named examples, if a movie I was going to buy anyway was in 4K, that's nice, but if I was given the same Superbit choice between extras OR good-looking on store shelves, the same thing would happen that happened to Superbits. Sony's so determined to promote and associate the 4K label with the new sets, they want to show us they've got the cart to go with the horse, and they're not worrying about what happens if the horse throws a shoe.
Minor note: "Spider-Man" wasn't one of the first Superbit titles.  IIRC, the line launched in late 2001, and the "Spidey" SB wasn't out until almost three years later...
Colin Jacobson
http://www.dvdmg.com

#26 of 48 OFFLINE   Douglas Monce


    Producer



  • 5,514 posts
  • Join Date: Nov 16 2006

Posted January 10 2013 - 08:23 AM

The problem is the 8-bit limit of Blu-ray, not the size of the colour space (unless that is what you mean by it). With roughly 220 values between black and white/red/green/blue, you are quite limited. Either noise the picture up (dither) and lose detail, or live with banding on critical material. Solving this requires 10 bit or more, not higher spatial resolution, as you said.
That is exactly what I'm talking about. Even going to 4:2:2 chroma sampling would all but eliminate color banding. The current chroma sampling for Blu-ray's AVC playback is 4:2:0. AVC itself can carry a 4:4:4 (12-bit, so-called "deep color") signal, though the Blu-ray spec doesn't allow it, and I'm unaware of any software mastered to use it. Doug
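To make the chroma-sampling point concrete, a rough numpy sketch of what 4:2:0 does to a hard colour edge (toy 8x8 plane and nearest-neighbour upsampling assumed; real encoders filter more carefully):

    import numpy as np

    # One chroma (Cb) plane of a 4:4:4 source, with a hard colour edge
    # that does not line up with the 2x2 sampling grid.
    h, w = 8, 8
    cb = np.zeros((h, w))
    cb[:, 3:] = 1.0

    # 4:2:0 keeps a single chroma sample per 2x2 block of pixels...
    cb420 = cb.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

    # ...which playback upsamples back to full resolution.
    restored = cb420.repeat(2, axis=0).repeat(2, axis=1)

    # Luma resolution is untouched, but the colour edge is smeared
    # across two pixels: worst-case chroma error is half the range.
    print("max chroma error:", np.abs(cb - restored).max())  # -> 0.5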
"I'm in great shape, for the shape I'm in."
Bob Hope in The Ghost Breakers

#27 of 48 OFFLINE   Torsten Kaiser


    Film Restoration & Preservation



  • 96 posts
  • Join Date: Nov 30 2002

Posted January 10 2013 - 09:11 AM

There seem to be a lot of misconceptions here about 4K, its actual use, and the technical necessities and environment around it, such as color space and bit depth.

To begin with, "4K MASTERING" as advertised by SONY merely implies that the master used for the Blu-ray encode was crafted in a 4K environment and workflow THE WHOLE TIME, with the resulting encode being a downscale from 4K to HD1080 first and then encoded (for instance, via AVC). Most masters related to 4K so far ARE NOT crafted in a 4K workflow the entire time down the line to the end result, but rather only very partially - meaning that, while the film elements would be scanned at 4K, the actual workflow that follows would be in HD1080, or, for new productions with SFX, mostly in 2K; although this is changing.

Also, 4K DOES HAVE a very relevant and noticeable effect, DUE TO THE USE OF OVERSAMPLING, on the quality of any (correctly) downscaled HD1080 master compared to a native HD1080 transfer. It exhibits and allows for much better color rendition, detail, more accurate sharpness and finer delineation in textures, and has the advantage of dramatically reducing - on a couple of scanners even eliminating - aliasing. This is why we also use 4K in certain projects, whenever we can.

LOA was mentioned, and it was claimed that the 4K workflow (the scan at Fotokem was 8K) had no effect on the resulting master, but rather the well-treated HD1080 environment. Let me make this very clear: a native HD1080 scan of LOA would never have resulted in that kind of detail and rendition, due to many factors, some of which I already mentioned.

Note that BARAKA and CHITTY CHITTY BANG BANG were NOT MASTERED ENTIRELY in the 4K environment. BARAKA was scanned at 8K resolution at Fotokem; the scan files were then downscaled to HD1080 for further work. CCBB was, as far as I know, scanned at 4K and worked down the line in HD1080 as well. However, the claim that a "4K MASTERING" workflow and downscale would result in something significantly different from 4K scanning and downscaling to HD1080 for mastering is, indeed, misleading - and if you look very closely, the suggestion being made in the public relations campaign specifically targets native HD1080 and 2K masters, yet makes no mention of existing methods that are pretty similar, such as 4K scanning with HD1080 or 2K workflows. There is, of course, a reason for this.

As for the "Superbit" comparison: it does not apply. The SUPERBIT DVDs were, indeed, a bit (pardon the pun) of a hustle. The quality was actually not tied to the bitrate alone, and could easily have been achieved by other means (2-disc sets with the extras on disc 2) on "normal" DVDs. So yes, it was a marketing "gag". Also note that there were some "Superbit Editions" which utterly failed the program's own premise: PANIC ROOM was released with some 30%+ of empty space - resulting in visible artifacts in the film's MPEG-2 encode. Apparently, some extras were planned but stricken at the last moment, and someone came up with the (less than bright) idea to issue the DVD under the SUPERBIT program, forgetting or ignoring that the already finished glass master contained a significantly reduced encoded file.

Anyway, with regard to banding, color space and codecs in that environment: Blu-ray has a working specification of 4:2:0 color at 8 bit per channel, YUV. Does this necessarily result in banding? No.

For the record: most masters are 4:2:2 YUV or 4:4:4 RGB HDCAM SR or DPX or, if pre-formatted for encoding, already prepped (with codec) ProRes 4:2:2 YUV .mov files. All of the mentioned are delivered in 10 bit, mostly head range, a few in full range that need to be converted. 12 bit is only available in post and/or scanning, and is also used for digital projection in the X-Y-Z color space. It is not used in broadcast and/or digital media such as Blu-ray.

Banding actually comes into play when the color range (8 bit at full range 0-255, at head range 16-235) is manipulated in such a way that the detail between the tones of each respective color is stretched (mostly with the aim of accomplishing a more vibrant picture). This is similar to stretching the gradation and/or gamma curve. When one looks at the histogram after performing this action, one can see that entire parts of the spectrum have been eliminated. This results in "effects" such as solarization, banding and posterization. However, this can happen just as easily (and often does) already at the 10-bit stage during mastering - and the encode, depending on its quality and the way it was handled, puts - sometimes significant - artifacts "on top". Here, the 8-bit realm and the 4:2:0 color space do not "help" but can "add" to the problem. But they ARE NOT THE CAUSE.

Does a 10-bit realm (0-1023 in full range, 64-940 in head range) help? Yes, of course. Does 4:4:4 RGB color space help? Yes, of course. But if the color timing and mastering is not done correctly, both are just as "helpless". The main objective is to keep the color spectrum intact.
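A small Python sketch of the stretching mechanism just described (numpy; the 2x contrast expansion is an arbitrary, illustrative grade, not a claim about any real master):

    import numpy as np

    # 8-bit head-range (16-235) picture data from a dark scene.
    img = np.random.randint(60, 120, size=100_000)

    # A contrast "stretch" applied directly to the 8-bit values,
    # chasing a more vibrant picture.
    stretched = np.clip((img - 90) * 2 + 90, 16, 235)

    # The histogram shows the damage: every other code value is now
    # unused, which reads on screen as banding/posterization.
    hist = np.bincount(stretched, minlength=236)
    used = np.flatnonzero(hist[16:236])
    print("code values used after the stretch:", used.size, "of 220")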
Torsten Kaiser
-----------------------
TLEFilms Film Restoration & Preservation Services
www.TLEFilms.com

#28 of 48 OFFLINE   Robert Harris


    Archivist



  • 7,907 posts
  • Join Date: Feb 08 1999
  • Real Name:Robert Harris

Posted January 10 2013 - 11:46 AM

Originally Posted by Torsten Kaiser  There seem to be a lot of misconceptions here about 4K, its actual use, and the technical necessities and environment around it, such as color space and bit depth. [...] The main objective is to keep the color spectrum intact. (quoted in full above)
Torsten, Always a pleasure reading your thoughts. One point should be made. Generally, when images are scanned at 4k, and the resultant files massaged (that would be a technical term) to fruition at 4k, those projects are destined for preservation.  Make no mistake.  It would be a waste to scan and work files at 4k, if the final result is for any other rationale. RAH

"All men dream: but not equally. Those who dream by night in the dusty recesses of their minds wake in the day to find that it was vanity: but the dreamers of the day are dangerous men, for they may act their dreams with open eyes, to make it possible. This I did." T.E. Lawrence


#29 of 48 OFFLINE   Torsten Kaiser


    Film Restoration & Preservation



  • 96 posts
  • Join Date: Nov 30 2002

Posted January 10 2013 - 12:14 PM

Torsten, Always a pleasure reading your thoughts. One point should be made. Generally, when images are scanned at 4k, and the resultant files massaged (that would be a technical term) to fruition at 4k, those projects are destined for preservation.  Make no mistake.  It would be a waste to scan and work files at 4k, if the final result is for any other rationale. RAH
Absolutely correct, provided the respective project stays in 4K. That workflow is, to be sure, the best if not the ideal approach, and no argument should be made against it (well, maybe there is one hurdle called budgeting that "occasionally" does apply) :D

However, the "4K MASTERING" titles are marketed and available only as HD1080 Blu-ray editions, of course, making them "mere" downscaled derivatives or byproducts of the 4K projects. Only a true 4K playable medium would change that - and that is quite some time off yet.

The problem with the "4K MASTERING" marketing is that it suggests that, on 4K displays, the resulting HD1080 Blu-ray would render close to or the same quality as native 4K - or at least come significantly closer than any other method, including those not mentioned by name, such as 4K scanning with a lower-resolution workflow. It is here that things get substantially more hazy and become more advertising claim than fact that can be taken at face value.
Torsten Kaiser
-----------------------
TLEFilms Film Restoration & Preservation Services
www.TLEFilms.com

#30 of 48 OFFLINE   Michel_Hafner


    Supporting Actor



  • 842 posts
  • Join Date: Feb 28 2002

Posted January 11 2013 - 08:01 PM

However, this can happen just as easily (and often does) already at the 10-bit stage during mastering - and the encode, depending on its quality and the way it was handled, puts - sometimes significant - artifacts "on top". Here, the 8-bit realm and the 4:2:0 color space do not "help" but can "add" to the problem. But they ARE NOT THE CAUSE.
But is it not so that YUV values from 16-235 are simply not enough to present all possible subtle shadings without banding, even when no mastering mistakes are made, unless one adds some noise to mask the banding?
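For what it's worth, a back-of-the-envelope Python sketch of why I suspect they are not enough (assuming an idealised gamma-2.4 display and the oft-quoted ~1% Weber threshold for a just-visible brightness step; both are simplifications):

    # Relative luminance of each 8-bit head-range code on a gamma-2.4
    # display, and the Weber contrast of every adjacent-code step.
    lum = [((c - 16) / 219.0) ** 2.4 for c in range(16, 236)]
    steps = [(b - a) / b for a, b in zip(lum, lum[1:])]

    # Steps above ~1% are potentially visible on a smooth, noise-free
    # gradient - and every one of the 219 steps here qualifies.
    print(sum(s > 0.01 for s in steps), "of", len(steps),
          "steps exceed a 1% just-noticeable difference")

Grain or dither masks exactly these steps, which is presumably why film-sourced material fares better than clean gradients.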
Generally, when images are scanned at 4k, and the resultant files massaged (that would be a technical term) to fruition at 4k, those projects are destined for preservation. Make no mistake. It would be a waste to scan and work files at 4k, if the final result is for any other rationale.
I would agree when the rationale is 1080p instead of preservation. But what does that mean? Is it a waste of time to go from 2K DIs to 4K DIs for 35mm and for the initial release of films? These films need to be preserved eventually as well. So do we rescan in 4K later but release in 2K? Or do we preserve these new films in 2K and reserve 4K for the big classics when we preserve/restore them (although the originals of these new films have more resolution than all 35mm classics)? That is de facto what happens these days. I think all 35mm, especially modern 35mm, needs 4K not just for preservation but for original post-production, so the release presentation has full quality - comparable to what the best prints could achieve in the pre-digital age: a detailed, sharp, analogue-looking image, free of digital artifacts, with proper grain texture.

#31 of 48 OFFLINE   Robert Harris


    Archivist



  • 7,907 posts
  • Join Date: Feb 08 1999
  • Real Name:Robert Harris

Posted January 12 2013 - 12:19 AM

But is it not so that YUV values from 16-235 are simply not enough to present all possible subtle shadings without banding, even when no mastering mistakes are made, unless one adds some noise to mask the banding? [...] I think all 35mm, especially modern 35mm, needs 4K not just for preservation but for original post-production. (quoted in full above)
A nice thought, but the reality is that most new productions, be they sourced on film or data, do not have the luxury / problems / added expense of 4k. The majority are all 2k DIs, and look just fine on both film and data. When we restore or preserve film-based productions, the point is to preserve everything on the original, as viewed in release. And 4k does the job nicely. But take a very high quality recent film, for example Flags of Our Fathers, and you're viewing 2k. A newer film, Silver Linings Playbook - 2k. Les Miz - 2k. In most situations, 4k is an unnecessary luxury. RAH

"All men dream: but not equally. Those who dream by night in the dusty recesses of their minds wake in the day to find that it was vanity: but the dreamers of the day are dangerous men, for they may act their dreams with open eyes, to make it possible. This I did." T.E. Lawrence


#32 of 48 OFFLINE   OliverK


    Screenwriter



  • 1,866 posts
  • Join Date: Feb 01 2000

Posted January 12 2013 - 01:11 AM

I would agree when the rationale is 1080p instead of preservation. But what does that mean? Is it a waste of time to go from 2K DIs to 4K DIs for 35mm and for the initial release of films? [...] (quoted in full above)
In my experience, below a certain viewing distance there is a readily visible difference between 2k and 4k. So for everything new that comes into cinemas, it would be a benefit to match the actual resolution of the new 4k projectors - it really makes sense to have them in movie theaters, and also in some home theaters. Just a year ago I was pretty certain that the majority of projects would very soon be handled in 4k and higher, but since then I have heard that really big movies (Prometheus, The Hobbit) were handled at much lower resolutions, even though they were already in the digital realm and shot with higher-rez cameras. It still seems to be too expensive to do even these very large movies completely in 4k, and the special effects might still be 2k and stay that way, as I doubt they will be redone. As RAH pointed out, 2k is considered to look just fine. One could maybe say that 2k is the look of "good enough", and that is why we still have it, as long as costs for 4k are significantly higher than for 2k.

#33 of 48 OFFLINE   Robert Harris


    Archivist



  • 7,907 posts
  • Join Date: Feb 08 1999
  • Real Name:Robert Harris

Posted January 12 2013 - 01:54 AM

In my experience, below a certain viewing distance there is a readily visible difference between 2k and 4k. [...] One could maybe say that 2k is the look of "good enough", and that is why we still have it, as long as costs for 4k are significantly higher than for 2k. (quoted in full above)
And, especially for the average viewer, the difference is difficult to detect, even in theatrical situations, unless one is in a testing situation, with images side by side. Even more so when one records out to film. RAH

"All men dream: but not equally. Those who dream by night in the dusty recesses of their minds wake in the day to find that it was vanity: but the dreamers of the day are dangerous men, for they may act their dreams with open eyes, to make it possible. This I did." T.E. Lawrence


#34 of 48 OFFLINE   OliverK


    Screenwriter



  • 1,866 posts
  • Join Date: Feb 01 2000

Posted January 12 2013 - 04:49 AM

And, especially for the average viewer, the difference is difficult to detect, even in theatrical situations, unless one is in a testing situation, with images side by side. Even more so when one records out to film. RAH
Certainly true, hopefully with time prices for full 4k pipelines will come down so that it is no longer deemed a luxury.

#35 of 48 OFFLINE   Torsten Kaiser


    Film Restoration & Preservation



  • 96 posts
  • Join Date: Nov 30 2002

Posted January 12 2013 - 07:45 AM

Originally Posted by Michel_Hafner  But is it not so that YUV values from 16-235 are simply not enough to present all possible subtle shadings without banding, even when no mastering mistakes are made, unless one adds some noise to mask the banding? (...)
That would be news to me. Again, 10bit or more is, of course, preferable, but 8bit depth in head range (16-235) is not the root cause of banding per se. Tests with HDCAM (8bit, with 3:1:1 color sampling!) on a 5m screen have shown that even on this medium, a test pattern with very complex shadow detail and color tones exhibited FIRST SIGNS OF BANDING only after copies were made (2nd generation - just for comparison: at 10bit, the same happened in 4:2:2 on HDCAM SR after the 8th copied generation). With BD encodes, the sources are 10bit, the encode being - if everything was done right - the FIRST generation. Ergo: banding is - provided everything was done right - very unlikely to the extent that you refer to. Otherwise all masters for DCT, D-5 or Digital Betacam would have had this problem in general - and, again, this was not an issue simply because of color space, (8)bit depth or head-range limitations. However, the moment the gradation and gamma curves are twisted/stretched in such a way that the signal loses detail as described, banding becomes increasingly likely.

As for "adding noise": not necessary, as film has its native "noise" that makes banding even less likely - even at 8bit in 16-235 HEAD RANGE. You know it: it's emulsion grain. However: remove the grain digitally - especially by poor means - and you will encounter banding as a result as well.
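A toy Python model of that generation test (the mild 0.95 gamma tweak per pass is an arbitrary stand-in for whatever processing each tape generation applies; purely illustrative, not the actual test setup):

    import numpy as np

    def surviving_levels(bits, generations, gamma=0.95):
        # Each "generation" applies a mild grade and re-quantizes to
        # the tape's bit depth; merged code values never come back.
        levels = 2 ** bits - 1
        x = np.linspace(0.0, 1.0, 4096)
        for _ in range(generations):
            x = np.round((x ** gamma) * levels) / levels
        return np.unique(x).size

    # 8 bit loses usable codes out of a budget roughly four times
    # smaller than 10 bit - one way banding shows up generations
    # earlier on the lesser format.
    for gen in (1, 2, 8):
        print("gen %d: 8-bit %4d levels, 10-bit %5d levels"
              % (gen, surviving_levels(8, gen), surviving_levels(10, gen)))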
Torsten Kaiser
-----------------------
TLEFilms Film Restoration & Preservation Services
www.TLEFilms.com

#36 of 48 OFFLINE   Dick


    Producer



  • 4,482 posts
  • Join Date: May 22 1999
  • Real Name:Rick

Posted January 13 2013 - 08:21 AM

If the 2k images on my 46" (soon to be 60") screen were any sharper, they'd be dangerous. I can't imagine needing 4k unless I was among the 1% and could afford an awesome, dedicated home theater room with a near-theatrical-size screen. Certainly that is the demographic they intend it for.

#37 of 48 OFFLINE   Persianimmortal


    Screenwriter



  • 1,120 posts
  • Join Date: May 22 2012
  • Real Name:Koroush Ghazi
  • LocationCanberra, Australia

Posted January 13 2013 - 04:26 PM

According to this chart, for a 60" screen for example, you would need to be sitting around 3 feet away to see the full benefit of 4K over 2K, and up to 6-7 feet away to see a partial benefit. So yes, 4K is really designed for large screens, because at a normal viewing distance of say 8-10 feet, we're talking an 80" screen required to see even the slightest benefit of the extra resolution, and a 140"+ screen to see the full benefit of 4K. Cost aside, consumer acceptance is the key here. The simple fact is that most people are just not capable of fitting 80"+ screens into their typical viewing areas. Such a TV would absolutely dominate the average living room. So unless and until we get cheap, paper-thin OLED screens, which perhaps can be wall mounted flush and even roll up when not in use, 4K or 8K is going to be super-niche.
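Those chart figures are easy to sanity-check in Python with the standard 1-arcminute (20/20) acuity assumption; it is a simplification, but it lands in the same ballpark:

    import math

    ARCMIN = math.radians(1 / 60)  # ~0.00029 rad, nominal 20/20 acuity

    def full_benefit_distance_ft(diagonal_in, horiz_pixels, aspect=16 / 9):
        # Distance at which one pixel subtends one arcminute; sit any
        # farther back and finer pixels stop being resolvable.
        width_in = diagonal_in * aspect / math.sqrt(aspect ** 2 + 1)
        pixel_pitch_in = width_in / horiz_pixels
        return pixel_pitch_in / ARCMIN / 12.0

    for pixels, label in ((1920, "1080p"), (3840, "4K")):
        print('60" %s: full benefit inside ~%.1f ft'
              % (label, full_benefit_distance_ft(60, pixels)))
    # -> roughly 7.8 ft for 1080p and 3.9 ft for 4K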

#38 of 48 OFFLINE   Kevin EK


    Screenwriter



  • 2,857 posts
  • Join Date: May 09 2003

Posted January 13 2013 - 05:28 PM

I see nothing here but the usual marketing. If anything, it's nice that they say they'll work from 4K, but the reality, as pointed out numerous times here, is that most consumers would not be able to reasonably detect the difference.

I have a 65" HDTV with 3D capability. A year ago, it was top of the line. Right now, I'd say it's still pretty darn close to top of the line. I have no desire to spend tens of thousands of dollars investing in a new 4K set, particularly at a beta-test level. I think what I have will do very well for years to come.

Now, I can see a difference coming when the companies work the bugs out of glasses-free 3D. At this time, you can have a set with a few sweet spots in the room where you can actually perceive the 3D without wearing any glasses. But that's not good enough to justify anyone spending thousands to buy that set. I'd say it will be a few more years before you can buy a set where you can walk around the room and perceive the 3D from most places and at most angles, and then it will be a few more years after that before the prices drop down into our stratosphere. (And by that time, the 3D should be worked out enough that nearly the whole room would be a "sweet spot".)

But buying a 4K HDTV today? No way. And advertising that new Blu-rays will be special because of 4K mastering just assumes that the consumer will think the label makes the product special. For many customers, that may well be the case. Judging from the comments, I'd say it won't be with the readers of this forum.

#39 of 48 OFFLINE   Michel_Hafner


    Supporting Actor



  • 842 posts
  • Join Date: Feb 28 2002

Posted January 13 2013 - 09:22 PM

That would be news to me. Again, 10bit or more is, of course, preferable, but 8bit depth in head range (16-235) is not the root cause of banding per se. [...] remove the grain digitally - especially by poor means - and you will encounter banding as a result as well. (quoted in full above)
Yes, fortunately film has built-in dither that helps most of the time. The 8-bit limitation is obvious on computer-generated, noise-free, subtle colour gradients (no complexity needed or wanted). These do not exist like that on film, but CGI productions/cartoons can easily have them if one does not avoid them. I also see banding regularly on fade-ins and fade-outs of films on Blu-ray. How much of this is in the master and how much is due to compression would be interesting to know.

#40 of 48 OFFLINE   Michel_Hafner


    Supporting Actor



  • 842 posts
  • Join Date: Feb 28 2002

Posted January 13 2013 - 10:08 PM

A nice thought, but the reality is that most new productions, be they sourced on film or data, do not have the luxury / problems / added expense of 4k. The majority are all 2k DIs, and look just fine on both film and data.
Yes, they do. But they also do not have the resolution of the best prints of the past. They are an economic compromise.
When we restore or preserve film-based productions, the point is to preserve everything on the original, as viewed in release. And 4k does the job nicely.
By that you basically admit that nowadays, with 2K DIs, we see less of the new originals than we used to see of the old originals. :) The inherent contradiction of 4K for preservation and 2K for new films is that it makes no sense to preserve something at higher quality than the audience is ever able to see. Films are made for people to watch. We preserve so we can see them now and in the future. So why should we preserve at 4K if we are not willing to produce new material at 4K from 4K originals? Do we simply want to preserve these films later at 4K? (Unrealistic for any sfx-heavy films.) Or do we want a two-class society - 2K for new productions and 4K for classic films worth the extra expense? I think it's extremely short-sighted these days not to do a 4K DI for any new 35mm film that is not very low budget. These 2K films have cornered themselves into looking unnecessarily limited compared to all the new 4K material that will be available within a couple of years.
But take a very high quality recent film, for example Flags of Our Fathers, and you're viewing 2k. A newer film, Silver Linings Playbook - 2k. Les Miz - 2k. In most situations, 4k is an unnecessary luxury. RAH
That depends on the goal. For putting butts in seats, I guess it is for now. The average audience is not discriminating enough to make this a factor. For showing us what was shot, down to fine detail and without pixel artifacts, 2K is insufficient, though - a fact I'm reminded of every time I go see a 2K presentation in a modern wide-screen cinema.



