
Progressive image better than interlaced image?


32 replies to this topic

#1 of 33 OFFLINE   DaViD Boulet

DaViD Boulet

    Lead Actor

  • 8,805 posts
  • Join Date: Feb 24 1999

Posted July 10 2006 - 09:08 AM

Quote:
just as the perceptible dif to the human between 1080i and 1080p is apparently impossible to see

Hey Peter,

where did you hear this? That may be true from 3 screen widths away... but from any decent viewing angle the eye can most certainly see the difference between 1080i and 1080p (and properly versus incorrectly deinterlaced 1080i->p). Your eye can see the difference between 300 and 600 dpi laser-printing. That's much finer resolution than the 1920 x 1080 pixel count of an HD movie screen.

Quote:
wonder if the audio codecs afforded by HDMI 1.3 will be perceptible to the human ear in terms of comparison to standard DTS and DD, be it EX or ES 6.1

Sure they will. Even the lossless 16/44.1 on laserdisc sounds better than compressed DD on DVD and you don't need a fancy system to hear it (the same way that CDs sound better than MP3s even on regular headphones and speakers). Higher-fidelity DD+ and lossless Dolby True-HD and DTS-HD would sound better than conventional lossy DTS/DD on most decent sound systems.
Be an Original Aspect Ratio Advocate

Supporter of 1080p24 video and lossless 24 bit audio.

#2 of 33 OFFLINE   Cees Alons

Cees Alons

    Executive Producer

  • 18,640 posts
  • Join Date: Jul 31 1997
  • Real Name:Cees Alons

Posted July 10 2006 - 10:17 AM

There IS no visible difference between 1080i and 1080p - if both done right. Period.


Cees

#3 of 33 OFFLINE   ChristopherDAC

ChristopherDAC

    Producer

  • 3,729 posts
  • Join Date: Feb 18 2004

Posted July 10 2006 - 10:23 AM

Now, that's simply not correct. The eye can distinguish between displayed interlaced and progressive video at the same resolution. There are a variety of phenomena which lead to a reduction in perceived picture quality with interlaced scanning.

As far as 1080i vs. 1080p input to progressively or simultaneously-scanned displays, displayed as progressive video, you have quite another story depending on what "done right" means. If you're talking about film formatted as interlaced video, 24pSf-60, and then properly deinterlaced, sure. 1080i30 video, however, is much superior to 1080p30 in terms of motion rendition, but converting it to 1080p60 for non-interlaced display is [begins with P which rhymes with T which stands for] trouble.

#4 of 33 OFFLINE   Paul Hillenbrand

Paul Hillenbrand

    Screenwriter

  • 1,263 posts
  • Join Date: Aug 16 1998
  • Real Name:Paul Hillenbrand

Posted July 10 2006 - 10:31 AM

Quote:
Originally Posted by Cees Alons
There IS no visible difference between 1080i and 1080p - if both done right. Period.


Cees
I'll agree to this only when the 1080i is deinterlaced to the completely native and "true" 1080p format and viewed on a display that supports the same.

Paul
Avatar: "The Annunciation to the Shepherds" Painting by Nicolas Berchem (1680-1683)
BD 3D, BD, HD DVD, DVD collection


#5 of 33 OFFLINE   Cees Alons

Cees Alons

    Executive Producer

  • 18,640 posts
  • Join Date: Jul 31 1997
  • Real Name:Cees Alons

Posted July 10 2006 - 12:25 PM

Quote:
Originally Posted by ChristopherDAC
1080i30 video, however, is much superior to 1080p30 in terms of motion rendition
No, unless you're talking about TV now. And even then, discussing the veracity of the resulting display is complicated (each pixel is "exactly" where it should be - in time).

But when the image is scanned from a film frame, it doesn't matter if the lines are first transferred this way or that way: once each line is back at its own (proper!) place in the displayed frame, there is no longer any difference and the image is "exactly" the same in both cases. When I say "done properly", I mean that the image is reconstructed correctly (not influenced by faulty conversions).

Motion rendition is as good as the motion rendition was on the original film.

When the eye "can see" the difference, progressive is worse: "the eye" notices flicker! Most modern monitors simply present the whole image to the eye, however.


Quote:
Originally Posted by Paul Hillenbrand
I'll agree to this only when the 1080i is deinterlaced to the completely native and "true" 1080p format and viewed on a display that supports the same.
Of course, Paul!


Cees

#6 of 33 OFFLINE   ChristopherDAC

ChristopherDAC

    Producer

  • 3,729 posts
  • Join Date: Feb 18 2004

Posted July 10 2006 - 01:35 PM

That's what I'm saying, Cees. You have to distinguish between "video" and "film transported via video". If you have actual video, made using a video camera, 1080i30 is a completely different animal than "1080p" at whatever framerate. In resolution, it's the same. In motion rendition, it's about 2/3 of the way toward 60p from 30p [according to experimental visual-perception results, not just theory]. If displayed in its native format, on a scanning display such as a CRT, it has a distinctive visual appearance, different from a progressive format. 30fps progressive has an intolerable flicker ; again, 30i is most of the way toward 60p in this respect, but not all there.

If you're only using it to transport film material, as I said before, there's no inherent difference as long as the processing at both ends is complementary. But you can't make blanket statements about "1080i" and "1080p" without specifying application.

#7 of 33 OFFLINE   Lew Crippen

Lew Crippen

    Executive Producer

  • 12,060 posts
  • Join Date: May 19 2002

Posted July 10 2006 - 01:57 PM

Well, I did think that we were talking about movies Chris--all in good old 24 fps. And many of the hot new HD video cameras have options to simulate 24 fps, something that is of interest to filmmakers.
¡Time is not my master!

#8 of 33 OFFLINE   ChristopherDAC

ChristopherDAC

    Producer

  • 3,729 posts
  • Join Date: Feb 18 2004

Posted July 10 2006 - 02:12 PM

The original Todd-AO, 30fps. The added motion rendition, by the way, makes quite a difference to the sensation of realism.

ShowScan, 60fps.

Frame-sequential 3D, 48fps.

Silent films, often 18-22 fps, sometimes variable.

And in any case, "movies in good old 24fps" aren't the only thing going. What about High Definition TV shows? What about computer animation at idiosyncratic framerates? Videodiscs aren't just for major studio motion pictures, at least I certainly hope they aren't.
I'm talking about video technology in general, because the statement referred to was a blanket assertion about 1080i versus 1080p. Depending on the display used, and depending on the video source, very different things happen, and one simple statement just isn't applicable enough to be meaningful.

#9 of 33 OFFLINE   Cees Alons

Cees Alons

    Executive Producer

  • 18,640 posts
  • Join Date: Jul 31 1997
  • Real Name:Cees Alons

Posted July 10 2006 - 06:50 PM

So, when talking about film on DVD and the perceived difference between 1080i and 1080p (at any viewing distance or angle), we can agree that
Quote:
There IS no visible difference between 1080i and 1080p - if both done right. Period.


Cees

#10 of 33 OFFLINE   Juan C

Juan C

    Second Unit

  • 450 posts
  • Join Date: Jan 22 2003

Posted July 10 2006 - 09:20 PM

...with a corollary:

There CAN be a visible difference between 1080p and 1080i - if either of them is done wrong.


#11 of 33 OFFLINE   Cees Alons

Cees Alons

    Executive Producer

  • 18,640 posts
  • Join Date: Jul 31 1997
  • Real Name:Cees Alons

Posted July 10 2006 - 09:27 PM



Cees

#12 of 33 OFFLINE   Peter Overduin

Peter Overduin

    Supporting Actor

  • 781 posts
  • Join Date: Dec 31 1969

Posted July 10 2006 - 10:30 PM

Quote:
Originally Posted by DaViD Boulet
Hey Peter,

where did you hear this? That may be true from 3 screen widths away... but from any decent viewing angle the eye can most certainly see the difference between 1080i and 1080p (and properly versus incorrectly deinterlaced 1080i->p). Your eye can see the difference between 300 and 600 dpi laser-printing. That's much finer resolution than the 1920 x 1080 pixel count of an HD movie screen.

Sure they will. Even the lossless 16/44.1 on laserdisc sounds better than compressed DD on DVD and you don't need a fancy system to hear it (the same way that CDs sound better than MP3s even on regular headphones and speakers). Higher-fidelity DD+ and lossless Dolby True-HD and DTS-HD would sound better than conventional lossy DTS/DD on most decent sound systems.

...what Dave said...think I will retire to my HT room and watch something in analog....my head hurts (I know I am quoting something a French soccer player is saying today)...and wait for the next onslaught of announcements. I suppose if I can spend 800 on my first DVD player, the Pioneer DV500, I should be able to drop 700 on HD...sigh
Peter

My Collection

#13 of 33 OFFLINE   DaViD Boulet

DaViD Boulet

    Lead Actor

  • 8,805 posts
  • Join Date: Feb 24 1999

Posted July 11 2006 - 01:50 AM

Quote:
So, when talking about film on DVD and the perceived difference between 1080i and 1080p (at any viewing distance or angle), we can agree that
Quote:
There IS no visible difference between 1080i and 1080p - if both done right. Period.




Cees

Cees,

The eye can see "jaggies" with 1080i even mastered from film because the alternating scan-lines cause twitter with fine detail like plaid shirts etc... just like with 480i. Since film has more detail than 1080 lines can capture, you can get moiré and aliasing with 1080i just like you can with 480i. Proper deinterlacing to 1080p removes these artifacts entirely, hence a 1080p display looks better than the native interlaced signal.

If you mean they should look no different after properly deinterlacing the 1080i to 1080p... then you're now talking about progressive, not interlaced. Naturally 1080p looks like 1080p (when properly deinterlaced film material is compared to native 1080p of the same).

1080i does not look as good as 1080p even when mastered from film when displayed in native form (like on a CRT). The eye can see interlacing artifacts and gaps between the lines. If you can't, it's because your CRT display isn't full 1080 resolution and is blurring the gaps between the lines (like most 1080i CRT displays do). They aren't full resolution and can't reveal full 1080p clarity, but the softness they add helps to mitigate some of the interlacing visibility, so it can work in their favor with 1080i content (but would noticeably soften real 1080p if the same display could lock onto and show it as well).

1080p looks better than 1080i. Just like the eye can see the difference between a 480i and a 480p display of film-source material. Would you tell me that 480i and 480p look the same when mastered from film-source material? It's no different with HD. Just more detail with 1080 lines.
Be an Original Aspect Ratio Advocate

Supporter of 1080p24 video and lossless 24 bit audio.

#14 of 33 OFFLINE   Cees Alons

Cees Alons

    Executive Producer

  • 18,640 posts
  • Join Date: Jul 31 1997
  • Real Name:Cees Alons

Posted July 11 2006 - 03:40 AM

David,


I'll give it one more try.

Think of an individual film frame. It's like a photograph, nothing is moving in it.

Now a scanner is scanning that image. The machine produces a number of horizontal lines (e.g. 480, 576, or 1080). We'll give those lines a number, starting with 1 or 0; it's not really important. We call the horizontal lines having received an odd number the odd lines, the other ones the even lines.

It doesn't really matter in which order the horizontal lines are produced, because the image isn't moving. When the scanner spot reaches a certain place, it will scan the same "pixel" independent of the sequence of scanning.
The result is a number of horizontal lines, and all we have to do is: make sure we know where each line belongs.

Next step: the horizontal lines are transmitted to another place (through all sorts of in-betweens). It could be done in various ways, but two specific orders of transmission are really standardized: interleaved (all odd lines first, then all even lines), or progressive (all lines directly after each other).
Top to bottom.

Now they are sent to a display, to be shown to the human eye. All that matters is: they should get at the proper place.

This can be done in various ways too: older TV sets, for instance, presented each "odd half-frame" after each "even half-frame" (each horizontal line exactly at the right place!) and let the human eye do the cumulating.
Why? Because the frame frequency was too low, and progressive images would result in visible and distracting flicker (the top of the image was fading out already before the bottom was painted on the TV screen).

More modern monitors present the image at a much higher frame-rate (~ 100Hz). CRT-type displays still have to "paint" their image (top-down), but LCD- and Plasma-screens don't have to do that. Those images are read from a cache memory.

Now all that matters is: the various horizontal lines have to get at their right place inside the image. If they do, the image is "perfect" (for the given resolution). Jaggies and moiré may be there, but will not be caused by the way the horizontal lines were transmitted.

If you want to call the resulting image of an LCD-screen "deinterlaced", you're free to do so, but then the term has become worthless, since the real difference is the way the lines were transported.

Once the image is reconstructed, it's no longer possible to tell how (in what order) it was transmitted.
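
If it helps, here's a tiny Python/NumPy sketch of exactly that point (the array is just an arbitrary stand-in for a scanned frame, nothing more): split a still frame into odd and even lines, put every line back at its own place, and what comes out is bit-for-bit the original. The order of transport leaves no trace.

Code:
import numpy as np

# A still "film frame" - nothing moves in it, so scan order cannot matter.
frame = np.random.randint(0, 256, size=(1080, 1920), dtype=np.uint8)

# "Interleaved" transport: all odd lines first, then all even lines.
odd_field = frame[1::2]    # lines 1, 3, 5, ...
even_field = frame[0::2]   # lines 0, 2, 4, ...

# Reconstruction: every horizontal line back at its proper place.
rebuilt = np.empty_like(frame)
rebuilt[1::2] = odd_field
rebuilt[0::2] = even_field

# The rebuilt frame is identical to the original - there is no way to tell
# in what order the lines were transmitted.
assert np.array_equal(frame, rebuilt)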

Note that I do not try to muddy this topic with problems around 3:2 pull down (NTSC) and that we're explicitly not discussing real time moving images (TV and video recorder).

Interleaved vs. progressive, apples to apples.


Cees

#15 of 33 OFFLINE   DaViD Boulet

DaViD Boulet

    Lead Actor

  • 8,805 posts
  • Join Date: Feb 24 1999

Posted July 11 2006 - 04:04 AM

Cees,

Only talking about film-source material here:

Can your eyes see the "line twitter" with a 480i signal on an NTSC television versus the same signal deinterlaced (properly) to 480p?

Yes. You see twitter on fine detail... especially horizontal lines that are captured for one sixtieth of a second by one field and then dropped for the next sixtieth in the next field... Venetian blinds are the most obvious offender in 480i sources, and images with content like tweed jackets are often filtered first to remove the fine detail, to avoid moiré and twitter when viewed in 480i mode.

1080i has the same 60Hz "refresh" rate as 480i. The same line-twitter on fine horizontal detail is visible. In fact, this is why so many older HD transfers are vertically filtered... to minimize aliasing and twitter on interlaced displays.
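
If you want to see the mechanism in isolation, here's a rough NumPy sketch (the frame size and line number are made up purely for illustration): a one-line-thick detail lands entirely in one field, so an interlaced display lights it in one field and drops it in the next, while a progressive display shows it on every refresh.

Code:
import numpy as np

# A frame with a single 1-pixel-thick horizontal detail on an odd scan line
# (think: one slat of a Venetian blind, or the edge of fine text).
frame = np.zeros((480, 640), dtype=np.uint8)
frame[101, :] = 255

odd_field = frame[1::2]     # this field contains the detail
even_field = frame[0::2]    # this field does not

# Interlaced display: the detail is lit in one field, gone in the next,
# so it blinks at half the field rate - that's line twitter.
print(odd_field.max(), even_field.max())   # 255 0

# A progressive display of the same frame shows the detail on every refresh,
# which is why the twitter disappears once you're really looking at 480p/1080p.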


Quote:
Once the image is reconstructed, it's no longer possible to tell how (in what order) it was transmitted.

Then it's now a progressive-scan image. Not an interlaced one. Sure. That's why I've posted time and time again that *with proper deinterlacing*, a 1080i feed (from film) can become perfect 1080p source material for a 1080p display.

When I say that 1080i doesn't look as good as 1080p I'm talking about *watching* it in 1080i mode... not progressive. Naturally with proper deinterlacing it looks like 1080p... because then it *is* 1080p!


However, anyone with a traditional CRT HD set will see the 1080i in native mode, and most current progressive-display sets like plasmas etc. don't apply the proper deinterlacing algorithm to restore the true 1080p original on the screen, so it's not a given that anyone with a plasma will see "real 1080p" when feeding it a 1080i signal.



Quote:
More modern monitors present the image at a much higher frame-rate (~ 100Hz). CRT-type displays still have to "paint" their image (top-down), but LCD- and Plasma-screens don't have to do that. Those images are read from a cache memory.

There are many ways to deinterlace an image in order to "paint" a solid frame on the screen, and they are NOT all equal.

With film-based material interlaced at 60Hz, only inverse telecine (3-2 reversal) can do it right.
Bobbing (interpolating) merely softens the image (but reduces jaggies), and interweaving fields without first identifying the proper frame pairs and the "odd man out" extra field from the 3-2 repetition results in combing artifacts during motion which are very distracting.


p.s. I understand the process of scanning a film frame at the 1920 x 1080 level and then splitting the "lines" into odd/even fields for transmission, and how no actual information is lost. That's not the problem... the problem lies in how most consumer displays handle the digital processing of the interlaced signal when converting back to 1080p for display. The "reconstruction" you talk about, which folds the fields back into the original frames, is what is meant by "inverse telecine" or "3-2 pulldown reversal" where 24fps film-based source material is concerned. Any other method used to digitally convert the 1080i to 1080p for display purposes will degrade the image and NOT look as good as the original 1080p, and in many instances will not look as good as the raw (unprocessed) 1080i!
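
To make that concrete, here's a rough little Python sketch (purely illustrative - not any particular chip's algorithm) of why 3-2 reversal gets the original frames back while blindly weaving adjacent fields does not:

Code:
# Film frames at 24fps; 3-2 pulldown turns them into 60Hz fields, with each
# frame alternately contributing 3 fields and 2 fields (top/bottom alternating).
film = ["A", "B", "C", "D"]

def telecine_32(frames):
    fields = []
    for i, f in enumerate(frames):
        for _ in range(3 if i % 2 == 0 else 2):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((f, parity))
    return fields                      # 10 fields for every 4 film frames

fields = telecine_32(film)

# Inverse telecine (3-2 reversal): recognize which fields came from the same
# film frame, drop the repeats, and weave matching pairs back together.
recovered = []
for frame_id, _ in fields:
    if not recovered or recovered[-1] != frame_id:
        recovered.append(frame_id)
print(recovered)    # ['A', 'B', 'C', 'D'] - the original film frames, intact

# "Dumb" weave: just pair up adjacent fields. Some pairs mix two different
# film frames, which is exactly the combing you see during motion.
naive_pairs = [(fields[i][0], fields[i + 1][0]) for i in range(0, len(fields), 2)]
print(naive_pairs)  # [('A','A'), ('A','B'), ('B','C'), ('C','C'), ('D','D')]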
Be an Original Aspect Ratio Advocate

Supporter of 1080p24 video and lossless 24 bit audio.

#16 of 33 OFFLINE   ChristopherDAC

ChristopherDAC

    Producer

  • 3,729 posts
  • Join Date: Feb 18 2004

Posted July 11 2006 - 08:49 AM

Incidentally, I caught something from one of Cees' most recent posts. Namely, he's assuming that television technology in Europe and America is the same. In fact, the equivalent of the European 100 Hz set, which would be a CRT set displaying 120 interlaced fields per second, has never been seen in the US to the best of my knowledge.
60fps progressive, in fact, seems to be quite difficult to find in an American tube set [most of them seem to display 480p60 {or 540p60, on certain Toshiba sets} but convert 720p60 to 1080i30], despite higher scanrates being quite common among CRT computer monitors. If each field in a 1080i signal is being displayed twice in 1/25 of a second "over there", then the results are going to look quite different from displaying each once. I'm not sure about CRTs with 3:2 pulldown correction ; I'm guessing 480p72 but am not sure.

#17 of 33 OFFLINE   Cees Alons

Cees Alons

    Executive Producer

  • 18,640 posts
  • Join Date: Jul 31 1997
  • Real Name:Cees Alons

Posted July 11 2006 - 11:34 AM

Quote:
In fact, the equivalent of the European 100 Hz set, which would be a CRT set displaying 120 interlaced fields per second, has never been seen in the US to the best of my knowledge.
Christopher, yes, I was wondering about that while I wrote it down. We have had them for more than 15 years already.
By now they have become standard - except that, in turn, CRT TV sets are being phased out.


Cees

#18 of 33 OFFLINE   ChristopherDAC

ChristopherDAC

    Producer

  • 3,729 posts
  • Join Date: Feb 18 2004

Posted July 11 2006 - 12:11 PM

Quote:
Best home CRT set ever made I believe!
If only it weren't too big for my purposes…

Cees, I don't know why the idea never caught on over here. I assume that it had something to do with the considerably greater immunity of 60 Hz to flicker over 50 Hz, although anybody who uses a computer monitor knows that's not always enough. As a matter of fact it's worse with widescreen because [I find] peripheral vision is more sensitive to flicker — rear-projection sets in bright showrooms give me the pip.
Also, the comments I've read about the European "100 Hz technology" have not always been favourable : in particular, I think there's a complaint that the inversion of the field cycle messes up motion rendering from video. I suppose there are about three ways of displaying 100 Hz from PAL video : two kinds of field repetition, either repeating the odd field and then the even [1,1,2,2,3,3,4,4…] or transposing the fields [1,2,1,2,3,4,3,4,…] ; or averaging fields from successive frames [1,2,{1+3},{2+4},3,4,{3+5},{4+6},…]. None would be quite satisfactory with video motion, though the repetition methods would work fine with film-source material [the first still makes an ugly scan-pattern]. Over here, for whatever reason, the emphasis in "EDTV" work was always on progressive scan.
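
Just to spell those three orderings out (field numbers only, nothing vendor-specific - a rough sketch of the patterns I mean):

Code:
# PAL fields numbered 1, 2, 3, 4, ... (odd = first field of a frame, even = second).
# Three ways of getting 100 fields/sec out of a 50-field/sec signal:

def repeat_each_field(fields):           # 1,1,2,2,3,3,4,4,...
    return [f for f in fields for _ in range(2)]

def transpose_fields(fields):            # 1,2,1,2,3,4,3,4,...
    out = []
    for i in range(0, len(fields) - 1, 2):
        out += [fields[i], fields[i + 1]] * 2
    return out

def average_successive(fields):          # 1,2,{1+3},{2+4},3,4,{3+5},{4+6},...
    out = []
    for i in range(0, len(fields) - 3, 2):
        out += [str(fields[i]), str(fields[i + 1]),
                "{%d+%d}" % (fields[i], fields[i + 2]),
                "{%d+%d}" % (fields[i + 1], fields[i + 3])]
    return out

fields = [1, 2, 3, 4, 5, 6]
print(repeat_each_field(fields))   # [1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6]
print(transpose_fields(fields))    # [1, 2, 1, 2, 3, 4, 3, 4, 5, 6, 5, 6]
print(average_successive(fields))  # ['1', '2', '{1+3}', '{2+4}', '3', '4', '{3+5}', '{4+6}']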

#19 of 33 OFFLINE   Cees Alons

Cees Alons

    Executive Producer

  • 18,640 posts
  • Join Date: Jul 31 1997
  • Real Name:Cees Alons

Posted July 11 2006 - 12:35 PM

Christopher,

To the very best of my knowledge, the image is stored in a (full frame) memory and displayed twice before it's replaced (probably at least 2 frame memories).
David would call this de-interlaced (I wouldn't).

(Those TVs commonly also have "PIP" (picture-in-picture) allowing a second picture to be displayed in a corner - proving that it's in a memory).

So, in your notation, that would be 1+2, 1+2, 3+4, 3+4, etc.
With that frame rate, it wouldn't be necessary to interleave the image on screen. However, if they have chosen to do so anyway, that would pose no problem (reading it from memory any way they want it).

For about five years now there has also often been logic available to interpolate motion changes and thus make those motions visibly smoother (especially important with real-time TV, like sport). The chip doing that was developed by Philips and is called the "Falcon". Personally, I'm suspicious of processing the image like that, and I wonder if I want it for films on DVD (it can be switched off).


Cees

#20 of 33 OFFLINE   DaViD Boulet

DaViD Boulet

    Lead Actor

  • 8,805 posts
  • Join Date: Feb 24 1999

Posted July 11 2006 - 12:39 PM

Quote:
I have a HTPC hooked up to it, its much easier to read text and such in 720p than 1080i (impossible in 1080i for small text: too much flicker)

That last "small text: too much flicker" is what I'm talking about with line-twiter on fine object detail in 1080i.

That's where 1080p will look better.

Cees, I can see where the difference in display technology on our different shores has caused some miscommunication. Yes, almost all consumer display gear in the U.S. locks in at 60Hz... even DLP/LCD projectors that in theory could have any refresh rate *easily*. Grrr!

I'm hopeful that with "1080p24" being bandied about these days, we'll see some cool high-end displays that can lock at native multiples of 24: 24, 48, 72, etc. With constant "on" displays like bulb-based projectors or LCD screens there's no flicker even at lower rates, though higher rates make it easier to mix content from various sources (like a 60Hz 1080i live feed with a 1080p24 film feed).
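
A quick way to see why those 24-based rates are attractive (just arithmetic, nothing display-specific): at a multiple of 24 every film frame gets the same number of refreshes, while 60Hz forces the uneven 3-2 alternation.

Code:
# Refreshes per 24fps film frame at a few candidate display rates.
for hz in (48, 60, 72):
    per_frame = hz / 24
    print(hz, per_frame, "even cadence" if per_frame.is_integer() else "needs 3:2 pulldown")
# 48 2.0 even cadence
# 60 2.5 needs 3:2 pulldown
# 72 3.0 even cadence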

In any case, deinterlacing of "interlaced" signals is still the key!
Be an Original Aspect Ratio Advocate

Supporter of 1080p24 video and lossless 24 bit audio.

