
1080p

Discussion in 'Displays' started by MikeSh, Nov 20, 2005.

  1. MikeSh

    MikeSh Well-Known Member

    Joined:
    Oct 11, 2000
    Messages:
    120
    Likes Received:
    0
    I've been looking at the Samsung 1080p DLP RP units and had a chance to see one of these in a 56" model next to a Samsung 720p 50" model (DLP RP).

    The 1080p had a noticeably better picture, so I'd probably go that route.

    But then I found out from reading here that the 1080p sets currently on the market only accept 1080i inputs (among others) and internally convert the signal to 1080p. I then read the same info on the Samsung website.

    Does anyone know if sets that accept 1080p inputs will be available any time soon? Will HD DVD or Blu-ray output 1080p?

    Do I really lose much quality inputting 1080i and having the TV convert it to 1080p?

    MikeS.
     
  2. Allan Jayne

    Allan Jayne Well-Known Member

    Joined:
    Nov 1, 1998
    Messages:
    2,406
    Likes Received:
    0
    I would say that these 1080p DLP TVs give you the benefits of DLP and the benefits of "1080", but only some of the benefits of "p".

    To begin, the 1080p TV must convert everything to 1080p because that is all the DLP display element can do.

    (copied from another post)

    You will get a stunning picture, no doubt. But with very few exceptions (I think Sony has one), you cannot later upgrade to the best you could possibly get using add-ons if your 1080p TV does not accept 1080p input.

    I don't know how well your TV does it, but some 1080i-to-1080p conversions out there do only interpolation, and those turn a torture-test pattern of stacked alternating white and black lines into a gray mishmash.

    I would wait.

    Today, for $1500, Lumagen has an external de-interlacer that gives far better 1080i-to-1080p conversion than 99% of the 1080p TVs (and projectors) out there. For that it needs a 1080p input on the TV. It also eliminates the need for a progressive-scan DVD player, and makes most interlaced players outdo most progressive players out there. For all I know it will outdo the 480i-to-480p stage preceding the 480p-to-1080p (or 480p-to-1080i, or 480p-to-720p) stage in your TV as well, if that 480i-to-480p stage is not Faroudja or Silicon Image.

    As it turns out, converting 1080i into 720p for the 720p model you saw faces the same de-interlacing hurdles as converting 1080i into 1080p. The Lumagen "HDP" unit surmounts this; 99% of the TVs out there today don't. What you saw comparing those two Samsung models was just the difference between "720" and "1080".
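    A toy sketch in Python of the torture test described above (the field sizes and pixel values are made up for illustration): a proper weave re-interleaves the two fields of a static pattern and keeps the stripes, while a blend/interpolation-style conversion averages them into gray.

```python
# Two fields of a static 1080i torture pattern: white lines in one field,
# black lines in the other (tiny 4-line fields for illustration).
even_field = [255, 255, 255, 255]  # white picture lines
odd_field = [0, 0, 0, 0]           # black picture lines

# Weave: re-interleave the fields into their original line positions.
weave = [line for pair in zip(even_field, odd_field) for line in pair]

# Blend/interpolate: average the fields instead of recombining them.
blend = [(e + o) // 2 for e, o in zip(even_field, odd_field)]

print(weave)  # [255, 0, 255, 0, 255, 0, 255, 0] -- stripes preserved
print(blend)  # [127, 127, 127, 127] -- uniform gray mishmash
```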

    Video hints:
    http://members.aol.com/ajaynejr/hdtvnot.htm
     
  3. Dennis Oblow

    Dennis Oblow Well-Known Member

    Joined:
    Apr 26, 1999
    Messages:
    143
    Likes Received:
    0
    Real Name:
    Dennis
  4. John Whittle

    John Whittle Well-Known Member

    Joined:
    Mar 22, 2004
    Messages:
    185
    Likes Received:
    0


    The HP DLP set is supposed to accept 1080p over HDMI, and it's rumored that the new JVC D-ILA 1080p sets MIGHT accept 1080p (there are just too few to test, and there is a question about what you would use to test with).

    Right now the only source seems to be computer output, and the question is whether 1080p over HDMI from a computer will be the same as from a standalone device that outputs 1080p. Perhaps the PS3 will have 1080p output, and we'll know soon.

    As for Blu-ray or HD DVD, it certainly would be possible with a film source and hard with a live video source. Of course, if it's a film source and flagged, it's an open question whether the DVD player or the display device will do a better job of reconstructing a 1080p signal.

    The last I heard, Microsoft said there were problems with Blu-ray and that it would only support HD DVD in the next version of Windows. There is also speculation that HDCP, which is controlled by Intel, might be one of the reasons behind the Macintosh move to Intel chips.

    So while some would say wait, I'd suggest you decide what your input sources will be before you buy. There is little chance you'll see any live HDTV at 1080p due to bandwidth and compression problems, and if it's a film source at high definition, there is probably little need for a 1080p input, since the "p" can be rebuilt from the encoded signal anyway.

    John
     
  5. Mike Milillo

    Mike Milillo Well-Known Member

    Joined:
    Apr 14, 2003
    Messages:
    142
    Likes Received:
    0
    Very informative post, John. Thank you.
     
  6. Dan Hitchman

    Dan Hitchman Well-Known Member

    Joined:
    Jun 11, 1999
    Messages:
    2,714
    Likes Received:
    0
    However, a 1080i source is filtered to lessen interlaced artifacts such as jaggies and twitter. You lose resolution that way. Even if you had perfect weaving of 1080i back to 1080p, you wouldn't get the lost detail back.

    The HDMI capability you want in a TV is the ability to accept a 1080p/60 Hz input.

    Also, a TV must be able to correctly de-interlace 1080i sources. Most sets bob 1080i, so the scaler only deals with 540p, rather than weaving the signal to 1080p and then scaling. In that case you get a lesser 1080i image even on a 720p or so-called 1080p native set (basically you're only getting a portion of the actual detail contained in the signal!). In a recent Perfect Vision magazine test, most sets sent for review failed the 1080i de-interlacing test. Most newer Toshiba HDTVs did quite well with 1080i, as did sets from a couple of other manufacturers.
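    A back-of-the-envelope sketch in Python of the bob-versus-weave point above (the line counts come straight from the 1080i format; nothing else is assumed):

```python
# Unique picture lines the scaler starts from, depending on how the set
# handles a 1080i signal.
FRAME_LINES = 1080
FIELD_LINES = FRAME_LINES // 2  # each interlaced field carries 540 lines

# Bob: each field is scaled on its own, so only 540 real lines survive.
bob_input_lines = FIELD_LINES

# Weave: matching fields are recombined first, so all 1080 lines survive.
weave_input_lines = FIELD_LINES * 2

lost = 1 - bob_input_lines / weave_input_lines
print(bob_input_lines, weave_input_lines)  # 540 1080
print(f"bob discards {lost:.0%} of the vertical detail")  # 50%
```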

    I'd wait until next year's models to see if sets improve and include 1080p/60 Hz HDMI inputs for Blu-ray players and the PlayStation 3.

    Dan
     
  7. ChrisWiggles

    ChrisWiggles Well-Known Member

    Joined:
    Aug 19, 2002
    Messages:
    4,791
    Likes Received:
    1
    Dan, 1080p/60 is not the only concern. Presumably the more important consideration is film-based content, in which case you will want to be able to input 24p, 24psf, and 48p directly via external scaling; 1080p/60 alone is not that useful unless all you watch is video-based content through an external scaler.
     
  8. Dan Hitchman

    Dan Hitchman Well-Known Member

    Joined:
    Jun 11, 1999
    Messages:
    2,714
    Likes Received:
    0
    This is true... it depends on whether the player will convert everything encoded on a disc to 1080p at 60 Hz, or output the raw data and let the display handle the frame-rate and interlaced/progressive conversions (no conversion should be done to a 1080p source if the display can accept 1080p video).

    If not, then the TV must accept whatever the player or other source device is outputting and convert it to 1080p/60 itself.

    There are too many variables right now, so I still think waiting until Blu-ray is settled before buying a new digital TV is a good idea. That should be some time next year.

    Dan
     
  9. ChrisWiggles

    ChrisWiggles Well-Known Member

    Joined:
    Aug 19, 2002
    Messages:
    4,791
    Likes Received:
    1
    The TVs shouldn't convert everything internally to 60 Hz, though; they should be able to display at both film and video rates, otherwise this limitation doesn't really matter at all. I don't agree with waiting for HD DVD or Blu-ray before buying a new display. There are a lot of people with decade-old CRTs (like me!) who will quietly figure out ways to see HD resolutions.
     
  10. Dan Hitchman

    Dan Hitchman Well-Known Member

    Joined:
    Jun 11, 1999
    Messages:
    2,714
    Likes Received:
    0
    Chris,

    You wouldn't want to see 24 fps displayed at a plain 24 Hz refresh (motion-picture projectors have quick shutters that flash each frame twice, so you get the approximation of 48 fps, in order to lessen judder)! You'd have motion artifacts up the wazoo! What we need are displays that can run at higher refresh rates that are exact multiples of the various source frame rates, so we don't have odd 2:3 pulldown conversions and whatnot.

    Dan
     
  11. ChrisWiggles

    ChrisWiggles Well-Known Member

    Joined:
    Aug 19, 2002
    Messages:
    4,791
    Likes Received:
    1
    Yes, but you can run a digital display at a 24 Hz refresh; it's really the same thing as film material played at 48 Hz, if you think about it, since 48 Hz just repeats frames and accomplishes the same thing. You can't run a CRT that slow (24 Hz), however, because it depends on the scan rate to maintain the image temporally, while a digital display does not. 48 Hz doesn't have any effect on judder if the source is 24 fps; what you see is still 24 fps judder. There is no interframe interpolation going on: it's still 24 fps at 48 Hz, not 48 fps at 48 Hz, so the inherent 24 fps judder is not affected. Obviously, using a film multiple (24, 48, or 72 Hz) eliminates the 2:3 judder you get if you play film material at video rates. (Gotta make sure I get my "fps" and "psf" right in these kinds of discussions... it's confusing enough already!)
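    The film-multiple point can be checked with a small sketch (a hypothetical helper in Python): count how many refresh ticks each 24 fps film frame gets at various display rates. An even count means simple frame repetition; an uneven count is the 2:3 pulldown cadence that causes judder.

```python
def cadence(fps, hz, frames=4):
    """Refresh ticks assigned to each of the first `frames` film frames
    over one second of display refreshes."""
    ticks = [0] * frames
    for t in range(hz):
        frame = t * fps // hz  # which film frame is showing at tick t
        if frame < frames:
            ticks[frame] += 1
    return ticks

print(cadence(24, 48))  # [2, 2, 2, 2] -> even: plain frame repetition
print(cadence(24, 72))  # [3, 3, 3, 3] -> even: a film multiple again
print(cadence(24, 60))  # [3, 2, 3, 2] -> uneven: 2:3 pulldown cadence
```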
     
