Basic HD DVD question

Discussion in 'Playback Devices' started by Gary Strang, Nov 26, 2005.

  1. Gary Strang

    Gary Strang Auditioning

    Joined:
    Nov 22, 2005
    Messages:
    1
    Likes Received:
    0
    Hello everyone, I just got a new HDTV (Toshiba 52HM84), which I am very happy with, and I have a question about high-def DVD players.

    I'm looking around online and seeing high-definition DVD players that will upscale a standard DVD image to 720p and/or 1080i for a purportedly far better picture. These players aren't prohibitively expensive, starting at around $199, but I'm wondering if these claims of producing a far better picture on an HDTV are fanciful.

    Does anyone have experience playing DVDs on one of these players who can tell me whether the increase in picture quality on an HDTV is worth it?

    For reference, the players I'm interested in are the Samsung DVD HD-841 and the Toshiba SD-4980.
     
  2. ChrisWiggles

    ChrisWiggles Producer

    Joined:
    Aug 19, 2002
    Messages:
    4,791
    Likes Received:
    1
    Not familiar with your display, but I googled it and it looks like a DLP RPTV. In that case it scales everything to its native resolution. That is all the DVD player would do as well; bypassing the TV's internal scaling with better-quality scaling would improve things, but that depends on how the video processing in the upscaling DVD player compares with what is built into your display.

    Yes, it is best to 1:1 pixel map to your display with high-end processing, but most cheaper upscaling players aren't going to be much different from what's already in your display. You will have to compare for yourself.
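
    Just to illustrate the "quality of the scaler" point, here's a rough sketch (mine, not anything from an actual player; the sine-wave "scan line" and the two interpolators are only stand-ins) comparing a crude scaler with a slightly better one when a 480-sample line is taken up to 1280 samples:

        import numpy as np

        # Fake detail on one 480-sample scan line, plus the same detail
        # sampled natively at 1280 points to compare against.
        line_480 = np.sin(np.linspace(0, 8 * np.pi, 480))
        reference_1280 = np.sin(np.linspace(0, 8 * np.pi, 1280))

        x_480 = np.linspace(0.0, 1.0, 480)
        x_1280 = np.linspace(0.0, 1.0, 1280)

        # A crude scaler (nearest neighbour) vs. a slightly better one (linear).
        nearest = line_480[np.clip(np.round(x_1280 * 479).astype(int), 0, 479)]
        linear = np.interp(x_1280, x_480, line_480)

        print("nearest-neighbour error:", np.abs(nearest - reference_1280).mean())
        print("linear error:           ", np.abs(linear - reference_1280).mean())

    Whichever box runs the better algorithm, player or TV, is the one you want doing the work; the other should just pass pixels through.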

    Lastly, these are upscaling DVD players, not HD-DVD players; that's a completely different animal.
     
  3. Stephen Hopkins

    Stephen Hopkins HW Reviewer

    Joined:
    Jul 19, 2002
    Messages:
    2,598
    Likes Received:
    0
    One thing to also consider is that scaling players (usually) use a digital output (DVI or HDMI). When connected to a digital display via component, you're converting from digital to analog at the source, transmitting, then converting from analog back to digital at the display. With a digital display and a digital output/input all the way from source to display, you're eliminating two conversions, one D-A and one A-D. The elimination of these conversions is far more likely to improve your picture quality than the actual scaling of the image.
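
    To put a number on the conversion argument, here's a toy sketch (mine, with a made-up noise figure for the analog leg; real cables and converters will differ) that pushes 8-bit pixel values through a simulated D-A / A-D round trip versus leaving them digital:

        import numpy as np

        rng = np.random.default_rng(0)
        pixels = rng.integers(16, 236, size=100_000)   # 8-bit studio-range video values

        # Pure digital path (DVI/HDMI): values arrive untouched.
        received_digital = pixels.copy()

        # Component path: DAC to a 0-1 "voltage", a little assumed analog noise,
        # then ADC back to 8 bits at the display.
        voltage = pixels / 255.0 + rng.normal(0.0, 0.002, size=pixels.shape)
        received_analog = np.clip(np.round(voltage * 255.0), 0, 255).astype(int)

        print("digital path samples changed:", np.count_nonzero(received_digital != pixels))
        print("analog path samples changed: ", np.count_nonzero(received_analog != pixels))

    Whether errors at that level are ever visible on a real display is another question.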
     
  4. ChrisWiggles

    ChrisWiggles Producer

    Joined:
    Aug 19, 2002
    Messages:
    4,791
    Likes Received:
    1

    I would not agree with this. Both transmission methods are fully capable of transmitting the image cleanly. Only if the implementation of a certain input is poor would there even be differences at the display between inputs. I've seen DVI transmission to digital displays be visibly inferior to analog YPbPr input, and that has to do with input/output implementation, not at all with DAC issues. People blow the "digital transmission" thing way out of proportion, IMO.
     
  5. Stephen Hopkins

    Stephen Hopkins HW Reviewer

    Joined:
    Jul 19, 2002
    Messages:
    2,598
    Likes Received:
    0
    Implementation is assuredly key to any signal transmission. There are inferior digital sources, but A/D and D/A conversions are most definitely the largest source of picture quality degradation. Any time you convert from digital to analog or vice versa there is a loss of the original signal. Whether it is visible/audible depends on implementation. But a pure digital signal will always be superior to THE SAME signal converted D-A, or D-A + A-D. That said, comparing two different players, or even digital vs. analog on the same player, can still yield different results, because implementation and SEVERAL other variables in the signal path prevent you from ever being able to compare THE SAME signal. Experimentation is always key to finding what looks/sounds best to any given user.
     
  6. ChrisWiggles

    ChrisWiggles Producer

    Joined:
    Aug 19, 2002
    Messages:
    4,791
    Likes Received:
    1

    It is this that I disagree with. Analog transmission is fully capable of sending the signal pixel for pixel. Digital transmission is fully capable of sending the signal pixel for pixel. I will continue to disagree with the notion that digital transmission is necessarily and inherently superior when it comes to video, because it is wrong, and it is *certainly* not the "largest source of PQ degradation."

    Yes, I agree that DAC/ADC design is complex, but there are a zillion other problems you will find in consumer gear that are significantly more important and commonly done incorrectly. That is why there are times when analog inputs are preferred and times when digital inputs are preferred, and it has nothing to do with whether the signal is inherently digital or analog.

    Here's a question, for instance: which is better, analog RGB transmission or HDMI YCbCr digital component video transmission? I would say analog RGB, since YCbCr is a compressed format, but then that limitation has already been imposed by the DVD encode. These are all reasons why I find it ridiculous when people say "digital signal = better," because digital or analog has almost nothing to do with it. Is digital composite video better than HD out of an analog YPbPr output? The whole "digital is better than analog" argument falls apart because it's a nonsensical comparison.
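
    To make the YCbCr point concrete, here's a small sketch (my own numbers; BT.601-style coefficients and a simple 4:2:0-style subsample, not anything a specific player does) showing that the chroma "compression" is baked in before any cable gets involved:

        import numpy as np

        rng = np.random.default_rng(1)
        rgb = rng.integers(0, 256, size=(8, 8, 3)).astype(float)   # tiny 8x8 test patch

        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        y  = 0.299 * r + 0.587 * g + 0.114 * b      # luma, kept at full resolution
        cb = 0.564 * (b - y)                        # chroma difference signals
        cr = 0.713 * (r - y)

        # 4:2:0-style subsampling: one chroma sample per 2x2 block, repeated back out.
        cb_sub = np.repeat(np.repeat(cb[::2, ::2], 2, axis=0), 2, axis=1)
        cr_sub = np.repeat(np.repeat(cr[::2, ::2], 2, axis=0), 2, axis=1)

        r2 = y + 1.402 * cr_sub                     # convert back to RGB
        g2 = y - 0.344 * cb_sub - 0.714 * cr_sub
        b2 = y + 1.772 * cb_sub

        error = np.abs(np.stack([r2, g2, b2], axis=-1) - rgb)
        print("max RGB error introduced by chroma subsampling:", error.max())

    Run it without the subsampling step and the round trip is essentially lossless, which is the point: the format decision, not the wire, is what costs you.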
     
  7. Stephen Hopkins

    Stephen Hopkins HW Reviewer

    Joined:
    Jul 19, 2002
    Messages:
    2,598
    Likes Received:
    0
    There's no way you can completely represent an analog waveform digitally. There are always going to be compromises in the signal. HOW the conversion is done is what matters. The same is true for HOW a purely digital signal is handled. A PURE digital signal from a digital source will ALWAYS be truer to the source than THE SAME signal converted to analog, and especially than one converted from analog back to digital again. The comparison is in no way nonsensical. It's not apples-to-apples, but it is definitely a comparison of two ways of doing the same job. If, for a transmission of a digital source, EVERY other variable somehow did not affect the signal, and the received signal were compared to the original, there's no way a D-A + A-D converted signal could be truer than the direct digital path. HOW consumer gear does this leaves so many other variables out there that you can get a wide variety of results with both digital and analog transmission.
     
  8. ChrisWiggles

    ChrisWiggles Producer

    Joined:
    Aug 19, 2002
    Messages:
    4,791
    Likes Received:
    1
    It is nonsensical, because you are not at all considering how quantization noise actually impacts the signal. You are taking the simplistic approach of "gah, there's quantization noise!" and concluding the digital signal is better. If the quantization noise doesn't degrade the signal transmission, i.e. the image is reassembled fully pixel for pixel at the other end, then your concern *is* nonsensical.


    It's these kinds of statements that are silly, because they are meaningless. What are the capabilities of the digital system? What's the sampling rate, what's needed to preserve the signal, and is it enough? What are the capabilities of the analog system? What's needed? I see absolutely no consideration of those questions. I can assure you that video signals can be transferred from one place to another very cleanly in analog formats such as YPbPr or RGB, and they have been for decades. They can also be transmitted from one place to another digitally using the same formats, and they likewise have been for years. I'm sure tons of the content you are seeing now has been through analog/digital conversions many times. Egads, your concern is misinformed and simplistic. I apologize if I'm blunt here; it's just that people keep making these kinds of statements because they see a little picture in a book about sampling or read something about sampling theory and think they know what they're talking about, then apply that to all of audio and video and say "LP is better because blah blah blah digital" or "digital is better because blah blah blah," and it's almost always absurd on its face.
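
    For anyone who wants actual numbers behind those questions, the textbook figures are easy to work out (my arithmetic, not something measured on any particular gear):

        # Ideal quantization SNR for a full-scale signal: ~6.02 dB per bit + 1.76 dB.
        for bits in (8, 10):
            snr_db = 6.02 * bits + 1.76
            print(f"{bits}-bit video: {2 ** bits} levels per sample, ~{snr_db:.1f} dB SNR")

    DVD video is 8-bit to begin with, so that quantization floor exists before any cable is involved; no choice of connection adds it or takes it away.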
     
  9. Stephen Hopkins

    Stephen Hopkins HW Reviewer

    Joined:
    Jul 19, 2002
    Messages:
    2,598
    Likes Received:
    0
    YOU'RE NOT READING WHAT I'M SAYING. Analog transmission can be extremely clean. I'm simply saying that an analog transmission of a digital signal introduces a source of noise that is not there in a purely digital path. There's no way it is even theoretically possible for a digital signal converted to analog and then back to digital to be truer to the source than an UNCOMPROMISED digital feed. It's just not possible. Whether it's visible/audible depends on the severity of the compromise, which is purely a function of how the hardware handles the conversion. I'm sitting here agreeing with you that implementation is most definitely the most important factor.

    Also, I'm a senior Aerospace Engineering student at Georgia Tech focusing on flight control systems, a focus requiring an understanding of both digital and analog electronics as well as DSP and signal transmission. You shouldn't assume someone's understanding is based on what's thrown at them by electronics manufacturers.
     
  10. ChrisWiggles

    ChrisWiggles Producer

    Joined:
    Aug 19, 2002
    Messages:
    4,791
    Likes Received:
    1
    And what I'm saying is that such a statement is meaningless with regard to consumer electronics, in particular the video transmission we were discussing here. Yes, you are right that any conversion from digital to analog is a compromise in the strict sense. My point is that it's a minuscule source of problems in getting the video intact from the source to the display.

    It's like arguing that a hand-cut tire tread is better than what you can buy off the shelf when we're discussing a Ford Festiva. True, perhaps, but the difference isn't really the problem.

    In video, it's a straw in the wind compared to everything else that's damaging the signal, and that is certainly the case with consumer gear and the implementation of those circuits. THAT's what I'm saying.

    You might also be unaware that at long distances, analog RGB or YPbPr may be preferred, because DVI, for instance, can start to have all kinds of problems. This is why the digital/analog thing is really moot.
     
  11. Allan Jayne

    Allan Jayne Cinematographer

    Joined:
    Nov 1, 1998
    Messages:
    2,404
    Likes Received:
    0
    When the A/D conversion chops up the analog waveform into about 1-1/2 times the number of pixels implied by the source (the reciprocal of the Kell factor) or more, the degradation is likely to be negligible. An HDTV is more likely to work with a pixel width (per scan line) well in excess of 720 (native DVD) from the get-go, and probably works with the same pixel width for the entire digital portion of the signal path, that pixel width being the horizontal pixel count of the display.
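
    A quick worked example of that (my numbers, assuming the commonly quoted Kell factor of roughly 0.7 and a typical 1280-pixel-wide DLP panel):

        kell_factor = 0.7
        dvd_width = 720                            # native DVD pixels per scan line
        samples_needed = dvd_width / kell_factor   # reciprocal of the Kell factor, ~1.43x

        panel_width = 1280                         # assumed horizontal pixel count of the display
        print(f"samples needed: ~{samples_needed:.0f}, panel provides: {panel_width}")
        print("oversampled enough:", panel_width >= samples_needed)

    About 1,029 samples needed versus 1,280 available, so the digitizing step itself shouldn't be the weak link.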

    When you buy an upscaling DVD player you are stuck with the de-interlacing the player comes with if you use the upscaling. If you are dissatisfied, you have to take 480i out of the player and feed it into a better de-interlacer, perhaps the one in your TV.

    Video hints:
    http://members.aol.com/ajaynejr/video.htm

    Kell Factor -- The (subjective) ratio of the perceived lines of resolution of a digitized picture to the number of pixels in that direction (horizontally, etc.) over the same distance. (Lines of resolution properly must be specified relative to a distance, such as centimeters or screen height.)
     
  12. Stephen Hopkins

    Stephen Hopkins HW Reviewer

    Joined:
    Jul 19, 2002
    Messages:
    2,598
    Likes Received:
    0
    Your points about the ability to perceive the difference, and the fact that there are many other factors contributing to signal degradation, are valid... they are also acknowledged in my posts, as is the fact that you need to experiment and see what gives you the best PQ. If it were a moot point, then experimentation to find what works best would be pointless as well, since either transmission technique would work equally well.

    Also, 720p and 1080i DVI have been proven to work well beyond the 5m limit recommended by the DVI spec, and definitely far enough to be used successfully in most HT installations, even ceiling- or rear-wall-mounted front projection setups. For 1080p in the future, the usable transmission distance will most likely be reduced. Still, almost any instance where DVI or HDMI would be used will fall within their acceptable distance limitations. You should really stop assuming people don't know the things you do.
     
  13. ChrisWiggles

    ChrisWiggles Producer

    Joined:
    Aug 19, 2002
    Messages:
    4,791
    Likes Received:
    1
    And Steve, I do apologize for being blunt in my posts; it's just that this is a common kind of language that simplifies things too much, and it's a little frustrating from my perspective sometimes, because people draw erroneous conclusions from discussions like this.

    Analog transmission can resolve per-pixel 1080p very nicely. Analog circuitry in displays is a more complex matter, but that isn't a DAC issue, IMO. Still, we have computer monitors that have been displaying resolutions like 1600x1200 at very fast refresh rates, progressive, with analog transmission over crappy VGA connectors after the DAC, with no problems, for many years now. The difference is ALL in the implementation, and not so much whether the signal is digital or not, or whether it originated as digital or analog. If you tell people that DAC/ADC necessarily risks compromising the signal (which is true in a broad sense, but dangerous, because people will twist it in all kinds of ways), then they will use DVI to a digital display based on what you've said, without realizing that very often DVI may actually be worse, as has certainly been the case on certain components and displays.
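
    The computer-monitor comparison is easy to quantify (my arithmetic; the raster totals are the standard published timings, assuming I'm remembering them correctly):

        # Pixel clocks = total raster (including blanking) x refresh rate.
        vga_1600x1200_85 = 2160 * 1250 * 85    # VESA timing, ~229.5 MHz over analog VGA
        hd_1080p60       = 2200 * 1125 * 60    # standard 1080p60 raster, 148.5 MHz

        print(f"1600x1200@85 over analog VGA: {vga_1600x1200_85 / 1e6:.1f} MHz")
        print(f"1080p60:                      {hd_1080p60 / 1e6:.1f} MHz")

    If analog VGA has been pushing a ~230 MHz pixel clock for years, the 148.5 MHz that 1080p60 needs is not, by itself, the hard part.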
     
  14. ChrisWiggles

    ChrisWiggles Producer

    Joined:
    Aug 19, 2002
    Messages:
    4,791
    Likes Received:
    1

    No, that precisely shows why it is a moot point. If it weren't a moot point, then we could all say that digital transmission is superior, and there would be no need to test individual systems. Precisely because the implementation is the issue, and not whether the connection is digital, is WHY you need to test individual systems.

    Here's another example: what about digital outputs that clip levels? This is an infinitely bigger problem than conversion to analog, which is done fairly easily even on cheap equipment. What about pixel clipping on digital inputs or outputs, which can happen even on very expensive equipment? All of these totally dwarf DAC/ADC issues, which is exactly why people should test their individual video chains, because we CANNOT generalize that either analog or digital transmission is preferred.
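
    Here's a toy illustration of the level-clipping problem (mine; 16-235 is the standard studio swing for 8-bit video, and the test data is just random values):

        import numpy as np

        rng = np.random.default_rng(2)
        frame = rng.integers(0, 256, size=10_000)   # 8-bit luma including below-black/above-white

        clipped = np.clip(frame, 16, 235)           # an output stage that clips to studio range
        lost = np.count_nonzero(clipped != frame)
        print(f"samples altered by level clipping: {lost} of {frame.size}")

    That information is gone for good no matter how pristine and "all digital" the rest of the chain is.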
     
