Not happy with my DVD-ROM screenshots

Vince Maskeeper

Senior HTF Member
Joined
Jan 18, 1999
Messages
6,500


I am currently running an LE card from the older 7500 series of ATI Radeon cards. There have been 2 major generation releases since (although my card is still recognized as a favorite for HTPC use to this day). I think I gave $50-75 for it last year- but they're nearly impossible to find now.

I'm about to make the jump to the newer ATI stuff (9500/9500 PRO)- I skipped the 8500 stuff due to a driver issue with the grayscale ramp [which I hope is corrected in the 9000 drivers].

If I finally make the jump to the 9000 card (gotta remember which one I was planning to buy- I lost all my bookmarked articles on the cards)-- I'll probably sell off the old trusty LE card for $35 or so.

-Vince
 

Wayne Bundrick

Senior HTF Member
Joined
May 17, 1999
Messages
2,358
It doesn't entirely make sense, Vince. MPEG decoding is almost completely deterministic: given the same MPEG stream, all decoders are expected to produce the same output. If a decoder doesn't produce the expected output, then it is defective by definition. The only reasonable difference between decoders would be in the inverse discrete cosine transform (IDCT), the final step of decoding. There are many ways to implement the IDCT, some better suited to hardware and some to software, and their accuracy can vary (generally the faster methods are less accurate), but I think the MPEG standard has tolerance specifications for the allowed deviation from the "reference" IDCT.
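To make the "faster methods are less accurate" point concrete, here is a minimal sketch (not any real decoder's code) comparing a double-precision reference 8-point IDCT against the same transform with its cosine weights quantized to fixed point, the way a fast implementation might do it. The function names and the sample coefficients are made up for illustration:

```python
import math

def idct_1d(coeffs):
    """Reference 8-point inverse DCT ("textbook" formula, double precision)."""
    N = len(coeffs)
    return [
        sum(
            (math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N))
            * coeffs[k]
            * math.cos(math.pi * (2 * n + 1) * k / (2 * N))
            for k in range(N)
        )
        for n in range(N)
    ]

def idct_1d_fixed(coeffs, bits=8):
    """Same transform with cosine weights rounded to `bits` fractional bits,
    mimicking a fast fixed-point implementation."""
    N = len(coeffs)
    scale = 1 << bits
    out = []
    for n in range(N):
        acc = 0
        for k in range(N):
            c = math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N)
            w = round(c * math.cos(math.pi * (2 * n + 1) * k / (2 * N)) * scale)
            acc += w * coeffs[k]
        out.append(acc / scale)
    return out

coeffs = [100, -30, 12, 0, -5, 0, 0, 2]   # made-up DCT coefficients
ref = idct_1d(coeffs)
fast = idct_1d_fixed(coeffs)
err = max(abs(a - b) for a, b in zip(ref, fast))
# err is small but nonzero: two conforming decoders can legally differ a little
```

The deviation stays tiny, which is why the standard can tolerate it; the question in the thread is whether everything after this step is equally well behaved.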
 

Vince Maskeeper

Senior HTF Member
Joined
Jan 18, 1999
Messages
6,500
Wayne,

This of course is not the case- look at the issue of the chroma bug-- if all MPEG decoders were equal, either all or none of the players would have the problem. I think the same concept is true of audio D/A conversion- in theory it should produce a specific output- but not all converters are created equal.
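The chroma bug is a good example of how two players can diverge on the same stream: DVD chroma is stored at half the vertical resolution (4:2:0), and each player has to upsample it back. A toy sketch (hypothetical functions, not any player's actual code) of two common upsampling choices on an identical tiny chroma plane:

```python
def upsample_chroma_repeat(rows):
    """Vertical 2x chroma upsampling by line repetition (cheap and blocky)."""
    out = []
    for r in rows:
        out.append(list(r))
        out.append(list(r))
    return out

def upsample_chroma_average(rows):
    """Vertical 2x upsampling by averaging adjacent chroma lines (smoother)."""
    out = []
    for i, r in enumerate(rows):
        out.append(list(r))
        nxt = rows[min(i + 1, len(rows) - 1)]
        out.append([(p + q) // 2 for p, q in zip(r, nxt)])
    return out

chroma = [[100, 100], [200, 200]]      # one tiny 4:2:0 chroma plane
a = upsample_chroma_repeat(chroma)
b = upsample_chroma_average(chroma)
# same input stream, different pixels out: the players are visibly unequal
```

Both outputs are "legal", but they do not match, which is exactly why screenshots from different players can look different even before scaling enters the picture.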

In addition- you're now comparing the ability of a general-purpose CPU against a dedicated DSP chip for the function. If you dabble much in pro audio or video editing/production- you find that all these industries favor systems with dedicated processing for specific functions (audio/video effects, for example)-- even when working with top-shelf main CPU power. Dividing jobs off to dedicated hardware usually means more reliable and consistent results.

Add in the factor of resolution scaling-- while technically not strictly "mpeg decoding," it is a large part of how the image is displayed on a PC.

I'm certainly not an expert on this topic- if you're seeking more exacting answers- I'd suggest posting the same query over on AVS- I'm sure you'd get an encyclopedia on the variable quality of MPEG decoding.

But, I would suggest trying for yourself and you'll see the difference.

-Vince
 

Wayne Bundrick

Senior HTF Member
Joined
May 17, 1999
Messages
2,358
Technically the chroma bug is not a bug in the MPEG decoder but rather in what comes after the decoder. Progressive scan output of interlaced sequences isn't a part of the MPEG decoding spec. Likewise the scaling problem Brian found with his old version of PowerDVD isn't an MPEG decoding bug, it's just poor scaling. Similarly there will of course be differences in D/A conversion, but again that's what comes after the decoder.

But capturing a screenshot from DVD-ROM involves no D/A conversion and is entirely digital. If each of us were to post 720x480 native screenshots, they should all be identical. I think hardware may offer more benefit in scaling than in decoding.
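Wayne's pixel-identity claim is easy to test in principle: if two decoders really produce the same native frame, the raw buffers hash identically, and any scaling step immediately breaks that. A hypothetical sketch (the frame contents here are dummy data, not real DVD output):

```python
import hashlib

def frame_digest(frame):
    """SHA-256 of a raw frame buffer; equal digests mean bit-identical pixels."""
    return hashlib.sha256(bytes(frame)).hexdigest()

# hypothetical: two decoders emit the same 720x480 native luma frame
frame_a = bytes([17] * (720 * 480))
frame_b = bytes([17] * (720 * 480))
assert frame_digest(frame_a) == frame_digest(frame_b)   # identical, as expected

# scale (here: crudely drop every other sample) and the bits no longer match
scaled = frame_a[::2]
assert frame_digest(scaled) != frame_digest(frame_a)
```

So comparing native-resolution captures this way would show whether the decoders themselves actually differ, or only the scalers downstream.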
 
