Hello everyone. I've long noticed the artifacting (I learned it's called "mosquito noise") evident in most digital (MPEG) programming, especially in low-bandwidth streams or in scenes that change very rapidly (confetti falling, ocean waves, snow, rapid zooms, and the like). I know it's related to how MPEG saves bandwidth: it compresses each frame by quantizing away fine detail, and for most frames it stores only the changes from the previous frame rather than the entire frame. When nearly every pixel changes, there's little to predict, the bit budget gets overwhelmed, and the artifacts show.

My point, though, is this: doesn't it seem like a lot to tolerate, given the exceptional image quality that today's technology can produce? A true purist would laugh at our efforts to assemble amazing home theater systems around these spectacular imaging devices, only to sit there and tolerate what amounts to a serious degradation of image quality -- something we'd never experience with old-fashioned analog film. Isn't it analogous to using bad gas in a brand-new sports car? You're proud of what the car can do, so who really minds if it sputters once in a while when the bad gas reaches the engine?

Did anyone watch the Olympics in high definition? The images were amazing overall, but certain sports, like swimming, exhibited a great deal of artifacting.

Don't get me wrong -- I'm a great fan of the new technology. I just wonder why we're so tolerant.
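If anyone's curious why rapidly changing scenes are the worst case, here's a toy sketch (not a real codec -- frame sizes, noise ranges, and the random "confetti" frame are all made up for illustration) of the idea that an MPEG-style encoder stores the per-pixel difference between consecutive frames, which is cheap for calm scenes and expensive for busy ones:

```python
import random

W, H = 64, 64  # toy frame size, chosen arbitrarily

def residual_energy(prev, curr):
    """Sum of absolute pixel differences between two frames.

    This residual is roughly what an inter-coded frame has to encode;
    the bigger it is, the more bits (or the more visible artifacts).
    """
    return sum(abs(a - b) for a, b in zip(prev, curr))

random.seed(0)
frame = [random.randint(0, 255) for _ in range(W * H)]

# A nearly static next frame: only a little sensor-style noise.
calm_next = [min(255, max(0, p + random.randint(-2, 2))) for p in frame]

# A "confetti" next frame: pixels change wildly everywhere.
busy_next = [random.randint(0, 255) for _ in range(W * H)]

calm = residual_energy(frame, calm_next)
busy = residual_energy(frame, busy_next)
print(calm, busy)  # the busy residual is vastly larger
```

At a fixed bitrate the encoder can't spend more bits on the busy frame, so it quantizes more coarsely instead, and that's when the mosquito noise comes out.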