Why is Edge Enhancement more obvious on LCD than on CRT?

Joined
Nov 13, 2008
Messages
16
Real Name
John
Just wondering about this. There's no question my old 19" CRT monitor towers over my 24" TN LCD for picture quality. I'm sure a big part of this is the superior black levels, and its better handling of the 'almost blacks'.


But one thing I find a little odd is the difference in how they handle defects, especially edge enhancement (EE). I've heard this mentioned in other forums, so it's not just my imagination.


So far, the only theories I've been able to come up with are:

Irregular pixels? I'm not sure about this, but I'm guessing that the way a CRT tube fires, each pixel isn't going to be exactly the same shape & size every time. We're talking about tiny, tiny, tiny variations, but those constant variations might be serving to hide some of the digital flaws and give CRT a more 'organic' look?

Greater brightness? Contrast ratio on LCD is usually worse than CRT, but maximum brightness is usually a lot higher. Maybe it's as simple as increased brightness producing brighter halos?


Also, I've noticed in comparisons there's a distinct 'flatness' to movies on the TN LCD. Could this be a result of the inferior 'color depth', or is that another effect of CRT's superior blacks?
 

ManW_TheUncool

His Own Fool
Premium
Senior HTF Member
Joined
Aug 18, 2001
Messages
11,947
Location
The BK
Real Name
ManW
Most of the observations you've made wrt color depth, contrast ratio, blacks, etc. are all related to each other and are basically more or less just observations about the same thing from different POVs, etc.

RE: the pixel definition/accuracy thing, yeah, the diff there probably has mostly to do w/ the inherent limitations of (analog) CRTs vs fixed-pixel (digital, chip-based) displays like LCDs, not that LCDs don't have their own related limitations (like fill factor due to use of the chip, which is probably what can sometimes translate into the screen-door effect, particularly for very large LCD projection setups). I suspect this aspect of CRT inaccuracy is very similar to the "euphonic" distortions/inaccuracies that one prefers from quality analog audio tech/gear, especially of the tube (and vinyl) variety. They're the kind of inaccuracies that we humans have grown (or maybe even unconsciously trained) to prefer (and even "love") vs the other set of compromises found in digital (and/or semi-conductor-based or similar) audio tech/gear. Most of us tend to prefer warm and fuzzy (to an extent) over cold, seemingly edgy and clinically accurate (but yet still missing something else and still far from actual perfection).

At the end of the day, I think the pursuit of the absolute sound (or other similar playback ideals for a recorded medium) is kinda moot when I pick up a real violin w/ my meager beginner's skills and hear sound never before heard by me on any recording, on any gear/setup I can possibly afford (both in terms of $ and effort) -- and that's beside the point of whether one can reasonably find true audiophile recordings that also satisfy musically, not just technically.

Anyhoo...

_Man_
 

ManW_TheUncool

His Own Fool
Premium
Senior HTF Member
Joined
Aug 18, 2001
Messages
11,947
Location
The BK
Real Name
ManW
BTW, sorry if I sound like we shouldn't care about these issues. That's not exactly what I meant. I'm just saying we're probably not quite as concerned about accuracy, etc. to the Nth degree as we might like to think, but ultimately, we're just looking for what makes us happy/satisfied (at least for the foreseeable future) -- and that more often than not involves personal preferences on which set of compromises suit us best (whether decisions/preferences are made consciously or subconsciously).

Basically, at some point, it becomes kinda moot (and just academic) when it comes to stuff like why EE looks worse on LCDs vs CRTs (or whether the 2.0:1 Criterion-MARed version of The Last Emperor is acceptable or not under the circumstance or something else like that).
 
Joined
Nov 13, 2008
Messages
16
Real Name
John
Bought the Last Emperor BD, too (perhaps we should start a support group). One of the main things that swayed me is Bertolucci's assertion that the film was always intended to be cropped to 2.2 (he felt the best way to see it was on 70mm prints in the big theatres), so that makes the further cropping to 2.0 a little easier to stomach.



About analyzing to the nth degree, in this case I've already seen the difference without analyzing; just trying to figure out why it's there. In terms of warm/fuzzy vs cold/clinical, you could argue that also has to do with film as a medium. It's an organic medium and invisible organic flaws are preferable to harsh digital flaws. I wonder which set of flaws is better for something like CGI animation, though.


I think I see your point about color-depth being related to contrast-ratio, etc. It's kind of like how 24bit sound can span a much greater range of decibels than 16bit?

Or, in other words, if the blacks are really black and the whites are really white, then there's room for a lot more shades of grey in between?
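That audio analogy actually checks out numerically: each extra bit of depth buys roughly 6 dB of theoretical dynamic range, since the range is 20·log10(2^n) dB for an n-bit signal. A quick back-of-the-envelope sketch in Python (the function name here is just for illustration):

```python
import math

def dynamic_range_db(bits: int) -> float:
    """Theoretical dynamic range of an n-bit quantizer: 20*log10(2**n) dB."""
    return 20 * math.log10(2 ** bits)

print(f"16-bit audio: ~{dynamic_range_db(16):.1f} dB")  # ~96.3 dB
print(f"24-bit audio: ~{dynamic_range_db(24):.1f} dB")  # ~144.5 dB
```

So 24-bit gives you roughly 48 dB more span between the quietest and loudest representable levels than 16-bit -- the same idea as having more distinguishable shades between black and white.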
 

Joseph DeMartino

Senior HTF Member
Joined
Jun 30, 1997
Messages
8,311
Location
Florida
Real Name
Joseph DeMartino
[QUOTE]There's no question my old 19" CRT monitor towers over my 24" TN LCD for picture quality.[/QUOTE]
Let's not forget that some of the difference you're seeing, at least on widescreen movies, is simply the difference between a tiny letterboxed image on a 19" set and a full- or nearly full-screen image on a 24" LCD. Smaller screens and tiny phosphors do a lovely job of concealing flaws in source material. Back in the days of VHS tape I had tons of off-the-air recordings of hard-to-find movies. They all looked very nice on my 19" TV/VCR combo. When I got my first big screen TV, a 46" rear projection set, I found most of them literally unwatchable. A larger image magnifies flaws, as well as details, and that alone is probably responsible for at least a percentage of the problems you're now seeing.

Regards,

Joe
 

ManW_TheUncool

His Own Fool
Premium
Senior HTF Member
Joined
Aug 18, 2001
Messages
11,947
Location
The BK
Real Name
ManW
[QUOTE=Darth Lavender]

Bought the Last Emperor bd, too (perhaps we should start a support group.) One of the main things that swayed me is Bertolucci's assertion that the film was always intended to be cropped to 2.2 (he felt the best way to see it was on 70mm prints in the big theatres,) so that makes the further cropping to 2.0 a little easier to stomach.

[/QUOTE]
Well, it's quite evident that the picture has suffered at least in some shots due to the 2.0:1 crop even w/out doing a side-by-side comparison. Guess we'll just have to live w/ Storaro's imposed set of compromises in this instance unless one is willing to settle for the quality compromises (vs the AR/composition compromises) of one of the available DVD versions out there (or simply go w/out) -- I still haven't seen the R2 PAL version, except for a few screen caps that show odd color balance.
[QUOTE=Darth Lavender]
About analyzing to the nth degree, in this case I've already seen the difference without analyzing; just trying to figure out why it's there. In terms of warm/fuzzy vs cold/clinical, you could argue that also has to do with film as a medium. It's an organic medium and invisible organic flaws are preferable to harsh digital flaws. I wonder which set of flaws is better for something like CGI animation, though.

[/QUOTE]

Yep. As I tried to allude to, a lot of our preferences regarding flaws/compromises just have to do w/ what we're used to seeing/hearing/experiencing. Consider the CGI example: it's soooo much easier to spot (and be distracted by) less-than-perfect (or rather, ironically, coldly, unconvincingly perfect) live-action CGI of things we're used to seeing/experiencing vs CGI effects (or animation) of things we're not so familiar w/ in real life. When CGI is not expected to look, sound and feel exactly like the real thing, it becomes free of the burden of convincing us to the Nth degree -- and depending on what it is, the CGI creators can even freely throw in all sorts of completely unreal aspects and have those actually be convincing parts of the CGI creation.

It's like if I listen to music written/arranged for and played on an electric guitar, I naturally accept it more than if someone simply tries to play a piece originally written for acoustic and played on electric w/out giving due consideration to the inherent diffs between such guitars.
[QUOTE=Darth Lavender]
I think I see your point about color-depth being related to contrast-ratio, etc. It's kind of like how 24-bit sound can span a much greater range of decibels than 16-bit?

Or, in other words, if the blacks are really black and the whites are really white, then there's room for a lot more shades of grey in between?

[/QUOTE]
Yep. BTW, people often don't think about that in the digital photography world and just assume they want higher dynamic range capability for their camera, but there are tradeoffs to consider there (at least at this point in the tech). If you want higher DR, you're gonna need more bit-depth (and thus larger file sizes, higher storage and processing demands, etc.) to accommodate the higher contrast ratio, unless you sacrifice granularity w/in that contrast/gamma curve in one way or another, e.g. fewer fine-grained steps in the dark-gray and off-white regions for starters. Those are the kinds of tricks lossy audio compression typically uses to reduce the needed bandwidth, i.e. discard the least audible content (like the highest frequencies) first.
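That granularity tradeoff can be sketched with a toy quantizer: at a fixed bit depth, stretching the same number of levels across a wider range makes each individual step coarser (the numbers here are purely illustrative, not from any real sensor):

```python
def step_size(range_max: float, bits: int) -> float:
    """Size of one quantization step when 2**bits levels span [0, range_max]."""
    return range_max / (2 ** bits - 1)

# Same 8-bit depth; a 10x wider range means 10x coarser steps between levels.
print(step_size(100.0, 8))   # ~0.392 per step
print(step_size(1000.0, 8))  # ~3.922 per step
```

To widen the range without coarsening the steps, you need more bits -- hence the file-size and processing costs.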

Likewise, while shooting RAW affords extra degrees of freedom in postprocessing photos, there are real limitations there too, since the available bit-depth is limited. Further, people often overlook that doing color correction in the digital realm (w/ RAW files) is not w/out compromises, again due to the actual bit-depth available in the source files. If, in the process of color correcting a tungsten-lit shot, you need to throw out lots of the red channel data and/or push what little there is in the blue and green channels -- because tungsten lighting has very little blue and green -- the final result does not just magically appear out of thin air because you're using RAW files. The results will be much weaker than if you had made the shot w/ the color correction done in the shooting chain, before the light of the image reaches the sensor, e.g. longer exposures to go w/ high-quality color filters (to do the color balancing), lighting that actually needs less/no color correction, etc. And of course, factoring the color/exposure corrections into the shoot instead of into post requires certain compromises too, i.e. lots of extra work, gear, etc., and longer exposures may not be feasible for certain kinds of shots.
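The tungsten point can be illustrated with a toy white-balance correction (the pixel values and grey-world gains below are hypothetical, not from any real RAW pipeline): balancing a warm-lit shot in post means multiplying the starved blue channel by a large gain, and any noise or quantization error in that channel gets multiplied right along with it.

```python
# Hypothetical tungsten-lit neutral grey patch: red is strong, blue is starved.
raw_pixel = {"r": 200.0, "g": 120.0, "b": 40.0}

# Simple grey-world balance: scale every channel to match green.
gains = {c: raw_pixel["g"] / raw_pixel[c] for c in raw_pixel}
balanced = {c: v * gains[c] for c, v in raw_pixel.items()}

print(balanced)    # every channel pulled to 120.0 -- looks neutral now
print(gains["b"])  # 3.0: a +/-1 count error in blue becomes +/-3 counts
```

The corrected pixel looks neutral, but the blue channel's few real counts (and their errors) have just been stretched 3x -- no new information appeared out of thin air.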

And roughly the same things can be said if you shoot film, instead of digital, too of course.

At the end of the day, it's all physics, and there just is no free lunch there.

