I have a 1080p display, which means that no matter what I send to it, it's only going to show me 1080p. If a different resolution or an interlaced picture is sent to it, then it has to be scaled or deinterlaced SOMEWHERE along the chain, because the display can ONLY output 1080p. Fine.
I usually have my Sony player directly output "Original Resolution" when I'm watching Blu-rays (not DVDs, as the player is a much better scaler, but I digress), so THAT'S the way I watched OKLAHOMA! and it's gorgeous. Since 1080i is being sent from the player, that means the TV is doing the deinterlacing (just like it does for digital cable), and it's giving me that wonderful "looking out the window" feel with a picture that has amazing depth, which I entirely chalked up to the higher frame rate.
Now I don't know WHAT made me decide to play around (so please don't ask!), but the other night, for fun, I decided to change the output on the player to 1080p to see what it looked like if the player itself did the deinterlacing before the signal ever hit the display. Well, imagine my surprise: the picture is now ever so subtly sharper, the colors pop more, and there is more shadow detail. In other words, the picture looks even BETTER than it did, but here's the thing... that delicious three-dimensional look is greatly, greatly diminished. It still looks phenomenal, but it loses the sense of depth it had when outputting 1080i.
Which one is right?
I LIKED the deep look that made me think I could reach out and grab Gordon MacRae's butt anytime I wanted to, but now I'm wondering if that's just a digital side effect of the interlacing. Is it a subtle version of the dreaded "soap opera" effect? There is NO frame interpolation, so that's not the issue.
Many people have remarked on the three-dimensional feel of the Blu-ray, but is it right? At what output is everyone watching it? Does anyone know which look is more accurate to 1955? I've never seen 30 fps projected on film, so I really don't know. The increased sharpness, color, and shadow detail make me think that's the right way to go, but gee whiz, the other look is mighty, mighty appealing in its own right.
The only thing that changes when you switch from 1080i to 1080p output on the Blu-ray player is where the deinterlacing takes place - in the player or in the TV. It's possible that one device may have a better processing chip than the other. However, the 30 fps rate on Oklahoma! has a very straightforward 2:2 cadence that should be no effort at all for any deinterlacer to handle: each interlaced field pair comes from a single progressive film frame, so the deinterlacer only has to weave the two fields back together. If one chip detected the cadence incorrectly and screwed up the deinterlacing, it would likely be immediately noticeable as jaggies and aliasing artifacts all over the place.
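To make the cadence point concrete, here's a rough sketch in Python of what a 2:2 weave boils down to. This is purely my own illustration under the assumption of 540-line top/bottom fields at 1920 pixels wide; it's not code from any actual player or TV, and the names are made up:

import numpy as np

def weave_fields(top_field, bottom_field):
    # Recombine a top/bottom field pair (540 lines each) into one 1080-line frame.
    # With a 2:2 cadence, both fields came from the same progressive film frame,
    # so simple interleaving reconstructs it exactly - no interpolation needed.
    height = top_field.shape[0] + bottom_field.shape[0]
    frame = np.empty((height,) + top_field.shape[1:], dtype=top_field.dtype)
    frame[0::2] = top_field      # even lines come from the top field
    frame[1::2] = bottom_field   # odd lines come from the bottom field
    return frame

# Stand-in 540 x 1920 fields representing one 1080i field pair (grayscale for brevity).
top = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)
bottom = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)
print(weave_fields(top, bottom).shape)  # (1080, 1920) - the original progressive frame

The point is that there's nothing to guess at: whichever chip does the weave, the resulting pixels should be the same, which is why a real deinterlacing difference would show up as obvious combing or jaggies rather than a subtle change in depth.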
Deinterlacing will have no effect at all on color or contrast.
I suspect that what you're experiencing here is pure Placebo Effect. You expected to see a difference when you changed the setting, so your brain perceived one.