The only thing that changes when you switch from 1080i to 1080p output on the Blu-ray player is where the deinterlacing takes place - in the player or in the TV. It's possible that one device has a better processing chip than the other. However, the 30 fps frame rate on Oklahoma gives it a very straightforward 2:2 cadence that should be no effort at all for any deinterlacer to handle. If one chip detected the cadence incorrectly and botched the deinterlacing, it would likely be immediately noticeable as jaggies and aliasing artifacts all over the place. Deinterlacing has no effect at all on color or contrast. I suspect that what you're experiencing here is pure placebo effect. You expected to see a difference when you changed the setting, so your brain perceived one.
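To illustrate why a 2:2 cadence is so easy: every pair of consecutive fields comes from the same original 30 fps frame, so the deinterlacer just has to interleave their scan lines (a "weave") to recover the frame losslessly. Here's a minimal sketch in Python with NumPy - the function name and shapes are mine, not anything from a real player's firmware:

```python
import numpy as np

def weave_fields(top_field: np.ndarray, bottom_field: np.ndarray) -> np.ndarray:
    """Weave two interlaced fields into one progressive frame.

    With a 2:2 cadence (30 fps source carried in 1080i60), each
    consecutive field pair originates from the same film frame, so
    interleaving the scan lines reconstructs it exactly - no motion
    estimation or interpolation required.
    """
    h, w = top_field.shape
    frame = np.empty((h * 2, w), dtype=top_field.dtype)
    frame[0::2] = top_field     # even scan lines from the top field
    frame[1::2] = bottom_field  # odd scan lines from the bottom field
    return frame

# Two hypothetical 540-line fields from a 1080i signal -> one 1080p frame
top = np.zeros((540, 1920), dtype=np.uint8)
bottom = np.zeros((540, 1920), dtype=np.uint8)
print(weave_fields(top, bottom).shape)  # (1080, 1920)
```

Note what this operation touches: only the spatial arrangement of lines. The pixel values pass through untouched, which is why a deinterlacing difference can't show up as a color or contrast shift - only as edge artifacts when the chip pairs the wrong fields together.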