I watched the Blu-ray and I have some questions for you, Mr. Harris (and everybody else who has insight).

The transfer seems to consist of basically two kinds of shots: shots with great clarity and sharpness that seem to have full 1080p detail, and shots that have more or less DVD resolution and often look pasty as well. The latter are almost all SFX shots or standard opticals (mostly dissolves). Now, it's well known that opticals degrade image quality through generational loss, but the extent to which it happens here (from 1080p down to roughly 480p) looks excessive to me. Why do the opticals look so much worse in this transfer? Is this normal? And is it normal that not only the actual dissolve suffers but the rest of the shot as well, so that the whole shot looks very bad compared to the surrounding shots? Why not replace the rest of the shot with the original footage? (Tonal) continuity?

What are the reasons such dissolves are not redone digitally for the HD master, keeping the transfer at a consistent image quality level? Cost? Original elements not available? I found the big jumps in image quality rather distracting every time a dissolve was done. One could actually see in advance when another dissolve was coming: great quality, cut, bad quality, dissolve, bad quality, cut, great quality... Did original prints of the film show the same strong quality variations?

I also noticed some not-so-faint halos on shots that had no opticals. Sharpening, or part of the photography of the time?
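As a rough illustration of why each optical generation hurts so much: resolution losses compound, because the system's MTF (modulation transfer function) is the product of the MTFs of every copying stage. Below is a toy sketch (my own simplified model, not anything from the actual transfer) that treats each printing generation as the same Gaussian MTF and computes where the combined response falls to 50%; the numbers (`f0`, lp/mm) are made-up assumptions for illustration only.

```python
import math

# Toy model: one printing pass has MTF(f) = exp(-(f/f0)^2).
# After n identical generations the cascade is exp(-n*(f/f0)^2),
# so the 50%-response frequency shrinks by a factor of 1/sqrt(n).

def mtf50(f0: float, generations: int) -> float:
    """Frequency (same units as f0) where the cascaded MTF drops to 0.5."""
    return f0 * math.sqrt(math.log(2) / generations)

f0 = 40.0  # hypothetical single-pass figure in lp/mm, chosen for illustration
for n in (1, 2, 3):
    print(f"{n} generation(s): MTF50 ~ {mtf50(f0, n):.1f} lp/mm")
```

Even in this idealized model, two extra dupe generations (camera negative to interpositive to internegative, as in a classic optical) cut the resolvable detail substantially, which is consistent with opticals standing out so sharply against camera-negative shots in an HD scan.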