First, there are two types of DVI connections: DVI-I (carries both analog and digital signals) and DVI-D (carries digital signals only). The projector I selected for purchase has DVI-I. Why DVI at all? DVI delivers a straight digital signal to your display. Because there is no digital-to-analog (D/A) conversion step, the chance of artifacts is reduced.
HDCP (High-bandwidth Digital Content Protection) is a sub-topic of DVI. It is an emerging encryption scheme for the digital transmission of video content. Hollywood is concerned about copying of high-definition content; some contend it is really about maintaining the 'quality' of the transmission. You can be the judge of that (off my soapbox). As for how much it matters that a DVI connection supports HDCP, I'm still weighing that one. The projector I have selected to purchase has DVI but does not support HDCP. As far as I can tell, HDCP's full implementation is still some time away, and from what I have researched, there should be support for legacy (non-HDCP) DVI devices for backward compatibility.
Hope this has not confused or offended anyone. Your input is welcome/invited.
Based purely on the fact that DVI-I supports both analog and digital, one would think DVI-I is the better (more flexible) choice. My opinion is that most content you would use DVI for is already in a digital format, so DVI-D would work anyway. The point (maybe I digressed too much) is that DVI should be a consideration in the purchase of your display device. The prevalence of DVI is becoming a large factor for current and future capabilities, and replacing a plasma TV isn't cheap. I just wanted to make sure Rick had information that might prevent a future "Oh, I wish I had..." moment.