PQ-wise there's absolutely no difference between 1080i and 1080p, of course. Both have the full HD resolution of 1080 horizontal lines.
Whether or not you see a difference with a 720-line picture depends on your viewing distance (and the size of the set). On a 39" screen or smaller, watched from 12 - 15 ft, you won't see a difference between a 720p image and a 1080p image.
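You can sanity-check that with the usual visual-acuity rule of thumb: roughly 1 arcminute per resolvable line for 20/20 vision. That figure, and the 16:9 screen geometry, are the only assumptions in this rough back-of-the-envelope sketch:

```python
import math

def line_subtense_arcmin(diag_in, lines, dist_ft):
    """Angle (in arcminutes) that one scan line subtends at the eye,
    for a 16:9 screen of the given diagonal, at the given distance."""
    height_in = diag_in * 9 / math.sqrt(16**2 + 9**2)  # 16:9 geometry
    line_in = height_in / lines                        # height of one line
    dist_in = dist_ft * 12
    return math.degrees(math.atan2(line_in, dist_in)) * 60

# 39" screen at 12 ft: how big does one line look to the eye?
for lines in (720, 1080):
    print(lines, "lines:", round(line_subtense_arcmin(39, lines, 12), 2), "arcmin")
```

Both results (about 0.63 arcmin for 720 lines, 0.42 for 1080) land well under the ~1 arcmin acuity limit, so at that size and distance neither format resolves into individually visible lines.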
No, you will certainly see the difference! (But even that huge TV will allow you to sit closer, say 8', if you need or want to, without viewing problems, as long as you're watching genuine 1080 material.)
My rule of thumb is that on anything under 42" you likely won't be able to tell the difference between 720p and 1080i at a reasonable viewing distance... unless you read this forum.
Most of the small differences in picture quality aren't discernible to the average (or slightly above average) viewer, but once you read thread discussions pointing out the small blemishes, they start to become more and more apparent.
That said, I'm still not convinced I can see a difference between 1080i and 1080p at screen sizes under 56".
There IS no difference between 1080i and 1080p. How could there be? They're exactly the same image.
(A 1080p transfer fills the image buffer line after line; a 1080i transfer fills the odd lines first, then the even lines. The resulting image, read by the display, is the same, of course.)
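To make that concrete, here's a minimal sketch (Python with numpy, toy single-channel data) of splitting a frame into its two fields and "weaving" them back together:

```python
import numpy as np

# A toy "1080p" frame: 1080 lines of 1920 pixels (single channel for brevity).
frame = np.random.randint(0, 256, size=(1080, 1920), dtype=np.uint8)

# 1080i carries the same frame as two fields: the odd lines, then the even ones.
# (0-based indexing: one field holds lines 0, 2, 4...; the other lines 1, 3, 5...)
top_field = frame[0::2]      # 540 lines
bottom_field = frame[1::2]   # 540 lines

# The display "weaves" the two fields back into one progressive frame.
rebuilt = np.empty_like(frame)
rebuilt[0::2] = top_field
rebuilt[1::2] = bottom_field

# Bit-for-bit identical: nothing was lost by sending the frame interlaced.
assert np.array_equal(frame, rebuilt)
print("weave of the two fields reproduces the original frame exactly")
```

The caveat, which the posts below get into, is that this only holds when both fields come from the same instant, i.e. film-sourced material. With true interlaced video the two fields are captured 1/60 s apart, and a naive weave shows combing on motion.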
In high-motion scenes I can see a slight difference. Again, the overall image has to be pretty large. A buddy and I did some blind testing with a couple of projectors a while back, trying to determine which one he should buy. Each time I could pick out a slight difference.
The 1080i/p difference depends on your display's video processing capabilities. Some can detect the cadence (and properly deinterlace) and some can't. The cadence coming out of 1080p/24 film material that has been interlaced to 1080i/60 should be a piece of cake for most displays to handle.
That's all well and good, but when feeding 1080i into a 1080p display one has to hope that the display can correctly deinterlace and apply inverse telecine, which a great many cannot.
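For anyone wondering what that cadence actually looks like, here's a toy sketch of 2:3 pulldown and its reversal. It works on frame labels instead of pixel data, so take it purely as an illustration of the field pattern; a real processor has to detect the repeated fields by comparing actual field content, and the function names here are made up for the example:

```python
# 2:3 pulldown: four film frames (A B C D) become ten video fields,
# i.e. 24 frames/s -> 60 fields/s. A capable deinterlacer spots the
# repeated fields (the "cadence") and rebuilds the original frames.

def telecine(frames):
    """Spread each group of 4 film frames across 10 fields (2:3:2:3)."""
    fields = []
    for i, f in enumerate(frames):
        copies = 2 if i % 2 == 0 else 3      # alternate 2 and 3 fields per frame
        for _ in range(copies):
            parity = len(fields) % 2         # top/bottom fields alternate
            fields.append((f, "top" if parity == 0 else "bottom"))
    return fields

def inverse_telecine(fields):
    """Recover the film frames by collapsing runs of the same source frame."""
    frames, last = [], object()
    for src, _parity in fields:
        if src != last:
            frames.append(src)
            last = src
    return frames

film = ["A", "B", "C", "D"]
fields = telecine(film)
print(fields)                             # A A / B B B / C C / D D D cadence
assert inverse_telecine(fields) == film   # the original frames come back intact
```

When the processor locks onto that 2:3 rhythm it can pair the right fields and hand the panel the original 24 progressive frames; when it doesn't, you get exactly the deinterlacing artifacts being complained about here.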
True, but strangely enough it was the other way around in many TV sets until last year.
Apparently they had deinterlacing logic installed, and to accept 1080p the manufacturer had added an interlacing circuit to these machines. Sometimes these worked poorly, sometimes they worked fine, but either way it added another processing stage.
I think the newest TV sets and projectors simply handle both 1080i and 1080p correctly. If they do, the "theoretical" question of which of the two is best becomes moot.
I make a point of this because so often one encounters the notion that 1080i is a lower grade of 1080p (somehow closer to 720p, or worse), which isn't true at all.