To be exact, it is hard to say, because you lose a smidgen of horizontal resolution in the receiver, a smidgen in the cable coming in, and a smidgen in the cable going out to the TV, and three smidgens may or may not add up to visible degradation. Experts have formulas that relate the bandwidths of several components or cables in a row to the bandwidth of the whole chain, but I don't know those formulas, and for all I know they are watered-down averages. I really think the smidgen lost in a 100 MHz bandwidth receiver circuit is too small to count, because 1080i needs only about 37 MHz of bandwidth, and in a 100 MHz circuit significant losses (the frequency response becoming non-flat) have not yet started down in the 37 MHz range.
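As a rough sketch of what such a formula looks like, assuming each component in the chain behaves like a simple first-order low-pass filter (that model is my assumption, not a measurement of real gear):

```python
import math

def stage_gain(f, fc):
    """Magnitude response of a first-order low-pass with its -3 dB point at fc."""
    return 1.0 / math.sqrt(1.0 + (f / fc) ** 2)

def cascade_loss_db(f, fcs):
    """Total loss in dB at frequency f through a chain of stages,
    one -3 dB point per stage; losses in dB simply add."""
    gain = 1.0
    for fc in fcs:
        gain *= stage_gain(f, fc)
    return 20.0 * math.log10(gain)

# Three "smidgens": receiver plus two cables, each with 100 MHz bandwidth.
loss = cascade_loss_db(37.0, [100.0, 100.0, 100.0])
print(f"Loss at 37 MHz through three 100 MHz stages: {loss:.2f} dB")

# For n identical first-order stages with -3 dB point fc, the overall
# -3 dB bandwidth of the chain is fc * sqrt(2**(1/n) - 1).
n = 3
overall_bw = 100.0 * math.sqrt(2 ** (1 / n) - 1)
print(f"Overall -3 dB bandwidth of the chain: {overall_bw:.1f} MHz")
```

Under that assumed model, three 100 MHz stages together lose well under 2 dB at 37 MHz, and the chain as a whole still has roughly 51 MHz of bandwidth, which supports the point that the receiver's smidgen is too small to count.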
The definition of bandwidth for this purpose is the frequency range over which the individual frequency treated worst comes out at least half as strong as (within 3 dB of) the frequency treated best. If two components or cables in the signal path both treat the same frequency the worst, that frequency comes out only a quarter as strong as the frequency treated best. Go for a higher bandwidth and chances are the flatness of the frequency response will improve too. If the bandwidth of the entire signal path is 37 MHz, I doubt you will notice degradation of 1080i.
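The quarter-as-strong figure is just dB arithmetic: losses in dB add, and -3 dB is half. A quick check, assuming "strong" here means signal power:

```python
def db_to_power_ratio(db):
    """Convert a dB figure to the equivalent power ratio."""
    return 10 ** (db / 10)

# One component at its -3 dB point: about half as strong.
print(db_to_power_ratio(-3))   # ~0.50

# Two components both -3 dB at the same frequency: -6 dB total,
# about a quarter as strong.
print(db_to_power_ratio(-6))   # ~0.25
```

So a frequency sitting at the -3 dB edge of two components in a row comes out at -6 dB overall, a quarter of the best-treated frequency.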