I am curious as to when the "wild" west came to an end. What I mean is: when did the last of the old-west-style towns fade away, when did people stop wearing sidearms, and when were the last of the Indian attacks? I ask this because of the new show Peacemakers coming out. Its tagline is that the old west is coming to an end... It is supposed to be CSI meets the old west. Thanks!