CharlesD
Screenwriter
- Joined: Mar 30, 2000
- Messages: 1,493
so 1990 was part of the '80s and 2000 part of the '90s? I think not.
so 1990 was part of the '80s and 2000 part of the '90s?

Yes, 1990 was part of the same decade that included most of the 80's (Jan 1, 1981 - Dec 31, 1990).
Computers and computer programmers.

Tell that to a COBOL programmer. Then again, COBOL is so old they index arrays by Roman numerals. The correct answer to "who counts starting with zero?" is 'C' programmers (try debugging array logic written in 'C' by former COBOL or Fortran programmers, UGGH!).
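In case the joke needs unpacking, here's a minimal C sketch of the off-by-one trap. Everything in it (the array name, the year values) is purely illustrative, not anyone's real code:

```c
#include <stdio.h>

int main(void) {
    int decade[10];             /* valid indices run 0 through 9 */

    /* C counts from zero: the first element is decade[0]. */
    for (int i = 0; i < 10; i++)
        decade[i] = 1981 + i;   /* fills 1981 .. 1990 */

    /* A one-based habit from Fortran or COBOL reaches for the
       "tenth" element as decade[10], which is out of bounds in C:
       the classic off-by-one bug being joked about above. */
    printf("first: %d, last: %d\n", decade[0], decade[9]);
    return 0;
}
```

Fortran and COBOL both number table entries from 1 by default, so grabbing `decade[10]` as the "tenth" slot looks perfectly natural to a convert, and C will happily compile it anyway.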
I understand the argument for why the millennium started on Jan 1 2001, but to say that 1990 is part of the 80s is incorrect IMO. "The 90s" is the decade of years that have a nine as their third digit. 1990.

No, I don't think you are fully grasping the concept. Your confusion lies in the fact that "the 80's" is not the technical term for the decade that includes most of the years 198x. "The 80's" is really a casual term referring to 1980 - 1989, not the technical term for a 'proper' decade.
Most of the 80's fall in the 199th decade (Jan 1, 1981 - Dec 31, 1990). The 200th decade (Jan 1, 1991 - Dec 31, 2000) was the last decade of the 20th century.
So on Jan 1, 2001, we began the 21st century AND the 201st decade.
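If you'd rather let a computer do the counting, the rule boils down to ceiling division: decade N covers years 10*(N-1)+1 through 10*N, and likewise for centuries. A quick C sketch (the function name `ordinal` is hypothetical, picked just for this example):

```c
#include <stdio.h>

/* Ordinal number of the span (decade, century, ...) containing a
   year, counting from AD 1 with no year zero. This is just
   ceil(year / span) done in integer arithmetic. */
int ordinal(int year, int span) {
    return (year + span - 1) / span;
}

int main(void) {
    printf("1990 is in decade %d\n", ordinal(1990, 10));    /* 199 */
    printf("1991 is in decade %d\n", ordinal(1991, 10));    /* 200 */
    printf("2000 is in century %d\n", ordinal(2000, 100));  /* 20 */
    printf("2001 is in century %d\n", ordinal(2001, 100));  /* 21 */
    return 0;
}
```

The round years (1990, 2000) land in the lower span, which is exactly the point being made above.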
So, Bryan, now could you explain millennia?
I thought about throwing millennia in there too, but didn't. But for anyone counting, the 1st millennium goes from Century 1 to 10 (Jan 1, 1 - Dec 31, 1000) and the 2nd millennium goes from Century 11 to 20 (Jan 1, 1001 - Dec 31, 2000). We started the 3rd Millennium on Jan 1, 2001 and it will end Dec 31, 3000 (Centuries 21 to 30).
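Same ceiling trick with a span of 1000, for anyone who wants to check the boundaries. Again just an illustrative sketch:

```c
#include <stdio.h>

int main(void) {
    /* Millennium M runs from year 1000*(M-1)+1 through 1000*M,
       so a year's millennium is ceil(year / 1000). */
    for (int year = 1999; year <= 2001; year++)
        printf("%d is in millennium %d\n", year, (year + 999) / 1000);
    /* prints: 1999 -> 2, 2000 -> 2, 2001 -> 3 */
    return 0;
}
```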
...and it will end Dec 31, 3000

I plan to throw a big party. You're all invited.
Joe
You know, everyone was so worried about the Y2K bug, but that's nothing compared to the Y10K bug.... how are we going to handle a 5-digit year...

Let's just say that all the COBOL programmers will be able to name their own price...
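He's only half joking. Here's a minimal C sketch of how a date buffer sized for four-digit years misbehaves on Jan 1, 10000; the fixed buffer layout is an assumption for illustration, not any particular system's code:

```c
#include <stdio.h>

int main(void) {
    char date[11];  /* room for "YYYY-MM-DD" plus NUL: fine through 9999 */

    /* snprintf truncates rather than overflowing the buffer, but the
       result is still wrong: year 10000 needs five digits, not four. */
    int needed = snprintf(date, sizeof date, "%04d-%02d-%02d", 10000, 1, 1);
    printf("needed %d chars, stored \"%s\"\n", needed, date);
    /* needed 11 chars, stored "10000-01-0": the Y10K bug in miniature */
    return 0;
}
```

Code that used plain sprintf instead would scribble past the end of the buffer, which is where the real Y10K fun starts.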
"the 80's" is not the technical term for the decade that includes most of the years 198x.Do you know silly that sounds? Who actually uses this mythical technical term, for the "decade that contained most of the 80s"? (I'm guessing it's only a by-product of insisting that the century started in 2001, a further indicatation that it is no longer a good idea.) Say it out loud: "nineteen eighty was not in the eighties." But you don't say: "See that eighty-year-old man over there? He's in his seventies." Sure, there's a difference between years and age, but you're only making it more complicated.
Look, I get your argument. It's logical, it's simple. But it causes ugly consequences. It's also fundamentally flawed. And it's really no one's fault. When they back-dated AD 1 in the 6th century, they weren't using zero as a regular number. But that's all you've got: they started with 1, so you're sticking with it. They also started with the flawed Julian calendar, but they managed to fix that. We can fix this century/millennium thing, so that everything makes sense. (You can still keep the historical oddity that there was no year zero so you can impress people at parties.) All you have to do is free your mind....
//Ken
Look, I get your argument.

It's not my argument. I'm just presenting the FACTS. I don't care if the current method is the best method or not.
You can wax poetic about which method is better, but the fact remains that 1980 is in a different decade than 1981-1990. Hopefully you understand that.
Me: And they all thought I was an idiot! Do you hear me? an IDIOT!
Therapist: And how did that make you feel?
Since we use only four digits to represent the year, every computer in existence will crash in the year 10,000.

But by then the Butlerian Jihad will have wiped out all "thinking machines", which will have been replaced by human computers, who are smarter than that and won't get tripped up by a little inconsequential thing like a 5-digit year, e.g. the year 10,191...