Digital Equipment answers a user's complaint that the year 2000 should not be a leap year. (york.ac.uk)
72 points by bdfh42 2 days ago | 18 comments




15 points by yan 2 days ago | link

That sort of response well before Wikipedia is quite impressive!


13 points by SlowOnTheUptake 2 days ago | link

FTA: "Although one can never be sure of what will happen at some future time, there is strong historical precedent for presuming that the present Gregorian calendar will still be in effect by the year 2000. Since we also hope that VMS will still be around by then, we have chosen to adhere to these precedents."

The writer proved remarkably prescient but, alas, Digital Equipment Corporation was not there to greet the new millennium.


1 point by ken 2 days ago | link

"alas, Digital Equipment Corporation was not there to greet the new millennium."

Or even to (February) 2000.


1 point by thorax 2 days ago | link

Poor DEC! VMS made the trek, though.


2 points by rbanffy 2 days ago | link

I remember the day the Compaq takeover was announced.

Oh boy. I knew that day the Alpha was doomed. Pity... Such a promising architecture.

Well... Back to my x86... I have work to do.


1 point by SwellJoe 2 days ago | link

By the end, and even in its last few years, the Alpha was no longer as fast as competing processors for the vast majority of workloads. The x86 architecture had become RISC behind the scenes, and Intel and AMD had figured out how to accelerate CISC to the point where it may even have been a net win for performance, or at least not a significant loss: fewer instructions had to traverse the slow RAM-to-CPU bus, and they could then be "decompressed" into fast-executing but more verbose operations entirely within the CPU pipeline, where the internal bus was dramatically faster. Raw clock speed had also caught up to the Alpha, and then some: in 2003, when the last Alpha shipped at 1.3GHz, the Athlon 64 was released with clocks up to 2GHz. Sun and IBM, among others, had made more progress on the parallelization and shared-memory fronts as well, making the Alpha less useful for scientific and other large-scale computing work.

Finally, the Alpha technology was sold off to Intel. The advantages of the architecture (and some of the engineers involved) have been assimilated into the CPU borg. It may be ugly, due to such a long and sordid history, but the x86 architecture is now wicked fast. The Alpha also had elephantine power and cooling requirements. My last company once attended an event where we set up our boxes side by side with a competitor's Alpha-based systems, and they had discreetly placed a box fan behind their biggest unit because, without it, the ambient temperature of the room was too high for the Alpha box to operate reliably. Their box was much faster than ours, though. I asked if they included a box fan, or if that cost extra.

But don't let me stand in the way of nostalgia. I, too, remember staring in awe at a DEC box running at 400MHz when the Intel architectures were still stumbling along at 166MHz or so (and with a dramatically slower bus, and only dual-CPU capability in the Pentium Pro).


1 point by gcv 2 days ago | link

Are you sure? I don't have the numbers handy, but I dimly remember that, in around 1998-1999, the Alpha 21264 ran circles around Intel and AMD CPUs of the same era, in integer and especially floating-point performance. I'm not talking about clock speed, but SPEC benchmark results.


1 point by SwellJoe 2 days ago | link

Yes, I'm sure. The only date I specifically mentioned was 2003, which is when the last Alpha was released and the 64 bit x86 architecture became available. In 1998-1999 the Alpha was still probably in the lead on all counts except possibly price/performance. Things move fast in CPUs, and a significant lead one year can turn into a trailing position the very next.


2 points by aikiai 2 days ago | link

Years divisible by 100 are normally not leap years, UNLESS they are divisible by 400.

http://en.wikipedia.org/wiki/Leap_year#Algorithm

I knew about the 100 rule, but didn't realize the 400 rule until I looked into it. The response is comical, but you can understand where the user got confused.
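The full Gregorian rule is short enough to sketch in code (a Python illustration, not part of the original thread):

```python
def is_leap(year: int) -> bool:
    # Divisible by 4: leap year...
    # ...unless divisible by 100 (century years are normally not leap years)...
    # ...unless also divisible by 400 (then they are leap years after all).
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print(is_leap(2000))  # True  -- divisible by 400, so the user was mistaken
print(is_leap(1900))  # False -- century year not divisible by 400
print(is_leap(2008))  # True  -- ordinary year divisible by 4
```

The 2000 case is the tricky one: it hits both exceptions at once, which is exactly where the user's intuition went wrong.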


1 point by bigthboy 2 days ago | link

Well, that was certainly entertaining to read... I can only imagine the user on the other end being like... =O touché... I also learned how the calendar came to be because of that! So, another plus.

Does anyone else think it's interesting that throughout history there have been numerous "this isn't working for me, let's just skip some days and it'll be okay!" moments? Heck, this may not even really be the year 2008.


1 point by mhb 2 days ago | link

They knew that a lunation (the time from one full moon to the next) was 29 1/2 days long, so their lunar year had a duration of 364 days. This fell short of the solar year by about 11 days.

Isn't something wrong here?


2 points by ComputerGuru 2 days ago | link

Yeah. The lunar year is about 354 days, as is the case with the modern Islamic calendar.
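The arithmetic behind that correction, as a quick check (not from the thread itself):

```python
lunation = 29.5             # days from one full moon to the next
lunar_year = 12 * lunation  # twelve lunar months
print(lunar_year)           # 354.0 -- not the 364 quoted from the article

# Shortfall against the tropical (solar) year of ~365.2422 days:
print(round(365.2422 - lunar_year, 1))  # 11.2 -- the "about 11 days"
```

So the "about 11 days short" figure only works with a 354-day lunar year; the article's 364 appears to be a typo.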


1 point by m_eiman 2 days ago | link

What I'd like to read is the customer's reply to the reply :)


3 points by ComputerGuru 2 days ago | link

If that were me, my reply would be to go out and buy another DEC :)


1 point by StrawberryFrog 2 days ago | link

"By imperial decree, one year was made 445 days long to bring the calendar back in step with the seasons."

Er, is that right?


3 points by Retric 2 days ago | link

Yes. http://en.wikipedia.org/wiki/46_BC


1 point by ComputerGuru 2 days ago | link

Another interesting tidbit for the comments:

The year 1582 didn't have Oct. 5th through Oct. 14th (inclusive). They just skipped from the fourth of October to the fifteenth!


4 points by dfranke 2 days ago | link

Which resulted in widespread violence when landlords tried to make their tenants pay a full month's rent.
