I thought some of you with a comp-sci or math background would find this discovery I made interesting!
Thursday, March 17, 2005 marks a very distinctive date in computer history. Unix-style computer clocks count the number of seconds elapsed since midnight UTC on January 1, 1970, a reference point known as the "epoch."
However, at 19:58:31 US Central time on March 17, for the first time in the 35 years since the epoch, clocks on these systems will roll over to the longest-ever run of repeating digits: 1,111,111,111 seconds.
If you have software that supports milliseconds, tonight you can see an even more amazing timestamp of 1,111,111,111,111: exactly one trillion, one hundred eleven billion, one hundred eleven million, one hundred eleven thousand, one hundred eleven milliseconds since 1970!
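If you want to check this yourself, here's a quick Python sketch (the variable names are my own):

```python
from datetime import datetime, timezone

# The "all ones" timestamp, in seconds since the Unix epoch
ts = 1_111_111_111
moment = datetime.fromtimestamp(ts, tz=timezone.utc)
print(moment)  # 2005-03-18 01:58:31+00:00
```

That's 01:58:31 UTC on March 18, which is 19:58:31 the evening of March 17 in US Central time (UTC-6).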
It is interesting to note that you will never see a clock readout of 2,222,222,222 seconds (which would fall on June 1, 2040, US Central time) on these systems, because that value is out of range: most software stores the timestamp as a signed 32-bit integer, which tops out at 2,147,483,647 (January 19, 2038). Unpatched systems would therefore cease to keep time correctly in 2038, which gives humanity plenty of time to prepare for the next "Y2K bug."
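The 2038 limit is easy to see in a few lines of Python (a sketch, not how any particular system implements its clock):

```python
from datetime import datetime, timezone

# Largest value a signed 32-bit integer can hold
INT32_MAX = 2**31 - 1  # 2,147,483,647
rollover = datetime.fromtimestamp(INT32_MAX, tz=timezone.utc)
print(rollover)                       # 2038-01-19 03:14:07+00:00
print(2_222_222_222 > INT32_MAX)      # True: the "all twos" readout can't fit
```

One second past that moment, a signed 32-bit counter wraps around to a negative number, which naive software would interpret as a date in 1901.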
Now, if you miss this historical numerological event, you can still look forward to one more in the coming years. On Friday, February 13, 2009, the count reaches 1,234,567,890 seconds since the epoch.
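You can confirm that date (and that it really lands on a Friday the 13th, fittingly) with the same one-liner approach:

```python
from datetime import datetime, timezone

# The "1234567890" milestone, in seconds since the Unix epoch
moment = datetime.fromtimestamp(1_234_567_890, tz=timezone.utc)
print(moment.strftime("%A, %B %d, %Y %H:%M:%S UTC"))
# Friday, February 13, 2009 23:31:30 UTC
```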
(For the Unix geeks: UNIX was created at Bell Labs around 1969-70, which is why its timekeeping epoch was set to the start of 1970.)