-----------------------------------
Everything below this point is freeze framed, a still picture if you will, from 10/09/2000.
-----------------------------------

The next dates to be on the lookout for any problems relating to the Y2K bug are February 29, 2000 (because it's the first leap year of the double aughts, and, as explained below, some systems may not treat 2000 as a leap year) and October 10, 2000 (because it's the first date in the year that fills every digit position in standard date coding, e.g. 10/10/2000 vs. 10/9/2000).

Anonymous Quotation... "Trust the computer industry to shorten the term 'Year 2000' to 'Y2K'. It was that kind of thinking that got us into this situation in the first place."
What is the Year 2000 Problem Anyway?

Many computer systems use a two-digit date format (mm/dd/yy), and experts believe that these systems could interpret the year 2000 (00) as 1900. This means that mission-critical information systems, and the institutions that depend on them, could be severely compromised by the turn of the century. Everything from financial records to hotel reservations would be affected. This scenario of information meltdown has been called the Year 2000 problem, or Y2K. Computer systems that cannot correctly process dates beyond December 31, 1999 are at risk of failure one second after midnight on that date. (A short sketch of the two-digit pitfall appears at the bottom of this page.)

In addition, some systems may not recognize that the year 2000 is a leap year. A year evenly divisible by 4 is a leap year, unless it is also evenly divisible by 100; but a year evenly divisible by 400 is a leap year after all. So 1900 was not a leap year, but 2000 is. (See the leap-year sketch at the bottom of this page.)

Some Background

Back in the era of mainframe computers, or "big iron" (a.k.a. the Iron Age), programmers used an abbreviated date format to conserve precious memory. That was back when computers no more powerful than a calculator filled entire rooms and memory was really expensive. The US government decided to allot only two characters for the year rather than four, to save money, figuring the problem would be solved by the turn of the century. Other organizations followed suit, making it almost a standard (e.g. Apple/Macintosh). Using two digits to denote the year, e.g. 99 for 1999, turned out to be a bad idea.

Year 2000 problems are already beginning to rear their ugly heads. Do you realize most companies' fiscal year ends June 30? That means fiscal year 2000 begins July 1, 1999, and systems that work with fiscal-year dates will be handling "00" dates from that day on, so you can expect Y2K problems to begin six months earlier than most people expect. Keep your records in hard copy format. This may just be your saving grace.

Prez Sez and Vice Prez Sez

"Let Y2K be the last headache of the 20th century, not the first crisis of the 21st century." --Bill Clinton, State of the Union address, 1/19/98

"How could [Y2K] be a problem in a country where we have Intel and Microsoft?" --Al Gore, VP USA (please tell us he was joking, right?)
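To make the two-digit pitfall above concrete, here is a minimal sketch in Python (a hypothetical illustration, not code from any actual system) of how a program that stores only the last two digits of the year reads "00" as 1900, and how the common "windowing" repair guesses the century from a pivot value:

    # Hypothetical illustration of the two-digit year problem.
    # A record stores the year as 0..99 with no century information.

    def naive_year(two_digit_year: int) -> int:
        # The classic mistake: assume every date belongs to the 1900s.
        return 1900 + two_digit_year

    def windowed_year(two_digit_year: int, pivot: int = 30) -> int:
        # A common repair ("windowing"): values below the pivot are
        # treated as 20xx, the rest as 19xx. The pivot is a guess.
        return (2000 if two_digit_year < pivot else 1900) + two_digit_year

    print(naive_year(99))     # 1999 - fine
    print(naive_year(0))      # 1900 - the Y2K bug: "00" read as 1900
    print(windowed_year(0))   # 2000 - windowing guesses the right century
    print(windowed_year(99))  # 1999

Note that windowing only postpones the problem: once real years pass the chosen pivot, the guess is wrong again.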
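The leap-year rule described above is also easy to get wrong in code. Here is a small Python sketch (again purely illustrative) of the full divisible-by-4 / 100 / 400 rule, next to the buggy shortcut that stops at the century exception and therefore misses February 29, 2000:

    # Hypothetical sketch of the Gregorian leap-year rule.

    def is_leap(year: int) -> bool:
        # Divisible by 4: leap year, unless divisible by 100,
        # unless also divisible by 400.
        if year % 400 == 0:
            return True
        if year % 100 == 0:
            return False
        return year % 4 == 0

    def broken_is_leap(year: int) -> bool:
        # A common buggy shortcut: applies the century exception
        # but forgets the 400-year rule, so it wrongly rejects 2000.
        return year % 4 == 0 and year % 100 != 0

    print(is_leap(1900), is_leap(2000))   # False True
    print(broken_is_leap(2000))           # False - why some systems miss 2/29/2000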