Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.2 9/18/84; site arizona.UUCP
Path: utzoo!watmath!clyde!cbosgd!ihnp4!arizona!gary
From: gary@arizona.UUCP (Gary Marc Levin)
Newsgroups: net.bugs,net.flame,net.puzzle
Subject: Re: Computer bugs in the year 2000
Message-ID: <20381@arizona.UUCP>
Date: Mon, 21-Jan-85 11:47:25 EST
Article-I.D.: arizona.20381
Posted: Mon Jan 21 11:47:25 1985
Date-Received: Tue, 22-Jan-85 06:17:49 EST
References: <820@reed.UUCP>
Organization: Dept of CS, U of Arizona, Tucson
Lines: 22
Xref: watmath net.bugs:502 net.flame:7929 net.puzzle:505

>       I have a friend that raised an interesting question that I immediately
> tried to prove wrong.  He is a programmer and has this notion that when we
> reach the year 2000, computers will not accept the new date.  Will the
> computers assume that it is 1900, or will it even cause a problem?
>  ...
> Spencer L. Bolles

The problem won't be the computers, but the software.  Some software is
bound to get it wrong by considering only the last two digits of the year.

Actually, the year 2000 will probably make some faulty software work
correctly for 100 years longer than it should.  2000 is the second-level
exception to the leap year rule.

    Leap years are those years divisible by 4,
    EXCEPT those divisible by 100,
    EXCEPT those divisible by 400.

Programs that assume that all multiples of 4 are leap years are wrong,
but the problem won't come up until 2100.
-- 
Gary Levin / Dept of CS / U of AZ / Tucson, AZ 85721 / (602) 621-4231