Practical Aspects of the Year 2000
as it relates to software
The primary problem with Y2K is the combination of three issues:
Issue 3) is not usually fixable, so we are left with addressing 1) and/or 2). If source code is available, the job is substantially easier; otherwise reverse engineering or binary patches are needed.
Issue 2 Alone
[Cf. Appendix 1]
Dates that are always within 40 years of "now" can be interpreted simply, by adopting either a 20-year lifetime (yeah, right - that's what we did in 1974) or, better, a moving window based on "now", flagging any date that falls outside the total 80-year window.
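The moving window can be sketched in a few lines. This is my own illustration of the idea, not the author's routine: each two-digit year is placed in whichever century puts it within 40 years of "now", and anything that cannot be placed is flagged.

```python
def interpret_yy(yy, now_year):
    """Place a two-digit year in the century that puts it within
    +/- 40 years of now_year; return None to flag a date that falls
    outside the 80-year window."""
    century = now_year - now_year % 100
    for candidate in (century + yy - 100, century + yy, century + yy + 100):
        if abs(candidate - now_year) <= 40:
            return candidate
    return None        # outside the window - flag for human attention
```

With "now" taken as 1999, "05" resolves to 2005 and "60" to 1960, while "40" (which would be 59 years back or 41 years ahead) is flagged.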
Issue 1 Alone
Unless the date interpretation mathematically adjusts the year by 1900, and it can be extended to 3 digits (packed signed dates sometimes allow this), this option is not possible. It's worth a shot, but don't rely on it.
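This "years since 1900" scheme has a well-known precedent: C's struct tm stores tm_year as years since 1900, so the year 2000 is simply 100 and the arithmetic never changes. A minimal sketch of the idea (function names are mine):

```python
def year_from_offset(offset):
    """0 -> 1900, 99 -> 1999, 100 -> 2000; same arithmetic either
    side of the century boundary."""
    return 1900 + offset

def offset_from_year(year):
    """Store a year as its offset from 1900; up to 3 decimal digits
    covers 1900-2899."""
    offset = year - 1900
    if not 0 <= offset <= 999:
        raise ValueError("year out of range for a 3-digit offset field")
    return offset
```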
Issue 1 and 2 combined
Some people believe that date fields must be widened in order to accommodate the extra digits. For legacy mainframe or Cobol systems, this is usually not true. If the year is stored as display (two bytes for 2 digits) then a poor approach is to adopt a packed decimal format; a better one is binary. [Cf. Appendix 2]
Another advantage of binary fields is that they are always valid! Those favouring packed fields might say that this is a disadvantage, but it allows "bilingual" programs that can work with either the old or new data format, and allows "conversion by osmosis" rather than the "big bang" approach with its attendant backout problems, especially for on-line systems. "Bilingual" programs allow greater overlap in project management and less slippage if problems arise [cf. Appendix 3].
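A bilingual read routine can be sketched as follows. This is my own illustration under one assumption: the new format is a 16-bit binary year in the same two bytes that used to hold two display digits. A binary year in the range 1900-2155 begins with byte 0x07 or 0x08, never with an ASCII digit (0x30-0x39), so the two formats cannot be confused; the fixed pivot of 60 is purely for illustration.

```python
def read_year(field):
    """Bilingual read of a 2-byte year field.

    Legacy format: two ASCII digit bytes, e.g. b"97".
    New format: a 16-bit big-endian binary year, e.g. b"\x07\xcd" for 1997.
    """
    if field.isdigit():                    # old display data, e.g. b"97"
        yy = int(field)
        return 1900 + yy if yy >= 60 else 2000 + yy   # illustrative pivot
    return int.from_bytes(field, "big")    # new binary data
```

A program converted to use such a routine keeps working against unconverted files, which is what permits the "conversion by osmosis" described above.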
The "Groundhog Day" Approach
For some internal systems, consider either running with the same date all the time, or with dates 100 or 28 years in the past. Going back 28 years keeps day-of-week calculations working.
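The 28-year figure works because 28 is the weekday cycle (7) times the leap cycle (4): any span of 28 years containing exactly 7 leap days is 10,227 days, which is an exact multiple of 7, so the shifted date lands on the same weekday (provided no century non-leap year such as 1900 or 2100 falls in the span). A quick check:

```python
import datetime

def shift_28(d):
    """Shift a date back 28 years; within 1928-2099 this preserves
    both the day of the week and the leap-year pattern."""
    return d.replace(year=d.year - 28)
```

For example, 31 December 1999 and 31 December 1971 are both Fridays, and 29 February 2000 maps cleanly onto 29 February 1972.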
Should Not Be an Issue
The year 2000 is a leap year, which often satisfies simple algorithms [cf. Appendix 4].
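The point is that the naive divide-by-4 test happens to agree with the full Gregorian rule for 2000 (a century divisible by 400), though not for 1900 or 2100:

```python
def is_leap(year):
    """Full Gregorian rule: every 4th year, except centuries,
    except every 4th century."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def naive_is_leap(year):
    """The simple algorithm many old programs used."""
    return year % 4 == 0
```

The two functions agree on 2000 but disagree on 1900 and 2100, which is why the simple algorithm gets a free pass at the millennium.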
Dates that are within the current epoch can be simply fixed by doing what people do: make an intelligent guess based on the distance from "now" to each possible interpretation of the date. My MasterCard expiry date of "10/00" could be interpreted as either the year 1900 or 2000. Common sense (a commodity that computers sadly lack) says that 2000 is likely. Data entry dates (orders, invoicing, cheques, ...) are classic candidates for this approach.
In a couple of years' time, a personnel system holding my mother's and my daughter's year of birth as "00" would need to interpret it as 1900 for my mother but 2000 for my daughter. So dates of birth cannot be universally fixed this way.
Some people have difficulty working with binary data; I don't know why. The reason to choose binary over packed should be clear once it is explained. Many systems treat "00" differently to " ", and occasionally "99" has a special meaning. All these nuances can easily be handled with binary fields. Incidentally, it is usually better NOT to simply store the year as a straight binary number, but to use an offset. This allows character blanks to be interpreted as binary zeros and dates in the current epoch to show as characters, which makes testing and debugging very easy (especially with ASCII character sets) using simple file dumps. For problems with widening the data, see Appendix 3.
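One possible offset scheme, my guess at the kind of thing meant here rather than the author's own: shift the binary year by the ASCII blank (0x20), so that a legacy all-blank field decodes as binary zero ("no year"), and many years in the current epoch encode to printable bytes that can be eyeballed in a raw file dump.

```python
BLANK = 0x20    # ASCII space
BASE = 1900     # assumed epoch base; the text does not specify one

def encode_year(year):
    """One-byte binary year: offset from BASE, shifted by BLANK so a
    legacy blank byte reads back as zero. 1916-1994 encode to
    printable ASCII, easing debugging with file dumps."""
    return (year - BASE) + BLANK

def decode_year(byte):
    """A legacy blank (0x20) gives offset 0, i.e. "no year"; special
    legacy values like "00" or "99" could be mapped to sentinels in
    the same way."""
    offset = byte - BLANK
    return None if offset == 0 else BASE + offset
```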
Problems with the classical approach of widening the field (or doing a simple cut-over to a new format) are that all programs that use the field must be converted simultaneously with the file (and all generations of it). Conversion then proceeds at the pace of the slowest part of each conversion exercise, end to end (one program may need converting several times, once for each file it shares). The lack of overlap creates a very long critical path, with a slip in any phase delaying the whole project.
I have been considering Y2K conversion for over 20 years, and have developed elegant yet simple algorithms, so that the testing and conversion necessary for bilingual programs is very easy. In most languages it can be done in a couple of lines of code. Once the routines are written, they can be duplicated throughout the system.
Once the routines are coded, the approach is to first make all programs bilingual for file reading only, then gradually convert the files afterwards. If a mistake is made, it usually affects only a small part of the system.
Whilst we may curse Julius Caesar for leap years, we can thank Pope Gregory for the quirk of destiny that 2000 is a leap year; any of the three centuries earlier or later would not have been. I find it stranger still that some programs treat 2000 as NOT a leap year.