Josh Hodas, assistant professor of computer science at Harvey Mudd College, gives the following overview:

[Image: WARNING SIGN. Logo from a Web site of the U.S. Department of Commerce, an indication of the priority being given to heading off potential Y2K problems. Credit: U.S. Dept. of Commerce]

The "Y2K Problem" includes a whole range of problems that may persist for several years and result from the way some computer software and hardware represent dates--hence the name "Y2K," which stands for "Year 2000" (K is an abbreviation for "kilo," or 1,000). In short, because many computer systems store only the last two digits of the year in a date, you can't really tell in which century that date falls.

Until recently, this ambiguity hasn't mattered in most instances. Computers have manipulated dates relating to recent events, and so they have been able to treat all dates as belonging to the current century. As we get closer to the new millennium, however, many more systems will need to juggle dates from two different centuries, and they will have to be able to distinguish between them if they are to avert myriad failures.

Consider the new credit cards you might be carrying--many of which already have expiration dates in the next century. When you try to buy something, the credit card terminal must determine whether the card has expired. To do so, it runs a program that checks whether the expiration date is greater than the current date. If the card expires in 2003, the answer should obviously be yes. But if the system uses only two digits to represent the year, it will find that 03 is not greater than 99 and conclude that your card has already expired. This kind of problem--already encountered by some major credit card issuers--can plague any system that depends on date comparisons.
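
A minimal sketch of the flawed comparison, simplified to compare years only and assuming two-digit fields (the function name is hypothetical):

    def card_is_valid(expiry_yy, current_yy):
        # The terminal accepts the card only if the expiration
        # year is greater than the current year.
        return expiry_yy > current_yy

    # A card expiring in 2003, checked in 1999, with two-digit years:
    print(card_is_valid(3, 99))   # False -- the card is wrongly rejected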

In the situation above, there is at least one person standing by who can intervene when the problem occurs: the card owner or issuer. More disturbing, perhaps, are the "silent killer" versions of the Y2K problem, which can occur with any of the millions of embedded processors used in computers, toasters, cars, power plants and many other devices.

Next, consider a system that is required to run an internal safety test once a year. The system regularly checks whether the difference between the current date and the date of the last test exceeds 365 days. When a last-test date in this century is subtracted from a two-digit date in the next century, though, the result is negative, so it can never exceed 365, and the system concludes it is not yet time to perform the test. In fact, left as is, the system will not perform another safety check for roughly 100 years.
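
A sketch of that faulty elapsed-time check, again assuming two-digit years and a naive day count (the function and its inputs are hypothetical):

    def days_between(last_yy, last_day, now_yy, now_day):
        # Naive day count: 365 days per year of difference,
        # ignoring leap years for simplicity.
        return (now_yy - last_yy) * 365 + (now_day - last_day)

    # Last safety test on day 300 of '99; current date is day 30 of '00.
    elapsed = days_between(99, 300, 0, 30)
    print(elapsed)         # -36405: hugely negative
    print(elapsed > 365)   # False -- so the test is never triggered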

Why do these systems make such bad assumptions? In most cases, designers simply never imagined that the programs they were writing would still be in use so long--well into the 21st century. And at the time these systems were built, there were real practical advantages to using shorter, two-digit years: memory and storage were scarce and expensive.

There are many different ways to fix the problem. Ideally we could just rewrite all the offending programs and modify all the existing stored data. But in many cases a program is so old that the original "source code" (the human-readable form written by the programmer before it is translated into the machine-readable form the computer actually runs) has been lost, and it is impractical or impossible to modify the machine-readable form directly.

When the source code does exist, there may no longer be any "compilers" (the programs that translate source code into its machine-readable form) compatible with that version of the code's language. And even if a program could be successfully modified, changing all the stored data would be impractical: many programs that never use dates would also need to be changed, because the position of every field in a data file shifts when space for the additional date digits is inserted.

In the last case, where code can be changed but stored data cannot, it is sometimes possible to buy some time. The most popular technique is called "windowing": it takes advantage of the fact that many systems store information only about a relatively brief period, called the window. For example, at my college, which was founded in 1955, we can safely assume that any stored graduation date in the range from 00 to 55 refers to the next century. As a result, no ambiguity about graduation dates will arise for another 55 years, and with a little programming (sketched below), we can put off the larger problem until then. Of course, for recording students' birth dates, we must use a different window; for the birth dates of faculty members, yet another window is needed; and so forth.
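
A minimal sketch of windowing for the graduation-date example, assuming the stored field is just the two-digit year and using the founding year as the pivot:

    GRAD_PIVOT = 55  # the college was founded in 1955

    def expand_grad_year(yy):
        # Two-digit graduation years 00-55 cannot mean 19xx here,
        # so interpret them as 20xx; everything else is 19xx.
        return 2000 + yy if yy <= GRAD_PIVOT else 1900 + yy

    print(expand_grad_year(98))  # 1998
    print(expand_grad_year(3))   # 2003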

Another popular technique reuses the space allocated for a two-digit date more efficiently. Because the same space is used, other data doesn't move, and other programs that don't access the date fields don't need to be rewritten. This is possible because many older databases store numbers in a fairly inefficient representation called Extended Binary Coded Decimal Interchange Code (EBCDIC, pronounced "ehb-sih-dik"), which devotes a full byte to each decimal digit; the two bytes that held a two-digit year can therefore hold a binary number large enough to represent a four-digit one.
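
A minimal sketch of one such repacking (the scheme shown is illustrative, not a standard):

    import struct

    def pack_year(year):
        # Store the full four-digit year as a big-endian 16-bit
        # integer, reusing the same two bytes the two EBCDIC
        # digit characters occupied.
        return struct.pack(">H", year)

    def unpack_year(field):
        return struct.unpack(">H", field)[0]

    field = pack_year(2003)
    print(len(field))          # 2 -- the field width is unchanged
    print(unpack_year(field))  # 2003

The catch, of course, is that every program that reads or writes the date field must be updated to use the new encoding, even though the stored records themselves keep their layout.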

Although there are many proposed solutions floating around, the real problem is that it is hard to imagine how all the systems that need fixing can be fixed in the necessary time frame. Moreover, in the case of the millions of embedded processors, it is unclear how the fix might be disseminated.

So, should you sell your house and move to a cabin with a 10,000-gallon fuel tank and a bunker full of food? As bad as all this sounds, many feel that the doom-and-gloom predictions are really just variations on millennialist fever. Although there will certainly be bumps in the road, most experts believe that the worst problems will be avoided. Many major industries and government agencies have already run tests, setting the clocks on their computers forward to various dates in the next century to see what would happen. In most important cases--including banks--nothing did.