Y2K38: The New Y2K

January 16th, 2008 by Andrew

Remember the Y2K scare? Everyone was afraid the world would end when clocks rolled over to January 1st, 2000. Why? Because many antiquated computer systems were never programmed with knowledge of the year 2000. Years on these systems were stored as only two digits, so 76 meant 1976 and 00 meant 1900. What would happen when the year changed from 1999 to 2000? The computer would think it was 1900 again: a wraparound problem. Luckily, the world didn’t end, and no major catastrophes resulted.

Fast forward 38 years to 2038, when a similar problem will occur. Many systems keep track of time using Unix timestamps. A Unix timestamp is the number of seconds since the Unix epoch: January 1st, 1970 at 12:00am UTC. Unfortunately, these timestamps have traditionally been stored as 32-bit signed integers. The largest value a 32-bit signed integer can hold is 2,147,483,647, and January 1st, 1970 12:00am plus 2,147,483,647 seconds is January 19, 2038 at 3:14:07am. One second later, the timestamp will wrap around to -2,147,483,648, which counts backwards from 1970 and gives a date of about 8pm on December 13, 1901.
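
To make that arithmetic concrete, here is a minimal sketch (my own, not taken from any particular system) that prints the last second a 32-bit signed timestamp can represent and the second it wraps to. It assumes the C library’s gmtime() accepts times before 1970, as glibc’s does, and falls back to a short message if it doesn’t.

    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    /* Print a signed 64-bit second count as a UTC calendar date. */
    static void print_utc(const char *label, int64_t seconds)
    {
        time_t t = (time_t)seconds;
        struct tm *utc = gmtime(&t);
        if (utc == NULL) {
            printf("%-22s (out of range for this C library)\n", label);
            return;
        }
        char buf[64];
        strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S UTC", utc);
        printf("%-22s %s\n", label, buf);
    }

    int main(void)
    {
        print_utc("last valid second:", INT32_MAX);    /* 2038-01-19 03:14:07 */
        print_utc("first wrapped second:", INT32_MIN); /* 1901-12-13 20:45:52 */
        return 0;
    }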

There have already been reports of problems from this. Programs that use Unix timestamps and calculate dates more than 30 years into the future are likely to fail, or at the very least get confused. There doesn’t appear to be any easy solution to this problem. Switching from 32-bit to 64-bit timestamps would let us count seconds for roughly another 290 billion years. Unfortunately, the standard 32-bit timestamp is used in so much software and hardware that it’s impossible to update everything that relies on it. We’ll see what happens in 2038… or should I say 1901?
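
As a rough illustration of that more-than-30-years-out failure mode (again my own sketch, not code from any of those reports), the snippet below computes a date about 30 years ahead in 64-bit arithmetic, then shows what a 32-bit signed timestamp would be left holding. The truncated result shown is the usual two’s-complement behaviour; strictly speaking, the out-of-range conversion is implementation-defined in C.

    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    int main(void)
    {
        /* Roughly 30 years from now, computed safely in 64 bits. */
        int64_t now    = (int64_t)time(NULL);
        int64_t future = now + 30LL * 365 * 24 * 60 * 60;

        if (future > INT32_MAX) {
            /* What a 32-bit signed time_t would end up holding. */
            int32_t wrapped = (int32_t)future;
            printf("64-bit timestamp: %lld\n", (long long)future);
            printf("32-bit timestamp: %ld (a date before 1970)\n", (long)wrapped);
        } else {
            printf("Still fits in 32 bits... for now.\n");
        }
        return 0;
    }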

Comments

2 Responses to “Y2K38: The New Y2K”

  1. Fearless_Fool

    04/25/2009 at 1:02 pm

    I’m missing something. 2^32 seconds is 136 years. 1970 + 136 = 2106, not 2038 (68 years). Maybe unix only gives you 30 bits of time?

  2. @Fearless_Fool you’re half right here. I updated the post to reflect what will actually happen. The Unix timestamp isn’t stored as an unsigned integer as I had previously assumed; it’s a signed integer. The maximum positive value of a 32-bit signed integer is 2,147,483,647 (01111111 11111111 11111111 11111111 in binary). When the timestamp ticks over to 10000000 00000000 00000000 00000000 in binary, the decimal value becomes -2,147,483,648, meaning January 1st, 1970 minus 2,147,483,648 seconds, which puts the date back in 1901. The quick sketch below shows how those two bit patterns read as signed values.
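
     It’s a toy example rather than code from any real system; note that casting an out-of-range unsigned value to int32_t is technically implementation-defined in C, though two’s-complement machines give the result shown.

         #include <stdio.h>
         #include <stdint.h>

         int main(void)
         {
             uint32_t max_pattern  = 0x7FFFFFFFu; /* 01111111 11111111 11111111 11111111 */
             uint32_t wrap_pattern = 0x80000000u; /* 10000000 00000000 00000000 00000000 */

             printf("%ld\n", (long)(int32_t)max_pattern);  /* prints  2147483647 */
             printf("%ld\n", (long)(int32_t)wrap_pattern); /* prints -2147483648 */
             return 0;
         }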
