
perl -e 'use Date::Parse; print localtime((0x80000000 +
  str2time("2000-01-01 00:00:00 GMT")) / 2) . "\n"'
Wed Jan 9 17:37:04 2019
Previously, previously, previously, previously, previously, previously, previously.
Damnit, I had to spend Y2K making sure all my computers kept running at midnight. With luck I'll either be retired or dead come 2038.
How much of a problem do you think this is going to be?
The time_t problem is essentially the off_t problem. Enough of the computer industry realised there would be a problem and held the LFS summit in 1996. Most unices had largefile support by 2000-2002, and it flipped around to become default on most systems (e.g. 64-bit APIs being default, and 32-bit APIs being hidden behind a flag) by around 2006.
As far as I've found, most unices had full 64-bit time_t support by around 2014, so the problem is now downgraded to getting at and repairing things that cannot possibly go wrong. It's unlikely to be the "all programs on all computers might go wrong, we're just not sure" headless chicken panic of Y2K.
I don't particularly think the Y2038 problem will be a real problem. The set of legacy systems susceptible to it that were not already replaced during the Y2K firedrill is probably pretty small.
I just find it somewhat shocking that we crossed that halfway point, because it feels like yesterday that the Y2038 bug was just impossibly far into the future -- it was just a punchline to Y2K jokes, right?
I have seen a number of databases and file formats that expect timestamps to fit in ten digits/32 bits. In general, this is easier to fix than the Y2K issue though because it is only a boundary level issue. Y2K fixes were nastier because there was logic that expected to deal with the two digit years, not simple comparisons (timestamps are usually just compared against each other).
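For the curious, here is a minimal C sketch of that boundary failure mode (the names and values are just illustrative): a field declared as a signed 32-bit integer wraps once the real epoch value passes 2^31 - 1 (strictly speaking the narrowing conversion is implementation-defined, but common platforms wrap), and a plain ordering comparison then gives the wrong answer.

#include <stdio.h>
#include <stdint.h>

int main(void) {
    int64_t before = 2147483647;      /* 2038-01-19 03:14:07 UTC, the last 32-bit second */
    int64_t after  = 2147483648;      /* one second later */

    int32_t b32 = (int32_t) before;   /* still 2147483647 */
    int32_t a32 = (int32_t) after;    /* wraps to -2147483648 on common platforms */

    printf("64-bit: is 'after' later than 'before'? %s\n", after > before ? "yes" : "no");
    printf("32-bit: is 'after' later than 'before'? %s\n", a32 > b32 ? "yes" : "no");
    return 0;
}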
Y2038 is interesting, but what will be really interesting is around Y2050 when the Y2K hacks in truly ancient code nobody is willing to touch start to break.
I don't believe Linux has 64-bit time_t support on 32-bit platforms, although people are working on it. And there are apparently still many 32-bit devices out there (home routers and such).
It's not just about the operating system.
Many (embedded) PC BIOSes cannot handle timestamps larger than 32 bits. And that can be much harder to update in the field than the OS. I know of applications where these systems are intended to run longer than 20 years.
See below for the 2038 problem in action on a Raspberry Pi 3 running an up-to-date version of Raspbian. Raspbian uses a 32-bit linux kernel and userland.
Sounds like linux is working to follow openbsd and make time_t 64-bit even on 32-bit platforms. Hope they get there soon. Somehow I suspect there are still a lot of x86/arm/sparc/ppc/mips boxes now running 32-bit code. Android anybody?
[pi@raspberrypi] ~$ uname -a
Linux raspberrypi 4.14.79-v7+ #1159 SMP Sun Nov 4 17:50:20 GMT 2018 armv7l GNU/Linux
[pi@raspberrypi] ~$ cat test.c
#include <stdio.h>
#include <time.h>
int main(int argc, char** argv) {
printf("sizeof(time_t) = %zu\n", sizeof(time_t));
return 0;
}
[pi@raspberrypi] ~$ cc test.c
[pi@raspberrypi] ~$ file a.out
a.out: ELF 32-bit LSB executable, ARM, EABI5 version 1 (SYSV), dynamically linked, interpreter /lib/ld-linux-armhf.so.3, for GNU/Linux 3.2.0, not stripped
[pi@raspberrypi] ~$ ./a.out
sizeof(time_t) = 4
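For what it's worth, the fix on 32-bit glibc goes through the same feature-test-macro mechanism the large-file transition used: glibc 2.34 and later let a 32-bit build opt in to a 64-bit time_t via _TIME_BITS=64 (which in turn requires _FILE_OFFSET_BITS=64). A sketch, assuming that newer toolchain; on the Raspbian image above the macro is simply ignored and this still prints 4:

#define _FILE_OFFSET_BITS 64   /* 64-bit off_t: the 1990s large-file opt-in */
#define _TIME_BITS 64          /* 64-bit time_t: honoured by glibc 2.34+, ignored by older glibc */
#include <stdio.h>
#include <time.h>

int main(void) {
    printf("sizeof(time_t) = %zu\n", sizeof(time_t));   /* 8 on 32-bit with glibc >= 2.34, otherwise 4 */
    return 0;
}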
All of that logic is sound.
But what about the software running on {Google,Amazon,Microsoft,Apple,etc}'s cloud systems that depends on a library written in the mid-1990s that blithely shoves a time_t into a short?
See also: https://gcn.com/articles/2018/08/24/software-bill-of-materials.aspx
(Full disclosure: I went to school with Allan, and he and I chatted about this topic earlier this week, since he hasn't got a lot to do right now, given he's furloughed. He's smart about this shit, I'm not: I just push the buttons that make the daemons move.)
I did some digging (google, my email history over the past year or so), but because of the current state of business in the US federal government, this is the closest link I could get to The Actual Thing: https://www.ntia.doc.gov/SoftwareTransparency
I'm pretty sure that Allan's really angry that he can't, by law, talk about this right now. But I agree with him that it's really a thing that intelligent people should be worried about.
PS, I applaud your use of "unices" as the plural. (It's linguistically questionable, but I dig it.)
And today is the 15th anniversary of Unix time being half over. I guess the coincidence is because (a) Unix time started at the beginning of a year, (b) Y2K was also at the beginning of a year, and (c) Unix time is within 0.07% of an integral number of years long - 68 years plus 18 days.
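A quick back-of-the-envelope check of those numbers, using the 365.2425-day Gregorian average year (plain arithmetic, nothing more):

#include <stdio.h>

int main(void) {
    double range_days = 2147483648.0 / 86400.0;         /* 2^31 seconds, in days: ~24855.13 */
    double years      = range_days / 365.2425;          /* ~68.05 Gregorian years */
    double leftover   = range_days - 68.0 * 365.2425;   /* ~18.6 days past 68 whole years */

    printf("2^31 seconds = %.2f days = %.3f years\n", range_days, years);
    printf("             = 68 years + %.1f days (%.3f%% of the whole range)\n",
           leftover, 100.0 * leftover / range_days);
    return 0;
}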
Oh hey, Wikipedia has a cute animation:
It was two weeks to 19 Jan 2038, nobody was particularly worried, and that's when I received the DEC tape pack in the mail. It was labelled just "you'll need this- love, ken" and it contained a small (pre-ANSI) C program, which, once a suitably old compiler was produced, generated a PDP11 a.out. As far as I could tell, the program contained a highly compressed neural network trained to predict a way out of the forthcoming time_t apocalypse. I was particularly impressed, as neither LZ compression, nor backpropagation, had been invented at the time the tape was cut. Did I mention the tape had been sent from 1968? Mr. Thompson always was a joker.
My Y2K story: I had to spend midnight in the call center of a local Internet company. The upside was I got a very good bonus, and they let us use the building roof (summer down here) to have a nice dinner with my wife.
Got to watch a spectacular fireworks show from a basically impossible to get location!
(And of course nothing bad happened.)
We had a (sort of) Y2.019K issue this week. Some software is still writing 2-digit years and ours interprets 19 as the start of a 4-digit year.
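Two-digit years stay a trap even with standard parsers: POSIX strptime's %y pivots at 69 (00-68 become 2000-2068, 69-99 become 1969-1999), so things only line up if every parser on the path agrees on that rule; the in-house rule described above is evidently a different one. A small illustration:

#define _XOPEN_SOURCE 700   /* strptime() is POSIX/XSI */
#include <stdio.h>
#include <string.h>
#include <time.h>

int main(void) {
    struct tm tm;
    memset(&tm, 0, sizeof tm);
    /* POSIX %y: 00-68 -> 2000-2068, 69-99 -> 1969-1999 */
    if (strptime("19-01-09", "%y-%m-%d", &tm) != NULL)
        printf("\"19\" parsed as year %d\n", tm.tm_year + 1900);   /* prints 2019 */
    return 0;
}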