I've got some tests in active code bases that use the end of 32-bit Unix time as "we'll never get there". That's not because the devs were lazy; these tests date from when that was the best they could possibly do. They're on track to be cycled out well before then (hopefully this year), so hopefully they'll be right that their code "won't get there"... but then there's testing and code I don't know about that makes the same assumption, and that may still be a problem.
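For anyone who hasn't run into it, the pattern looks roughly like this (a Python sketch; the test class and variable names are made up for illustration):

    import unittest
    from datetime import datetime, timezone

    END_OF_32BIT_UNIX_TIME = 2**31 - 1  # 2038-01-19 03:14:07 UTC

    class TokenExpiryTest(unittest.TestCase):  # hypothetical test
        def test_token_never_expires(self):
            # Sentinel chosen years ago as "far enough out that we'll never get there"
            token_expiry = END_OF_32BIT_UNIX_TIME
            now = datetime.now(timezone.utc).timestamp()
            # Passes today; starts failing on 2038-01-19
            self.assertGreater(token_expiry, now)

    if __name__ == "__main__":
        unittest.main()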
"End of Unix time" is under 12 years now, so, a bit longer than the time frame of this test, but we're coming up on it.
Now I feel bad for using (system foundation timestamp) + 100 years as the end of "forever" ownership relations in one of my systems. Looking now, there are only 89 years left. I think I should use nulls instead.
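Something like this, roughly (the launch date and names here are hypothetical; a Python sketch of the trap and the null alternative):

    from datetime import datetime, timedelta, timezone

    FOUNDATION = datetime(2014, 1, 1, tzinfo=timezone.utc)  # hypothetical system launch
    FOREVER = FOUNDATION + timedelta(days=365 * 100)        # "forever" = launch + 100 years

    def ownership_active(valid_until, now=None):
        now = now or datetime.now(timezone.utc)
        if valid_until is None:       # None/NULL unambiguously means "no end date"
            return True
        return now < valid_until      # the sentinel quietly becomes a real deadline

    print(ownership_active(FOREVER))  # True today; silently False after ~2113
    print(ownership_active(None))     # True, forever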
While there was a lot of FUD in the media, there were also a lot of scenarios that were actually possible but were averted thanks to a LOT of work and attention ahead of time. The fact that nothing of major significance happened should be seen, IMO, as a success of communication, warnings, and a lot of effort.
But before you judge the fix too harshly, I bet it’s just a quick and easy fix that will suffice while a proper fix (to avoid depending on external state) is written.
An impossibly short period of time after the heat death of the universe on a system that shouldn’t even exist: ERROR TIME_TEST FAILURE
"End of Unix time" is under 12 years now, so, a bit longer than the time frame of this test, but we're coming up on it.
https://en.wikipedia.org/wiki/Preparedness_paradox
Unlike the global climate catastrophe, unfortunately.