2100 and 2400 will be a shitshow
Not as much as 2038
Yeah, that's a different shitshow, but agreed it's likely to be worse - like Y2K, the effects are smeared out before and after the date.
Why?
Because of the Year 2038 problem.
32-bit systems will stop working. The Unix timestamp, which counts the seconds since the first second of 1970, will exceed the maximum of a signed 32-bit integer (2,147,483,647) on 19 January 2038. Bad things will follow.
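The overflow is easy to see with a quick sketch (Python, stdlib only - the constant names are mine, not anything standard):

```python
from datetime import datetime, timezone

INT32_MAX = 2**31 - 1  # largest value a signed 32-bit time_t can hold

# The last second a signed 32-bit timestamp can represent:
last = datetime.fromtimestamp(INT32_MAX, tz=timezone.utc)
print(last)  # 2038-01-19 03:14:07+00:00

# One second later the counter wraps to -2**31, i.e. back to 1901:
wrapped = datetime.fromtimestamp(-(2**31), tz=timezone.utc)
print(wrapped)  # 1901-12-13 20:45:52+00:00
```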
This has already been patched on all 64-bit OSes though - whatever 32-bit systems are still in existence in another 15 years will just roll their dates back 50 years and add another layer of duct tape to their jury-rigged existence
2038 will certainly be a shit show
Yeah but I'll be dead so not my problem lmao
Why
2100 is not a leap year (divisible by 100 but not 400). 2400 is a leap year (divisible by 400). Developing for dates is a minefield.
Because they're both 0 === year % 4, but the century rule makes 2100 a non-leap year while the 400-year rule makes 2400 a leap year
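The full Gregorian rule fits in one line - a minimal sketch (function name is mine):

```python
def is_leap(year: int) -> bool:
    # Divisible by 4, except century years, unless divisible by 400.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print(is_leap(2024))  # True  (divisible by 4, not a century)
print(is_leap(2100))  # False (divisible by 100 but not 400)
print(is_leap(2400))  # True  (divisible by 400)
```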
Luckily, none of us will be there.
Nah.
Same thing happened in 2000 and it was a mouse’s fart.
Because of months of preparation. I know, I was doing it.
And now that every time library has been updated, we're safe until our grandchildren reimplement those bugs in a language that has not yet been invented.
I've already seen reimplementation of 2 digit dates here and there.
LOL fuck those guys.
Fortunately I will not be involved. Hopefully I can make something from 2038 though.
You’re not the only one foreseeing a nice consultant payday there.
I went to uni in the mid 90s when Y2K prep was all the rage, went back to do another degree 20 years later. It was interesting to see the graffiti in the CS toilets. Two digits up to about 1996, four digits for a decade, then back to two.
Won’t the computer’s clock reset every time you go to sleep and stop cranking the power generator?
Yeah, who knows if our computers will even still be around by either date
Then there's my code, which didn't even survive the time change.
In every project I've ever worked on, there's been somebody who must have been like, "HurDur Storing timestamps in UTC is for losers. Nyeaahh!"
And if I ever find that person, I'm going to get one of those foam pool noodles, and whack him/her over the head with it until I've successfully vented all my frustrations.
I just use a float between 0 and 1 with 0 being 1970 and 1 being the predicted heat death of the universe.
The only time using UTC breaks down is when any sort of time change gets involved.
If I say I want a reminder at 9am six months from now and you store that as UTC, a daylight saving change will mean I get my reminder an hour early or late, depending on where in the world I am
But wouldn't you calculate the time in the future in the right time zone and then store it back as UTC?
It depends on the application.
I don't remember all the specifics but this is the blog post I refer to when this topic comes up
https://codeblog.jonskeet.uk/2019/03/27/storing-utc-is-not-a-silver-bullet/
So, TL;DR: time zone rules might change unexpectedly in the future. The solution presented in the article is to store both UTC and local time, so the application can easily adjust itself if such a change happens.
Your code made it to the time change!!?
Same... The change from 12 to 1
People who haven't had a birthday in almost four years are like
.
Programming aside, where I live in Southern Europe we have a tradition according to which leap years bring bad luck. After 2020, I don't know what to expect... nuclear apocalypse maybe?
Always, always, always, without taking any shortcuts, use a tzinfo library for your language.
Anyone who doesn't use standardized libraries for tz should be summarily tried.
I worked in broadcasting (programming broadcasting applications); everything is done with PTP (Precision Time Protocol) and TC (timecode) in video. We had to support leap seconds - it's not that easy, but in the end you insert black frames for 1s and that's it.
I hope leap days are handled with a bit more sophistication!
Insert black frames for 24 hours and you're good to go!
Haha yes, no problem with those 😁
I'm not worried about my code, I'm (very slightly) worried about all the date libraries I used because I didn't want to code that shit again for the billionth time.
Your comment made me go look at the source for moment.js. It has "leap" 13 times and the code looks correct. I assume they test stuff like this.
Yeah, I'm generally using the common date/time libraries in most (if not all) languages and I'm pretty sure they've all been through more than 1 leap year at this point. I just never 100% trust the code I don't control - 99.9% maybe, but never 100.
I just never 100% trust the code I don't control
I never 100% trust the code I do control. Partially because a lot of it is inherited but also because I know corners were cut but I can't always remember when and where
Yeah... I patched some unit tests...
Before it was 50/50 that they'd fail on leap day, but after the patch it's 50/50.
I'm not worried at all - I love me some tz database.
I hope the Home Assistant guys already have this covered, because I wasn't using it 4 years ago to know
Programmer Humor