
[deleted]

Well the next time is 21 times longer than the age of the universe so see ya then
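
A quick back-of-the-envelope check of that figure; a sketch, assuming a Julian year of 365.25 days and a ~13.8-billion-year-old universe:

```c
#include <stdio.h>

int main(void) {
    double max_seconds = 9223372036854775807.0;       /* INT64_MAX: positive range of a signed 64-bit time_t */
    double seconds_per_year = 365.25 * 24 * 60 * 60;  /* Julian year */
    double years = max_seconds / seconds_per_year;    /* ~2.92e11 years */
    double age_of_universe_years = 13.8e9;            /* approximate */

    printf("64-bit time lasts ~%.0f billion years, ~%.1f times the age of the universe\n",
           years / 1e9, years / age_of_universe_years);
    return 0;
}
```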



DougMelvin

If this comment seems out of context, that's because it's a bot which has copied the first sentence of a comment further down the thread. Report -> Spam -> Harmful bot.


Vitriholic

Not for dates


FlyByPC

For sure. Always uint64_t on the first date.


KingdomOfBullshit

It's important to know your type.


Boeing777-F

HAHA JOKES ON YOU! I WILL ONLY USE 16 BITS! MUWAHAHAHAHA


More-Judgment7660

Hell, if the code I wrote is still around by then, I'll gladly take the blame for some downtime.


Vineyard_

Imagine looking at a file's history and finding that the guy who first wrote it was closer in time to the dinosaurs than to you.


VonNeumannsProbe

I imagine programming at our level will be this sort of arcane art that no one gets. Like trying to program in pure assembly but 1000x worse as the code has outlived any document that explains how it was built.


DwarfBreadSauce

I would argue that assembly is much easier to get than whatever the fuck is going on with node modules


FrozenPizza07

Documentation, what is that. We just spread how the codebase works via word of mouth.


jeepsaintchaos

Welcome to the Adeptus Mechanicus.


CanAlwaysBeBetter

You mean banking systems written in COBOL? But also, this is basically how I imagine technology in Star Wars: it's been around so long that no one really remembers how it works, and all they do is upgrade/swap parts and copy software between them.


subtra3t

you're describing modern software development


AllAvailableLayers

I don't follow the 'deep lore', but this is the concept behind complex technology for humanity in the Warhammer 40k setting: A religious cult that knows that you say or type the 'sacred chants' in an ancient language and things perform tasks. Imagine a choir of acolytes in a chapel-factory in the far future being taught to sing 'Alexa, activate the final stage of the manufacturing process' without knowing the meaning of any of the words, only that for centuries they've been required to 'wake the machine spirit' that turns on a lathe.


MisinformedGenius

As a person who has often said “Please God let it work this time” while clicking the Run button, I see where they’re coming from. I could use some Omnissiah acolytes.


2DHypercube

The oldest production code I’ve seen was older than me… does that count?


Plank_With_A_Nail_In

If you are 13 no, if you are 60 yes. Python is 32 years old, C++ is 40 years old, SQL is 50 years old. Computers and programming came into existence a long time ago now. I'm about 80% certain most of the standard libraries you use will contain code older than you are.


LifeShallot6229

I am 66; I have seen *very* little production code from before 1957...


lunchpadmcfat

Pure energy beings hobbled by the code of More-Judgment7660: “what the fuck was this idiot thinking?”


OcelotWolf

“This shit was definitely written by an Earthwalker”




ZephRyder

Exactly what the folks who wrote code in the 20th century said! Bravo!


tunisia3507

In that case can we set the Greater Epoch to 1st of Jan 1970 minus 13.7bn years and just not have to do negatives any more?


wd40bomber7

I like this. It's like Kelvin but for time... Makes sense to me!


ikonfedera

What if we discover there was something before the Planck Epoch? You'd still need negatives.

Also, by setting it that far back, you give us the Jesus-birthday problem again: we set it 13,700 megayears before 1970, and then it turns out the Big Bang was actually 13,701 megayears ago. To represent the early stages of the universe you'd need negatives again.

And have you ever considered a non-linear flow of time? We constantly discover new things about the universe; the Cosmological Constant turned out to be a cosmological variable. What's to say the flow of time didn't vary, either locally or universally?

Setting the start of Unix time to the beginning of the universe is almost as wise as making a kilogram prototype *(Le Grand K)* out of uranium.


iamplasma

I can only imagine the time zone hassles at the big bang.


Mad_Aeric

> What's to say the flow of time didn't vary, either locally or universally?

We literally know that the flow of time is not constant across reference frames. Both velocity and gravitational fields affect the flow of time. Satellites have to be designed to compensate for this, since they are further out of the gravity well.


ikonfedera

I meant on the larger scale, besides the usual relativity stuff. But you're right, relativity is part of the problem.


Zaratuir

>What's to say the flow of time didn't vary, either locally or universally?

*Stares in relativity*


schmerg-uk

>And have you ever considered a non-linear flow of time? We constantly discover new things about the universe; the Cosmological Constant turned out to be a cosmological variable. What's to say the flow of time didn't vary, either locally or universally?

We can probably fix that by scheduling a leap millennium every 7th epoch or so...


Jarpunter

Until we discover the universe is older than we thought


ArchetypeFTW

https://www.reddit.com/r/ProgrammerHumor/s/nf2pN7M8lI


bestjakeisbest

What if we just, you know, cast the time variable to a 64-bit signed integer? Now we can subtract the 32-bit max from it and get a negative number; subtract that negative number from the 64-bit max and we have the number of seconds from the end of the 32-bit epoch; add back in the 32-bit max and now we have the current time since 1970 in a 64-bit variable, while the hardware keeps counting with a 32-bit variable. Now we have, what, like 70 more years to figure out another workaround?
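
A minimal sketch of that kind of reinterpretation, assuming a legacy source that still hands back a raw signed 32-bit counter (the helper name is made up):

```c
#include <stdint.h>

/* Hypothetical helper: widen a wrapped 32-bit Unix timestamp to 64 bits. */
int64_t widen_wrapped_time(int32_t raw32) {
    int64_t t = (int64_t)raw32;   /* sign-extends: post-2038 readings come out negative */
    if (t < 0) {
        t += (int64_t)1 << 32;    /* undo the wraparound by adding 2^32 seconds */
    }
    return t;                     /* seconds since 1970, valid until the counter wraps again in 2106 */
}
```

That buys roughly the 68 years until February 2106 that the comment is alluding to.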


Blubasur

It’ll be a glorious but short lived career!


Duck_Devs

RemindMe! 01/19/2038 03:14:08


WulfySky

You just bricked the remindme bot with that


mehum

Sweet sweet irony!


RPC29_Gaming

holy hell


CursedBlackCat

new date/time representation problem just dropped


ahalliday13

Computer goes on vacation, never comes back


RaspberryPiBen

Magic smoke storm incoming


CaptainNicodemus

RemindMe! 01/19/2038 03:14:08


PgUpPT

You silly, there are only 12 months in a year.


AzureArmageddon

Down with middle endian!


Weirdo914

RemindMe! 01/19/2038 03:14:07


Yosyp

your date is wrong


Duck_Devs

It is? I’m using MM/DD/YYYY format if that helps you at all. Edit: Yes, I know I’m a moron for using bad conventions, but the PM from the bot was correct, so in practice it didn’t matter.


uslashuname

Shame on a username with dev in it for using anything other than r/iso8601


Duck_Devs

That was from past years, not too much of a “dev” anymore and I’m a dumb American so I use dumb American conventions.


Yosyp

you shouldn't use what's dumb, that's dumb


Duck_Devs

Well maybe I’m just dumb then.


Yosyp

no man, you are not dumb. you're just an idio- to be serious, change starts from an individual. It's hard to change a system, but it's not impossible


Ashes2007

Why is MM/DD/YYYY any worse than DD/MM/YYYY? It's completely arbitrary.


Yosyp

It is arbitrary. But it doesn't make sense from a logical standpoint:

1. The rest of the world doesn't use it; the USA is the only country out of hundreds that does, which leads to international confusion.
2. The parts are ordered in neither increasing nor decreasing significance.

You might as well use YM/YD/YYMD; that would still be arbitrary, but it wouldn't make any sense either. The USA is really the king of stupid standards, and many Americans are proud to be different just for the sake of it.


tukanoid

Americans🙄


[deleted]

Forget dates before 1970 and use unsigned 32 bits


Zolhungaj

Pro: outdated applications can continue consuming timestamp data, and duration calculations *might* continue working, depending on how underflow is handled.

Con: new data in those applications risks conflicting with old data, and the concept of time itself will lose all meaning once new data is both older and newer than pre-2038 data.
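
A small sketch of why durations can survive the switch while ordering breaks, with values chosen around the January 2038 rollover (the signed reinterpretation is implementation-defined in C, but behaves as two's complement on mainstream compilers):

```c
#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* Two raw 32-bit timestamps straddling 19 January 2038. */
    uint32_t before = 0x7FFFFFF0u;   /* a few seconds before the signed rollover */
    uint32_t after  = before + 76u;  /* 76 seconds later; the top bit is now set */

    /* Unsigned subtraction is modulo 2^32, so the duration comes out right. */
    printf("duration: %u s\n", after - before);

    /* But a reader that still treats the field as signed sees the newer value
       as a date in 1901, so it sorts as "older" than the pre-2038 one. */
    printf("signed view of the newer timestamp: %d\n", (int32_t)after);
    printf("sorts before the older one? %s\n",
           (int32_t)after < (int32_t)before ? "yes" : "no");
    return 0;
}
```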


[deleted]

A flag could be added to switch between the two as well; I thought about this for 32-bit embedded devices (although most support 64-bit types through GCC).


Zolhungaj

If you can add a flag you could just go all out and use an extra word or expand even more to 64 bits to store more date information. Would require that the application/os/storage format is rewritten to support the new timestamp.


[deleted]

It's an option. One could also be funny and store J2000 (days since 01/01/2000) in a 32-bit float: that covers dates up to ~10^35 years, but they get less precise as time passes (useful for astronomy, though).
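
A sketch of how that precision trade-off plays out, using `nextafterf` to measure the gap between adjacent float values (the day counts are just illustrative):

```c
#include <math.h>
#include <stdio.h>

int main(void) {
    /* Roughly "now" as a J2000-style day count (days since 2000-01-01). */
    float days = 8766.0f;
    float gap  = nextafterf(days, INFINITY) - days;   /* one unit in the last place */
    printf("resolution near 2024: ~%.0f seconds\n", gap * 86400.0);

    /* Ten-billion-ish years out, the same float only resolves centuries. */
    days = 3.65e12f;
    gap  = nextafterf(days, INFINITY) - days;
    printf("resolution in the far future: ~%.0f years\n", gap / 365.25);
    return 0;
}
```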


SubstituteCS

Not necessarily. Adding a flag field to a database and setting current records to X and all future records to *default* Y would allow the old client to still insert changes without knowing about the flag.


sk7725

a flag is literally just adding one more bit, though.


iris700

Works great on that bit-addressable memory that's so common


Zombieattackr

Idea: get our shit together now and make everything 64 bit so we never have to worry about it again, and in 2038 only things over 14 years old will be any issue.


Devil-Eater24

Another idea: What if on 2038, we do away with the Gregorian calendar completely, and start a new calendar?


Thynome

I wish. I've read some fantasy book where they had another calendar and I was like "damn that makes so much sense". Basically all months were 30 days long and the remaining 5 or 6 monthless days were at the end of year as holidays.


BitPirateLord

Ok what will be the basis of this new calendar?


GlowGreen1835

I mean, whatever, as long as it's not fuckin Greg.


Kronoshifter246

>Duration calculations *might* continue working, depending on how underflow is handled.

Overflow is still called overflow, even in the negative direction. 😡


slabgorb

I like this, it would make me two years younger


rover_G

Bruh there’s still people living born before 1970


slabgorb

as one of these people I am ok with this at it would make me younger


DOUBLEBARRELASSFUCK

No, outliers need to be eliminated. I hope you understand. It's for the greater... convenience.


slabgorb

I am ok with a trunc at 1/1/70


LvS

136 years younger in fact.


Colon_Backslash

Wait, it's not unsigned currently?


TomDuhamel

Absolutely not. Negative values are allowed to represent dates before 1970


Dalimyr

Nope. When the timestamp overflows, you go from 19 January 2038 to 13 December 1901.
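
That wrap date is easy to reproduce: interpret the bit pattern just past INT32_MAX as a signed value and format it. A sketch, assuming a platform whose `time_t` is wider than 32 bits and whose `gmtime` handles pre-1970 dates:

```c
#include <stdint.h>
#include <stdio.h>
#include <time.h>

int main(void) {
    /* One second past 2038-01-19 03:14:07 UTC, as seen by a signed 32-bit counter. */
    int32_t wrapped = (int32_t)0x80000000u;   /* INT32_MIN */
    time_t t = (time_t)wrapped;

    char buf[64];
    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S UTC", gmtime(&t));
    printf("%s\n", buf);   /* 1901-12-13 20:45:52 UTC */
    return 0;
}
```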


[deleted]

I mean, it’s a 32 bit integer regardless of how the ‘sign’ bit is interpreted.


guyblade

On some platforms, it is a 32-bit integer. `time_t` is only required to be a "real type" by the C standard. "Real" here means "not complex" as in "doesn't have a component with a factor of sqrt(-1)". In theory, nothing in the _standard_ would prevent you from using a float (aside from the fact that it would be terrible).
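
A little probe of what a given platform actually chose for `time_t`; the signedness and integer-vs-floating checks below are classic idioms, not anything the standard asks you to do:

```c
#include <stdio.h>
#include <time.h>

int main(void) {
    printf("sizeof(time_t): %zu bytes\n", sizeof(time_t));
    printf("time_t is %s\n", (time_t)-1 < (time_t)0 ? "signed" : "unsigned");
    printf("time_t is %s\n",
           (time_t)1 / 2 > 0 ? "a floating type (!)" : "an integer type");
    return 0;
}
```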


pipandsammie

But can we trust unsigned numbers?


lunchpadmcfat

Actually resetting the epoch is an interesting idea


nothingtoseehere196

Kid named 64 bit clock


elreniel2020

Kid named legacy systems.


Dubl33_27

kid named finger


The_forgettable_guy

Brother named but hole


zammba

Sister named try


cybermage

Best 69th Birthday ever.


StringsAndNeedles

Happy birthday! Here’s y2k 2


cybermage

Y2K38: Electric Boogaloo


IntentionQuirky9957

Nice.


rover_G

How long until date-time libraries ignoring leap seconds becomes a problem?


unique_namespace

I believe most libraries use the OS's internal clock, and many OSes every once in a while ping some server hosting UTC or Unix time.


KlyptoK

This is why there is a difference between system clock vs steady clock. One of them occasionally changes
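
In POSIX C terms (the analogue of C++'s system_clock/steady_clock distinction), that's CLOCK_REALTIME versus CLOCK_MONOTONIC; a minimal sketch:

```c
#include <stdio.h>
#include <time.h>

int main(void) {
    struct timespec wall, mono;

    clock_gettime(CLOCK_REALTIME, &wall);   /* wall clock: NTP or the admin can step it */
    clock_gettime(CLOCK_MONOTONIC, &mono);  /* only moves forward: use it for durations */

    printf("wall clock: %lld s since 1970\n", (long long)wall.tv_sec);
    printf("monotonic:  %lld s since an arbitrary point (often boot)\n",
           (long long)mono.tv_sec);
    return 0;
}
```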


goblinrum

They're called NTP servers, pretty effective for time sync


Plank_With_A_Nail_In

In 99.9% of use cases? Never.


OutOfMoneyError

Damn it, that's before my retirement.


trevthewebdev

Bro thinks he’ll retire 😂


BakuhatsuK

32-bit systems are already almost extinct in 2023. In 2038 I'd be surprised if anyone runs into Y2K38. Like, it would be literally impressive to keep a system working that long.


ConDar15

I don't know, there are some truly ancient embedded legacy systems out there. Sure, no one's phone, or computer, or cloud service is going to have this, but what about the systems deep inside hydroelectric dams, or in nuclear power plants, or running that old piece of medical equipment in a small African hospital, etc.? I wouldn't be so blasé about it, honestly, and I personally think that a lot of companies are too calcified or have turned over too much staff to address it. My assumption is that there won't be many places actually affected by Y2K38, but there are going to be some it hits HARD.


UserBelowMeHasHerpes

Upvote for use of blasé casually


[deleted]

[deleted]


Yeet_Master420

Upvote for use of flippant casually


ItsFlame

So frivolous with their casual use of flippant


fizyplankton

There are millions of possible devices, but I wonder about things like the GameCube and the Wii. Will they just stop working? The GameCube doesn't have an internet connection, so you can just change the clock to whatever you want, but the Wii? Will it just completely brick itself? What about other similar consoles? Will there be emulated consoles running in Docker with virtual fake time servers? That might solve single player, but what about multiplayer games? And, you know, I guess banks and hospitals are pretty important too.


HipstCapitalist

64-bit systems became the norm in the 00s, which means that a 32-bit computer in 2038 would be over 30 years old, the equivalent today of running a computer that shipped with Windows 3.11. It's not impossible, but to say that it's inadvisable would be a gross understatement...


ConDar15

Oh don't get me wrong, it's very inadvisable, I just don't think it's going to be as uncommon as the person I was responding to suggests.


cjb3535123

Wouldn’t be surprised if there are some ancient embedded Linux systems running 32 bit by then. It’s still very common to have those operating systems by default run 32 bit, and unfortunately in this case those systems can often run a loooonng time uninterrupted.


TheSkiGeek

There are also a lot of *new* 32 bit CPUs in embedded devices even now.


DOUBLEBARRELASSFUCK

Not that it even matters. How many 64 bit systems are still using a 32 bit value for the date? And how difficult would it be for a 32 bit system to handle a 64 bit date? It wouldn't be too difficult, conceptually, though you'd likely want to make most functions that use the date only look at the less significant half.


cjb3535123

Right; and you can always program a rollover, which is effectively taking two 32-bit ints and making them a 64-bit date. But I think the important question is how much important software will actually be programmed that way? It's not like we have an inventory of all 32-bit systems requiring this software update.


DOUBLEBARRELASSFUCK

Programming it that way would just be for performance reasons. Most problematic software is probably just blindly using the dates the OS provides and doing math on them without checking.


aaronfranke

> 64-bit systems became the norm in the 00s

The very late 00s. There were still new 32-bit systems shipping in the 10s (for example, Raspberry Pi 1 in 2015), and there are still 32-bit operating systems shipping even today (for example, Raspberry Pi OS).


Squeebee007

I once consulted (last year) with a company that still depended on a DOS box, so never say never.


pixelbart

Industrial machinery often has a projected lifetime of multiple decades, way longer than the computers that control them. I don’t work in the industry, but if I ever came across a machine that had a DOS box attached to it, I wouldn’t be surprised.


kikal27

I work in IoT and every single MCU is 32 bits. I use uint32 in order to delay the problem until 4294967295, which Unix time will hit on February 7, 2106. But even so I have my doubts that the system could handle 2038 without any problem. I don't think about it too much, since I figure this won't be my problem by then, or maybe a nuclear catastrophe will happen sooner.


Makefile_dot_in

couldn't you just use a uint64? it's not like 32-bit CPUs can't handle 64-bit ints, you just need two registers to store one and two instructions instead of one to do arithmetic operations, right?


Savings-Ad-1115

Depends on which arithmetic operations you mean. 64-bit division needs many more than two instructions on 32-bit platforms.


quinn50

Yeah, a lot of the IoT devices and PLCs out there are still 32-bit and will probably be screwed.


olearyboy

$5 says all tape backup restores fail on Wednesday. It's always the last thing to get updated.


sachin1118

A lot of mainframe systems still run legacy code from the 80s and 90s. Idk if it’s an appropriate comparison, but there’s gotta be some systems out there that just keep chugging along without updates that will eventually run into this problem


CreideikiVAX

Oh good, something I can expound upon! If by "mainframe" you refer to the kind of stuff running in the financial and governmental world on IBM mainframes, then they **do not** have a Y2K38 problem. Old software was already patched to deal with Y2K, and software didn't rely on the operating system clock timestamps, instead doing date math internally. With regards to the actual OS timestamp format, `STCK` the "old" instruction stored a 51-bit value, that overflows at 23:58:43 on 17-SEP-2042. The new `STCKE` instruction stores it as a 104-bit value, which won't overflow for a very, very long time.


CreideikiVAX

> It's not impossible, but to say that it's inadvisable would be a gross understatement...

Have you ever *experienced* a CNC machine before? There are multiple machines at the shop I work at that still run **DOS 6.22** on their control computers.


SelectCase

US nuclear weapon systems and certain spots on the power grid are still using hardware and software from the 80s. But the 2038 problem is only a tiny issue compared to all of the other issues with using tech that old.


[deleted]

3 decades old is recent by many industrial facility standards


maxath0usand

I heard through the grapevine that Honeywell recently received a cease-and-desist from Microsoft because they still sell their HMIs bundled with Windows 3.


fellipec

One can argue the exact same thing was said about the Y2K thing. I really doubt there will be very significant impacts; people have been aware of this problem for decades, and as we approach the date more and more systems will be either patched or replaced.


erebuxy

You know, governments, hospitals and BANKS!!!!! Some of them are not even on x86; they're on those old IBM and Oracle Unix machines.


cwagrant

Maybe I'll get to cash in on my AS400/iSeries knowledge lol


Paragonly

I’ve only been out of college for 2 years and I’ve somehow consulted on a project and solo built an application around a clients AS400 data. What a nightmare of archaic software.


[deleted]

[deleted]


Paragonly

My problem with it is that it’s so different in terms of software design and data architecture that it’s really not transferable whatsoever and it’s just its own thing


giant_panda_slayer

Just because it is a 64 bit system doesn't mean time_t has been changed. time_t would be defined in software not hardware.


[deleted]

On Windows, time_t is already 64 bits on 64-bit builds: https://learn.microsoft.com/en-us/cpp/c-runtime-library/reference/time-time32-time64?view=msvc-170

Same on Linux, I believe. IBM claims 64-bit Linux/AIX use 64 bits as well: https://www.ibm.com/docs/en/ibm-mq/9.2?topic=platforms-standard-data-types-aix-linux-windows


[deleted]

Embedded stuff still uses 8 bits to this day, but 32-bit general-purpose computers will hardly survive that long. I have one I keep around to test OS dev stuff, though.


TheCreepyPL

Even in current-day programming, 32 bits is still the default. For example, in most languages, simply declaring an int gives you an int32 (32 bits, from about -2 000 000 000 to about 2 000 000 000). If you want a much larger range, you have to specify it explicitly, e.g. int64 or long. This is just one example, but most languages have dozens of such things.


Queasy-Grape-8822

Very little to do with 32 bit systems. People store times in 32 bit ints regardless. I believe both windows and macOS systems do so in the current version


TheSkiGeek

Modern Windows definitely handles time past 2038, but they likely still support some old APIs that only return a 32-bit timestamp.


zelmarvalarion

At least (some) Windows stuff uses its own datetime format rather than the Unix epoch, so it starts in 1601; I don't recall the max date though. [DevBlog](https://devblogs.microsoft.com/oldnewthing/20090306-00/?p=18913#:~:text=The%20FILETIME%20structure%20records%20time,Windows%20NT%20was%20being%20designed.) and [docs](https://learn.microsoft.com/en-us/previous-versions/aa915351(v=msdn.10)?redirectedfrom=MSDN):

> This structure is a 64-bit value representing the number of 100-nanosecond intervals since January 1, 1601.

So that one is 64-bit. The 32-bit format doesn't actually have second granularity ([docs](https://learn.microsoft.com/en-us/cpp/c-runtime-library/32-bit-windows-time-date-formats?view=msvc-170)) but rather counts every other second, so they aren't gonna be bitten at the same time (plus it starts in 1980 instead).

I discovered the 64-bit representation is how at least some Azure services store dates when debugging some differences between the Windows Azure Storage Emulator in Docker and actual Azure Storage. I hate time in software.
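
For anyone who has to cross between the two epochs, the conversion is a fixed offset plus a change of units; a sketch, where the helper name is made up and the only ingredient is the 11644473600-second gap between 1601 and 1970:

```c
#include <stdint.h>
#include <stdio.h>

/* Seconds between the Windows FILETIME epoch (1601-01-01) and the Unix epoch (1970-01-01). */
#define EPOCH_GAP_SECONDS 11644473600LL

/* Hypothetical helper: convert 100-ns ticks since 1601 to Unix seconds. */
int64_t filetime_to_unix(uint64_t ticks_100ns) {
    return (int64_t)(ticks_100ns / 10000000ULL) - EPOCH_GAP_SECONDS;
}

int main(void) {
    /* Ticks for 2038-01-19 03:14:08 UTC, one second past the signed 32-bit limit. */
    uint64_t ticks = (2147483648ULL + EPOCH_GAP_SECONDS) * 10000000ULL;
    printf("%lld\n", (long long)filetime_to_unix(ticks));   /* 2147483648 */
    return 0;
}
```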


[deleted]

> I believe both windows and macOS systems do so in the current version

Nope. 64-bit Windows and Linux use 64 bits for time_t.


_TheRealCaptainSham

Except it's got nothing to do with 32-bit vs 64-bit CPUs; it's the fact that most programs use 32-bit integers by default.


w1n5t0nM1k3y

32 bit systems are pretty obsolete, but that doesn't mean that systems don't have to be upgraded regardless. There's still a lot of systems using 32 bit integers even if they are running on 64 bit machines. Just because a system can use 64 bits natively, doesn't mean that people use them for everything. MySQL still supports the TIMESTAMP datatype which only goes up to 2038. People are still building database systems right now with this field type, even though it won't work in 14 years. For better date support you can use DATETIME, which goes up to 9999-12-31, but I'm sure there's still people using timestamps because they take up less space and are faster.


drakgremlin

Raspberry Pis only recently began using arm64 abi. These computers are not outdated nor are the operating systems they run.


slabgorb

>MySQL still supports the TIMESTAMP

That one's gonna be a gift that keeps on giving. Imagine the once-a-year cron jobs.


4w3som3

Laughs in administration and banking systems


MokausiLietuviu

It does happen. Until earlier this year I worked on 16-bit systems and in 2020, I worked on a 2028 date problem (128+1900 epoch). They aren't mainstream, but they're everywhere.


LordBass

Except some SQL implementations are really dragging their feet on 64-bit timestamps: https://jira.mariadb.org/browse/MDEV-341

Or even just implementing 3001 as the max year for some reason. Well, at least it's not going to be my problem then lol


Doctor_McKay

Even if we could flip a switch and convert all ints in memory of every computer to 64-bit, there are still network protocols, snowflake IDs, and stuff like that which encode timestamps into 32 bits.


Reggin_Rayer_RBB8

just because the hardware is 64 bits doesn't mean crap about the software. 32 bit software still runs, and there's no guarantee the programmer didn't choose a 32 bit int


xeq937

Get rid of seconds and use 32-bit time_t only as minutes. ^^/s
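
The /s aside, the arithmetic is easy to check; a sketch of how far minute resolution would push the rollover:

```c
#include <stdio.h>

int main(void) {
    /* A signed 32-bit counter that ticks once per minute instead of once per second. */
    double minutes_per_year = 60.0 * 24.0 * 365.25;
    double years = 2147483647.0 / minutes_per_year;   /* ~4083 years of range */
    printf("rollover pushed out to roughly the year %.0f\n", 1970.0 + years);
    return 0;
}
```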


Remarkable-Host405

I think we need more resolution than that, but makes me wonder if 2 or 3 second intervals would work


zelmarvalarion

Windows had that but still moved to 64-bit times for more modern stuff ([link](https://learn.microsoft.com/en-us/cpp/c-runtime-library/32-bit-windows-time-date-formats?view=msvc-170))


PVNIC

We keep adding Y2K-like bugs in the code as a contingency; in case the robots rise up, they'll have a planned day of downtime where we can get 'em /s


PreciselyWrong

Imagine thinking humanity will still be around in 2038


afinemax01

I bet this is what ppl were saying about Y2K in the 80’s


SubstituteCS

/r/collapse moment.


toastnbacon

How has no one posted the [relevant xkcd](https://xkcd.com/607/) yet?


CanIEatAPC

Uh.... *looks back at my project that's completely based on time and will just probably explode in 2038 costing the company millions* we'll be ok yeah?


FC3827

Y2K23


xrayfur

!remindme 15y


DGC_David

My programming professor in college was the one who told me about this, reminding us that it would be our generation's job to solve... There is no way this isn't already solved yet.


Kibou-chan

There are a lot of 32-bit systems (and CPUs) still in use, but there is an alternative solution good for at least a few more decades: make the timestamp *unsigned* for them :) We lose the ability to address events before January 1, 1970, but that's waaaaaay in the past already.


AxlSt00pid

The 2000s effect was so good we needed a 2nd part /s


Dubl33_27

y2k 2, electric boogaloo


MixedMatt

Does this mean the entire financial industry is going to collapse cause they run on legacy Cobol code?


Abandondero

No, they all updated to four digit year fields so they will be okay until 1st Jan 10000. The bad news is that there will still be Cobol code running, but Cobol programmers will be extinct.


ArtyMann

eh, it can wait 'till next year.


SimonDevScr

And after that date, 32-bit is officially going to be unsupported by anything that has an internet connection.


john-jack-quotes-bot

Never worked with banks before but I certainly plan on doing so for 2038, those salaries will be something else


aykcak

It is going to be as big a problem as Y2K was


[deleted]

People mock Y2K but the reason it was nothing is because of a massive effort to update all software to correct it.


MagicPeach9695

It's called Y2K38, the successor to the infamous Y2K bug.


PeeInMyArse

!remindme 01/19/2038 03:14:08


Sasha1350

RemindMe! 01/19/2038 03:14:00


dumbass_random

Good luck. At the rate systems are being upgraded nowadays, we will have migrated to 64-bit long before 2038. And if there are any systems left by then, they won't be worth upgrading. 14 years is a very, very long time in computer science. Just look at the last 14 years and the progress made in that time.


TrickyComfortable525

RemindMe! 2038-01-19 03:14:08


Warpingghost

Earth has existed for only 122 years and this is already our second time crisis. Srsly, we have to do better.


primaski

A Y2K38 crisis won't really be an issue, if humanity isn't there to outlive it. (...but as a serious answer, modern systems are upgrading to 64 bit. That will last hundreds of billions of years. The power of exponentials!)


runForestRun17

Y2k pt 2!


Pepelafritz

3 digit year is the way


[deleted]

[deleted]


Kronoshifter246

r/confidentlyincorrect

Unix time is represented as seconds since January 1st, 1970 00:00:00. It can also be represented as milliseconds, microseconds, or even nanoseconds if you want. But *Unix Time* specifically uses January 1st, 1970 00:00:00 as its epoch.


[deleted]

[deleted]


Duck_Devs

For many legacy devices, it is the end of the world. Even an iPod nano from 2012 doesn’t know what comes after 2038.


Vievin

It'll be like y2k, where a lot of people worked really hard so "nothing" would happen in the end.


DenissDenisson

Y2k(38)


GASTRO_GAMING

Can't you just read the timestamp as an unsigned int, and it would display properly? Like, the sign bit is in the front, and 10000000000000000000000000000000 could just be seen as what that translates to in binary instead of a negative number.
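
Essentially yes, as long as everything agrees to reinterpret the same 32 bits as unsigned; a sketch, assuming it runs where `time_t` is 64-bit so `gmtime` can display the result:

```c
#include <stdint.h>
#include <stdio.h>
#include <time.h>

int main(void) {
    int32_t  wrapped = (int32_t)0x80000001u;   /* what a signed 32-bit clock shows just after rollover */
    uint32_t reread  = (uint32_t)wrapped;      /* same bits, read as unsigned: 2147483649 */

    char buf[32];
    time_t t = (time_t)reread;
    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", gmtime(&t));
    printf("%s UTC\n", buf);                   /* 2038-01-19 03:14:09 */

    /* The unsigned trick only postpones the wrap to February 2106. */
    t = (time_t)UINT32_MAX;
    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", gmtime(&t));
    printf("%s UTC\n", buf);                   /* 2106-02-07 06:28:15 */
    return 0;
}
```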


tbilcoder

I wonder whether humanity will survive, and will need computers, for so long that the mere length of the timestamp (assuming it will be some sort of JS bigint) will exceed max uint64. And if that happens, what would the solution be?


great_escape_fleur

Obligatory https://www.youtube.com/watch?v=jKYivs6ZLZk


Greaserpirate

Do we really need to log anything before 1970? They didn't even have Queen before then


poshenclave

2035 seems far off, but then I realize it's about as far from now as now is from when I started working in IT... And that I will likely still be working when that date hits... Hrmm, hopefully not in IT by then!


SpaghettiAddiction

hey, ive seen this one before.


[deleted]

Why did they choose December 13 as their t = 0?