• frezik@midwest.social · 4 days ago

      A common method of storing dates is as the number of seconds since midnight UTC on Jan 1, 1970 (which was somewhat arbitrarily chosen).

      A 32-bit signed integer can store numbers from −2³¹ through 2³¹ − 1 (the top end loses one because zero takes up one of the non-negative values). 2³¹ − 1 seconds added to Jan 1, 1970 gets you to Jan 19, 2038.
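
      A quick way to see that boundary is to hand the 32-bit maximum to the standard C time routines; a minimal sketch (the value after overflow assumes the usual two's-complement wraparound):

        #include <inttypes.h>
        #include <stdint.h>
        #include <stdio.h>
        #include <time.h>

        int main(void) {
            /* 2^31 - 1: the largest value a signed 32-bit counter can hold */
            int32_t max32 = INT32_MAX;

            /* Read it as seconds since the Unix epoch and format it in UTC */
            time_t t = (time_t)max32;
            char buf[64];
            strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", gmtime(&t));
            printf("INT32_MAX as a date: %s UTC\n", buf); /* 2038-01-19 03:14:07 */

            /* One second more and a 32-bit value wraps to -2^31, which a
               time routine would read back as a date in December 1901 */
            int32_t wrapped = (int32_t)(max32 + 1LL);
            printf("after overflow: %" PRId32 "\n", wrapped); /* -2147483648 */
            return 0;
        }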

      The solution is to jump to 64-bit integers, but as with Y2K, there are a lot of old systems that need to be updated to 64-bit time values (and no, they don’t necessarily need 64-bit CPUs to make that work). For the most part, this has already been done. That puts the rollover out at the year 292,277,026,596 CE, which is orders of magnitude past the time for the sun to turn into a red giant.
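
      As a back-of-the-envelope check on that year, just divide 2⁶³ − 1 seconds by the average Gregorian year:

        #include <stdio.h>

        int main(void) {
            const double max64 = 9223372036854775807.0;      /* 2^63 - 1 seconds */
            const double secs_per_year = 365.2425 * 86400.0; /* Gregorian average */
            /* prints ~292277024627; counting that many years from 1970
               lands around the year 292,277,026,596 CE */
            printf("~%.0f years after 1970\n", max64 / secs_per_year);
            return 0;
        }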

      • gandalf_der_12te@discuss.tchncs.de · 4 days ago

        midnight on Jan 1, 1970 (which was somewhat arbitrarily chosen).

        well, not so much; as far as I remember, the first end-user computers became available around 1971 or 1972, and the internet also underwent some rapid development at that time, so the date has a certain reasoning to it.

    • teije9@lemmy.blahaj.zone · 4 days ago

      Unix computers store time as the number of seconds that have passed since January 1st, 1970. Once there have been too many seconds since 1970, it starts breaking. ‘Signed’ is a way to store negative numbers in binary. The basics of it: when the leftmost bit is a 1, it’s a negative number (and then the rest of the bits are treated in a way that makes the value act like a negative number). So, in an 8-bit example, once the counter reads 01111111 (the largest positive value, 127), one more second turns it into 10000000, which a computer sees as −128.
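
      The same wraparound in a runnable C sketch (the wrap itself relies on the usual two's-complement conversion behavior):

        #include <stdint.h>
        #include <stdio.h>

        int main(void) {
            int8_t counter = 127;            /* binary 01111111, the largest value */
            counter = (int8_t)(counter + 1); /* wraps to 10000000 on two's-complement machines */
            printf("%d\n", counter);         /* prints -128 */
            return 0;
        }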