• scarecrw@lemmy.one
    2 months ago

    Why restrict to 54-bit signed integers? Is there some common language I’m not thinking of that has this as its limit?

    Edit: Found it myself, it’s the range where you can store an integer in a double-precision float without error. I suppose that makes sense for maximum compatibility, but feels gross if we’re already identifying value types. I don’t come from a web-dev/JS background, though, so maybe it makes more sense there.
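
    The cutoff described in the edit is easy to verify directly in JavaScript (a quick sketch, runnable in plain Node or a browser console, no libraries assumed):

    ```javascript
    // Number.MAX_SAFE_INTEGER is 2^53 - 1: the largest integer n such that
    // both n and n + 1 are exactly representable as a double.
    console.log(Number.MAX_SAFE_INTEGER);                 // 9007199254740991
    console.log(Number.MAX_SAFE_INTEGER === 2 ** 53 - 1); // true

    // One past the limit, adjacent integers collapse onto the same double:
    console.log(2 ** 53 === 2 ** 53 + 1);                 // true
    console.log(Number.isSafeInteger(2 ** 53));           // false
    ```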

    • lysdexic@programming.devOP
      2 months ago

      > Why restrict to 54-bit signed integers?

      Because number is a double, and IEEE 754 gives double-precision numbers a 53-bit significand (52 stored bits plus one implicit bit) alongside the sign bit.

      Meaning, it’s the widest range of integers that a double-precision value can express exactly.

      > I suppose that makes sense for maximum compatibility, but feels gross if we’re already identifying value types.

      It’s not about compatibility. It’s because JSON only has a single number type, which covers both floating-point values and integers, and number is implemented as a double-precision value. If you have to express integers with a double-precision type, once you go beyond 53 bits you start to lose precision, which goes completely against the notion of an integer.
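
      The precision loss above shows up the moment a large integer round-trips through JSON in JavaScript, since `JSON.parse` maps every number to a double (a quick sketch, runnable in plain Node or a browser console):

      ```javascript
      // 2^53 + 1 cannot be represented as a double, so parsing rounds it
      // down to 2^53 — two distinct integers become the same value.
      const big = "9007199254740993"; // 2^53 + 1
      console.log(JSON.parse(big));   // 9007199254740992

      console.log(JSON.parse(big) === JSON.parse("9007199254740992")); // true

      // BigInt preserves the exact value, but JSON.stringify rejects it
      // by default: JSON.stringify(9007199254740993n) throws a TypeError.
      ```

      This is why APIs that need true 64-bit identifiers (e.g. database keys) often serialize them as JSON strings instead of numbers.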

    • lolcatnip@reddthat.com
      2 months ago

      I don’t think you realize just how much code is written in JavaScript these days.