Meme transcription:

Panel 1: Bilbo Baggins ponders, “After all… why should I care about the difference between int and String?”

Panel 2: Bilbo Baggins is revealed to be an API developer. He continues, “JSON is always String, anyways…”

[-] Rednax@lemmy.world 9 points 6 months ago

The worst thing is: you can't even put an int in a JSON file. Only doubles. For most people that is fine, since a double can function as a 32-bit int. But not when you are using 64-bit identifiers or timestamps.
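(The precision cliff the comment describes is easy to demonstrate from a JS console — a minimal sketch:)

```javascript
// A double represents every 32-bit int exactly (in fact every integer up
// to 2^53), but 64-bit identifiers fall off the precision cliff:
console.log(Number.isSafeInteger(2 ** 31 - 1)); // true — 32-bit max is fine
console.log(Number.isSafeInteger(2 ** 53));     // false

// 2^53 + 1 silently rounds down to 2^53 during parsing:
console.log(JSON.parse('{"id": 9007199254740993}').id); // 9007199254740992
```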

[-] firelizzard@programming.dev 34 points 6 months ago

That’s an artifact of JavaScript, not JSON. The JSON spec states that numbers are a sequence of digits with up to one decimal point. Implementations are not obligated to decode numbers as floating point. Go will happily decode into a 64-bit int, or into an arbitrary precision number.
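(The same distinction is visible even from JS itself: the JSON text carries the full digits, and it is the decoding step that picks a representation — a sketch:)

```javascript
// The raw JSON text preserves every digit; a decoder that keeps the digits
// can hand them to a lossless type instead of a double.
const raw = "9223372036854775807"; // max int64, straight from a JSON document
console.log(BigInt(raw));  // 9223372036854775807n — exact
console.log(Number(raw));  // 9223372036854775808 — rounded by IEEE-754
```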

[-] Aux@lemmy.world -3 points 6 months ago

What that means is that you cannot rely on numbers in JSON. Just use strings.

[-] JackbyDev@programming.dev 1 points 6 months ago

Unless you're dealing with some insanely flexible schema, you should be able to know what kind of number (int, double, and so on) a field should contain when deserializing a number field in JSON. Using a string does not provide any benefits here unless there's some big in your deserialization process.

[-] Aux@lemmy.world 1 points 5 months ago

What's the point of your schema if the receiving end is JavaScript, for example? You can convert a string to BigNumber, but you'll get wrong data if you're sending a number.
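(A minimal sketch of the string workaround being argued for here:)

```javascript
// Shipping the id as a JSON string side-steps double rounding entirely;
// the receiving JS converts it to BigInt and keeps every digit.
const parsed = JSON.parse('{"id": "9223372036854775807"}');
console.log(BigInt(parsed.id)); // 9223372036854775807n

// Whereas the same value sent as a bare number arrives already rounded:
console.log(JSON.parse('{"id": 9223372036854775807}').id); // 9223372036854775808
```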

[-] JackbyDev@programming.dev 1 points 5 months ago

I'm not following your point so I think I might be misunderstanding it. If the types of numbers you want to express are literally incapable of being expressed using JSON numbers then yes, you should absolutely use string (or maybe even an object of multiple fields).

[-] sukhmel@programming.dev 1 points 5 months ago* (last edited 5 months ago)

The point is that everything is expressible as JSON numbers; it's when those numbers are read by JS that there's an issue

[-] JackbyDev@programming.dev 1 points 5 months ago

Can you give a specific example? Might help me understand your point.

[-] sukhmel@programming.dev 1 points 5 months ago

I'm not sure what a good example would be; my point was that the spec and the RFC are very abstract and never mention any limitations on the number content. Of course the implementations in a given language will be more limited than that, and if the limitations differ, it creates a dissimilar experience for the user, like this: Why does JSON.parse corrupt large numbers and how to solve this

[-] JackbyDev@programming.dev 1 points 5 months ago

This is what I was getting at here https://programming.dev/comment/10849419 (although I had a typo and said big instead of bug). The problem is with the parser in those circumstances, not the serialization format or language.

[-] sukhmel@programming.dev 1 points 5 months ago

I disagree a bit, in that the schema often doesn't specify limits and operates in the JSON standard's terms: it will say that you should get/send a number, but will not usually say at what point it will break.

This is the opposite of what the C language does, being so specific that it is not even Turing complete (in a theoretical sense; in practice it is)

[-] JackbyDev@programming.dev 2 points 5 months ago* (last edited 5 months ago)

Then the problem is the schema being underspecified. Take the classic pet store example. It says that the id is int64. https://petstore3.swagger.io/#/store/placeOrder

If some API is so underspecified that it just says "number" then I'd say the schema is wrong. If your JSON parser has no way of parsing numbers as arbitrary-length number types (like BigDecimal in Java) then that's a problem with your parser.

I don't think the truly extreme edge case of things like C not technically being able to simulate a truly infinite tape in a Turing machine is the sort of thing we need to worry about. I'm sure that if the JSON object you're parsing is some astronomically large series of nested objects, specifications might begin to fall apart too (things like the maximum amount of memory any specific processor can address being finite), but that doesn't mean the format is wrong.

And simply choosing to "use string instead" won't solve any of these crazy hypotheticals.

[-] sukhmel@programming.dev 1 points 5 months ago

Underspecified schema is indeed a problem, but I find it too common to just shrug it off

Also, you're very right that just using strings will not improve the situation 🤝

[-] bleistift2@sopuli.xyz 1 points 5 months ago

What makes you think so?

const bigJSON = '{"gross_gdp": 12345678901234567890}';
JSON.parse(bigJSON, (key, value, context) => {
  if (key === "gross_gdp") {
    // Ignore the value because it has already lost precision
    return BigInt(context.source);
  }
  return value;
});
> {gross_gdp: 12345678901234567890n}

https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/parse

[-] Aux@lemmy.world 0 points 5 months ago

Because no one is using JSON.parse directly. Do you guys even code?

[-] bleistift2@sopuli.xyz 0 points 5 months ago

It’s neither JSON’s nor JavaScript’s fault that you don’t want to make a simple function call to properly deserialize the data.

[-] Aux@lemmy.world 0 points 5 months ago

It's not up to me. Or you.

this post was submitted on 30 Jun 2024
572 points (98.1% liked)

Programmer Humor
