NaN behavior in JS comes from the IEEE 754 floating-point standard, so it should be the same in any language that follows it.
Any two values can compare as less than, equal, greater than, or unordered. The > operation, for example, evaluates to true only if its first operand is greater than its second. Per IEEE 754, "every NaN shall compare unordered with everything, including itself." Because NaN comparisons are unordered, positive comparisons involving NaN always evaluate to false: "Is 5 > NaN? No. Is NaN > 5? No." Same for ==. But negated comparisons are always true as a consequence; since A != B is the same as !(A == B), anything != NaN is true.
One could argue that the spec should have special-cased NaN for the == operator (which, as defined, is true only when the comparison result is "equal", not "unordered"), but it did not.
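For anyone who wants to check it in a console, all four cases are easy to reproduce:

5 > NaN            //=> false
NaN > 5            //=> false
NaN == NaN         //=> false
NaN != NaN         //=> true
Number.isNaN(NaN)  //=> true (the reliable way to test for NaN)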
hajile 5 days ago [-]
The biggest number-comparison oddity IMO is actually negative zero (in my experience, you usually get one by accident, by multiplying a negative number by zero).
0 === -0 //=> true
Object.is(0, -0) //=> false
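And for completeness, here's how one typically appears and how it can be detected without Object.is:

-5 * 0      //=> -0
String(-0)  //=> "0" (it even prints like a plain zero)
1 / -0      //=> -Infinity (the classic detection trick)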
senfiaj 5 days ago [-]
Yeah, you are right. The same thing happens with 0.2 + 0.1, which evaluates to 0.30000000000000004.
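The usual workaround there is an epsilon comparison instead of exact equality:

0.1 + 0.2 === 0.3                            //=> false
Math.abs(0.1 + 0.2 - 0.3) < Number.EPSILON   //=> true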
loige 5 days ago [-]
Thanks for the thoughtful write-up—always interesting to revisit JavaScript's quirks and strengths through someone else's lens.
That said, I think a few of the issues highlighted, while valid, might not be as disruptive in modern practice as they used to be. For instance, the typeof null === "object" behavior is certainly unfortunate, but it's well-understood at this point, and most tooling/linting setups help you avoid tripping over it. Similarly, Array.prototype.sort() sorting lexicographically by default feels odd until you remember JavaScript’s history as a primarily string-focused scripting language. These days, providing a comparator is second nature for most devs working with numeric arrays.
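For anyone who hasn't been bitten by the sort default, it takes one line to see and one line to fix:

[10, 9, 1].sort()                 //=> [1, 10, 9] (compared as strings)
[10, 9, 1].sort((a, b) => a - b)  //=> [1, 9, 10]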
The point about accidentally declaring globals is a good one—but thankfully, with ES modules (where strict mode is on by default) and tools like TypeScript, it's much harder to make that mistake in modern codebases.
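The difference is easy to demo:

// In a sloppy-mode script:
total = 1  // no declaration needed; silently creates a global

// In an ES module (strict by default), the same assignment throws:
// ReferenceError: total is not defined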
That said, I really appreciate the focus on JavaScript’s virtues, like closures, async/await, and overall flexibility—those really are part of what makes the language so enjoyable despite its rough edges.
I'm curious—have you built or maintained any large-scale JavaScript or TypeScript codebases recently? If so, what were the biggest pain points you ran into? Do the quirks you mention tend to surface in real-world projects at scale, or are they more theoretical annoyances that pop up in isolated cases?
There are definitely still many challenges in the language, but the ecosystem has come a long way, and it seems like it's still evolving in the right direction. Hopefully JavaScript keeps improving—both in language design and developer ergonomics—without losing what makes it so approachable in the first place.
wruza 5 days ago [-]
What I don't like is the lack of features that would erase whole classes of software built around it. For example, function environments (substituting the global scope with another object for a single invocation) could delete lots of code, and a whole paradigm, from vue/rx-likes.
It also neither supports operator overloading (a good thing in general) nor has a decimal type, which is a must for human numbers like euros and kilograms. Instead it includes the mostly useless BigInt.
It also does nothing to fix the mentioned quirks like sort(). It could, e.g., include cmp(a, b) and propcmp(name[, cmp]) off the shelf so the fix at least wasn't wordy: nums.sort(cmp), objs.sort(propcmp("id")).
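To be fair, those two helpers are trivial in userland today; a sketch using the same hypothetical names (not standard APIs):

const cmp = (a, b) => a < b ? -1 : a > b ? 1 : 0;
const propcmp = (name, c = cmp) => (x, y) => c(x[name], y[name]);

const nums = [10, 9, 1];
const objs = [{ id: 2 }, { id: 10 }, { id: 1 }];
nums.sort(cmp)            //=> [1, 9, 10]
objs.sort(propcmp("id"))  //=> ids in order 1, 2, 10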
hajile 5 days ago [-]
There is a decimal proposal for JS: https://github.com/tc39/proposal-decimal. It almost certainly won't use normal operators, though, because of JIT implementers' complaints about adding BigInt (not to mention the record/tuple proposal recently being dropped over `===`).
Then it's of little use IMO. Library decimals already exist in multiple optimized forms.
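For money in particular you can also skip the dependency and keep integer minor units; a sketch of that common workaround (not a real decimal type):

const cents = 10 + 20;              // 0.10 € + 0.20 €, exact in integers
(cents / 100).toFixed(2) + " €"     //=> "0.30 €" (format only at the edges)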
hajile 4 days ago [-]
It would still use faster JIT mechanisms internally and make strong guarantees, but the interface would suck.
JS really needs to add a "use type" or typed-module feature with strong Hindley-Milner types. That would allow them to cut the rest of the really bad parts, like the bad type coercion. Definite types would also mean that adding new features such as operator overloading to typed modules only would get rid of the performance issues, because the types would be known in advance.
queenkjuul 5 days ago [-]
Can't you `function.bind(context, ...)`? Or am I misunderstanding?
wruza 4 days ago [-]
Bind substitutes `this`, not the global scope. With function envs you could write a function like `function () { a += 1 }` and have it increment `a` in some object.
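The closest you can get today is the long-discouraged `with` statement, and only in sloppy mode; a sketch of the idea, not a recommendation:

// Sloppy-mode only: `with` is a SyntaxError in strict mode / ES modules.
function makeInc(env) {
  with (env) {
    // Created inside `with`, so `a` resolves through env's properties.
    return function () { a += 1; };
  }
}
const env = { a: 1 };
const inc = makeInc(env);
inc();
env.a  //=> 2

Note the function has to be created inside the `with` block; you can't retrofit an existing function's scope chain, which is exactly what a real function-environment feature would allow.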