What is the output of console.log(0.1 + 0.2); and console.log(0.1 + 0.2 == 0.3); in JavaScript?
An educated guess would simply be: it prints "0.3" and "true". But in JavaScript it's different, so let's unravel the mystery.
console.log(0.1 + 0.2)
A JavaScript Number is a 64-bit floating-point value (IEEE 754 double precision) with a limited precision of roughly 16 significant decimal digits. The largest integer that can be represented without loss of precision is +/- 9007199254740992 (+/- 2^53); beyond that, not every integer has an exact representation. Because of this limited precision, round-off errors can occur during calculations. This is easy to demonstrate:
console.log(0.1 + 0.2); // 0.30000000000000004
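The integer limit mentioned above can be checked the same way; Number.MAX_SAFE_INTEGER is the built-in constant for 2^53 - 1, and just above 2^53 adjacent integers collapse into the same value:
console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991 (2^53 - 1)
console.log(9007199254740992 === 9007199254740993); // true -- both round to 2^53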
In most cases, round-off errors don't matter: they have no significant impact on the results. However, they look ugly when displayed to a user. A solution is to limit the displayed precision to just below the actual precision of about 16 digits:
// prevent round-off errors showing up in output (uses the math.js library)
var ans = math.add(0.1, 0.2); // 0.30000000000000004
math.format(ans, {precision: 14}); // '0.3'
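If you are not using math.js, plain JavaScript can do the same trimming with toPrecision or toFixed; a minimal sketch, with the precision value chosen to suit your output:
var sum = 0.1 + 0.2; // 0.30000000000000004
console.log(Number(sum.toPrecision(14))); // 0.3
console.log(sum.toFixed(2)); // "0.30" (a string with fixed decimals)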
Alternatives are to work with fractions, which store a number as an exact numerator and denominator, or with big numbers (arbitrary-precision decimals), instead of the built-in IEEE 754 double-precision floating point; math.js offers both as its Fraction and BigNumber types.
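A minimal sketch of both approaches, assuming math.js is loaded as math (the exact formatting of the output may differ slightly between versions):
var f = math.add(math.fraction('0.1'), math.fraction('0.2'));
console.log(math.format(f)); // '3/10' -- the exact result as a fraction
var b = math.add(math.bignumber('0.1'), math.bignumber('0.2'));
console.log(math.format(b)); // '0.3' -- exact for decimal inputs like these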
console.log(0.1 + 0.2 == 0.3);
As discussed above, all numbers in JavaScript are treated as floating-point values, so comparisons may not always yield the expected result. The code above, for example, prints:
console.log(0.1 + 0.2 == 0.3); // false
// because the left side really evaluates to:
console.log(0.30000000000000004 == 0.3); // false
Thus, you should always be careful when dealing with numbers in JavaScript, and avoid exact equality checks on computed floating-point values.
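One common workaround (a standard pattern, not from the original question) is to compare against a small tolerance instead of using ==; Number.EPSILON is the usual built-in choice for that tolerance, and nearlyEqual is just a hypothetical helper name:
// compare within a tolerance instead of exact equality
function nearlyEqual(a, b, eps) {
  return Math.abs(a - b) < (eps || Number.EPSILON);
}
console.log(nearlyEqual(0.1 + 0.2, 0.3)); // true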
Happy coding….!!!