0.1 + 0.2 Does Not Equal 0.3
Most JavaScript developers have probably heard of this issue, but it's not unique to JavaScript. I've tested it in JavaScript, Python, Java, C/C++, and it seems that most programming languages today yield the same result:
0.1 + 0.2 == 0.3 // false
The Root Cause
These languages adhere to the IEEE 754 standard (https://www.geeksforgeeks.org/ieee-standard-754-floating-point-numbers/) for representing floating-point numbers, and IEEE 754 has limitations in accurately representing many decimal fractions in binary format.
Example:
It's difficult to represent 1/3 precisely in decimal form (it's approximately 0.333333...), as the number 3 repeats infinitely.
Similarly, under IEEE 754, 0.1 and 0.2 cannot be represented exactly in binary: 0.1 (decimal) is 0.0001100110011... (binary), and 0.2 (decimal) is 0.001100110011... (binary), with the sequence 0011 repeating infinitely.
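You can see this truncated expansion directly in JavaScript, since Number.prototype.toString accepts a radix. A double holds only 53 significand bits, so the repeating 0011 pattern is cut off (and rounded) after about 52 fractional bits:

```javascript
// Print the finite binary expansion of the double actually stored.
// The infinite 0011 pattern is truncated and rounded at the end.
console.log((0.1).toString(2)); // starts with 0.000110011001100...
console.log((0.2).toString(2)); // starts with 0.00110011001100...
```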
Therefore, when adding numbers with representation errors, the result will also have an error!
An interesting fact: 0.1 + 0.2 == 0.3 is false, but 0.1 + 0.1 == 0.2 is true. Do you know why? (This is an open question for you.)
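Both comparisons are easy to check in the console, and toPrecision() exposes enough digits to see the mismatch that the default formatting hides:

```javascript
// The stored sum of 0.1 and 0.2 overshoots 0.3 slightly.
console.log(0.1 + 0.2);         // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3); // false
console.log(0.1 + 0.1 === 0.2); // true

// toPrecision reveals the hidden digits of each stored double:
console.log((0.1 + 0.2).toPrecision(17)); // "0.30000000000000004"
console.log((0.3).toPrecision(17));       // "0.29999999999999999"
```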
Why Still Use the IEEE 754 Standard?
Overall, the IEEE 754 standard provides a balance between performance, accuracy, and compatibility (regardless of the processor or system architecture) for most practical applications.
However, we still need to be aware of its limitations and have appropriate solutions.
Some Solutions in JavaScript

 Rounding: Use the toFixed() function. Example:
(0.1 + 0.2).toFixed(1) // "0.3"
Note that toFixed() returns a string.
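Because toFixed() returns a string, wrap it in Number() when you need a numeric result back:

```javascript
// Round to one decimal place, then convert the string back to a number.
const sum = Number((0.1 + 0.2).toFixed(1));
console.log(sum);         // 0.3
console.log(sum === 0.3); // true
```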

 Use Libraries: decimal.js and big.js provide arbitrary-precision decimal arithmetic, and dinero.js is designed for working with monetary amounts safely.
This concludes the post. If you have any thoughts or other solutions, please leave a comment below.