This math expression

(4195835 / 3145727) * 3145727 - 4195835

is supposed to evaluate to 0.
According to the book Software Testing (2nd Ed.), "If you get anything else, you have an old Intel Pentium CPU with the floating-point division bug: a software bug burned into a computer chip and reproduced over and over in the manufacturing process."
Using Python 2.7 at the command line, I get -1050108.
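For reference, here is a minimal sketch of that session at the Python 2.7 interactive prompt (the comments are my reading of the intermediate steps, since `/` floors when both operands are ints):

```python
>>> 4195835 / 3145727                          # both operands are ints, so "/" floors to 1
1
>>> (4195835 / 3145727) * 3145727 - 4195835    # i.e. 1 * 3145727 - 4195835
-1050108
```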
Using Node, I get 0, just as I would on a calculator.
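Assuming the difference is only in how `/` treats integer operands, forcing one operand to be a float in Python 2.7 seems to reproduce the Node result (a sketch, not something from the book):

```python
# With a float operand, "/" performs true floating-point division,
# which is what Node/JavaScript does for every "/".
print((4195835.0 / 3145727) * 3145727 - 4195835)   # prints 0.0 here
```

(Putting `from __future__ import division` at the top of a Python 2.7 file makes plain `/` on ints behave this way too.)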
Is there a deeper explanation for this? This originally came up because of a bug in the 1994 video game Disney's Lion King, and I thought I would test the equation on a few platforms.