I'm using Great Cow BASIC (0.99.01 2022-01-27 (Windows 64 bit) : Build 1073) with an 18F4550.
I wrote this simple program to test how GCB responds to division by zero:
dim AA as Byte
dim BB as Word
dim CC as Integer
dim DD as Long
'****** divide each by a zero constant (division lines reconstructed; the numerator 10 is illustrative)
AA = 10 / 0
BB = 10 / 0
CC = 10 / 0
DD = 10 / 0
The program compiled without any errors or warnings.
I got 0 for AA, BB, CC, and DD.
Since division by 0 is undefined:
Shouldn't the compiler give a division by zero error?
When the 18F4550 runs the program, why do I get 0?
In this case you are dividing by a zero constant, but even if you divided by a variable with a value of zero, how would you expect the compiler to know the contents of the variable?
The compiler is doing what you told it to do; flagging this as an error at compile time would only catch the constant case (essentially a typing error) and would do nothing to detect division by zero at runtime.
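Since the divisor's value is in general only known at runtime, the usual remedy is a runtime guard. A minimal Python sketch (not GCBASIC code; the function name and the fallback value of 0, mirroring the behaviour observed above, are illustrative assumptions):

```python
def guarded_div(numerator, divisor):
    # A compiler cannot, in general, know what value `divisor`
    # will hold at runtime, so the check must happen at runtime.
    if divisor == 0:
        return 0  # illustrative fallback; could also set an error flag
    return numerator // divisor

print(guarded_div(10, 0))  # 0
print(guarded_div(10, 2))  # 5
```

The same guard can be written in GCBASIC with an If/Then around the division.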
Isn't an answer of 0 the wrong result?
If you did divisions and kept reducing the denominator, your answer would keep increasing.
For example:
10/1 = 10
10/0.1 = 100
10/0.01 = 1000
10/0.001 = 10000
10/0 = ? (infinity or undefined?)
An answer of zero seems completely wrong.
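The sequence above can be reproduced with floating-point division (a Python sketch; note that IEEE 754 hardware would return +inf for a positive value divided by 0.0, while Python itself raises ZeroDivisionError):

```python
denominators = (1, 0.1, 0.01, 0.001)
# As the denominator shrinks toward zero, the quotient grows without bound.
quotients = [10 / d for d in denominators]
for d, q in zip(denominators, quotients):
    print(f"10/{d} = {q}")
```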
If GCBASIC used floating-point arithmetic you would be correct, but in this case you are performing integer arithmetic, so zero is the "most" correct result.
And a far more eloquent answer from Scientific American
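The integer-versus-floating-point distinction can be demonstrated in any language that has both kinds of division. A Python sketch (not GCBASIC code):

```python
# Floating-point division keeps the fractional part:
print(10 / 4)    # 2.5
# Integer division discards it (truncating toward zero for positive operands):
print(10 // 4)   # 2
# So any true quotient smaller than 1 collapses to 0:
print(3 // 7)    # 0
```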
Personally, I try to check for cases of division where the result is likely to be "inaccurate", where I might be dividing 9 by 105, for example. While an answer of zero might not be correct, given integer variables, would any other result be more accurate than zero? My HP 16C (in integer mode) also returns zero for this calculation (9 / 109), with the carry flag set.
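The HP 16C's behaviour (a zero quotient with the carry flag set to signal a lost remainder) corresponds to quotient-and-remainder division. A Python sketch of the same idea:

```python
# divmod returns (quotient, remainder); a nonzero remainder plays
# the role of the 16C's carry flag, flagging the "lost" part.
quotient, remainder = divmod(9, 105)
print(quotient, remainder)  # 0 9
```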
Thanks!