Looks like a "bit" variable used as a function
parameter actually gets converted to an 8-bit
char. SDCC still allocates a bit in BSEG, which
never gets used.
For example:
char bit2sign(bit my_flag)
{
if (!my_flag) return -1;
return 1;
}

compiles to:

.area BSEG (BIT)
_bit2sign_my_flag_1_1::
.ds 1 ; <--- wasted bit memory
mov r2,dpl
mov a,r2
jnz 00102$
00106$:
mov dpl,#0xFF
sjmp 00103$
00102$:
mov dpl,#0x01
00103$:
ret
When the user calls this function with
something other than a bit, they probably
expect SDCC to cast their variable to a "bit"
by testing for zero vs non-zero, but what
really happens is closer to a cast to unsigned
char, so a 16 or 32 bit variable whose lower
8 bits are all zeros evaluates to false.
Not sure if that's a bug, but if not, it should
probably be documented in the manual.
fixed in:
SDCCglue.c:1.72 (I am not 100% sure about this)
mcs51/gen.c:1.90