Setting bits using the predefined bitfield structures generates correct code:
    PORTCbits.RC0 = 1;
    PORTCbits.RC0 = 0;
    ;	.line	11; "newmain.c"	PORTCbits.RC0 = 1;
    	BSF	_PORTCbits,0
    ;	.line	12; "newmain.c"	PORTCbits.RC0 = 0;
    	BCF	_PORTCbits,0
Setting bits in a portable way, however, does not generate the expected result:
    #define bit_set(var,bitno)   ((var) |= 1 << (bitno))
    #define bit_clear(var,bitno) ((var) &= ~(1 << (bitno)))

    bit_set(PORTC,0);
    bit_clear(PORTC,0);
    ;	.line	14; "newmain.c"	bit_set(PORTC,0);
    	BSF	_PORTC,0
    ;	.line	15; "newmain.c"	bit_clear(PORTC,0);
    	MOVF	_PORTC,W
    	BANKSEL	r0x1000
    	MOVWF	r0x1000
    	MOVLW	0xfe
    	ANDWF	r0x1000,W
    	BANKSEL	_PORTC
    	MOVWF	_PORTC
As you can see, the set-bit operation is recognized correctly and compiled to a single BSF, but the clear-bit operation results in a long read-modify-write sequence: the port is ANDed with a mask through a temporary register and the result is written back to the port.
This is far from optimal code, and because the sequence is not atomic it can cause problems if the same port is also modified from interrupt code.
sdcc.exe --use-non-free -c -mpic14 -p16f1824 newmain.c -onewmain.o
sdcc-snapshot-i586-mingw32msvc-20140301-8956 3.4.0/rc1/ #8956 (Mar 1 2014)(MINGW32)
Also tested with sdcc-3.3.0.
Attached are the C sample code and the resulting asm file.