From: Anton P. <an...@om...> - 2006-05-12 12:30:04
Hi,

I have the following test code:

    void main()
    {
        UINT32 largeWait = 0xddeeaaff;
        BYTE *p;
        int k = 0;

        SETUP_DEBUG();

        k = 0;
        while (1) {
            p = (BYTE *)&largeWait;
            DEBUG_PRINT("%d : %x - %x - %x - %x", k, p[0], p[1], p[2], p[3]);
            k++;
            largeWait++;
        }
    }

This works as expected: incrementing largeWait by 1 on each iteration changes the values printed through p. However, if the initialization of p is performed at its declaration instead, like this:

    void main()
    {
        UINT32 largeWait = 0xddeeaaff;
        BYTE *p = (BYTE *)&largeWait;
        int k = 0;

        SETUP_DEBUG();

        k = 0;
        while (1) {
            DEBUG_PRINT("%d : %x - %x - %x - %x", k, p[0], p[1], p[2], p[3]);
            k++;
            largeWait++;
        }
    }

then the values printed through p are correct the first time, but they never change afterwards. I would guess this is an optimization problem; perhaps it is a known bug?

Kind regards,
Anton Persson, Omicron Ceti AB