From: Paul S. <pm...@gm...> - 2013-04-22 14:37:40
Hello,

On Sun, 21 Apr 2013 20:15:45 -0400
Przemek Klosowski <prz...@gm...> wrote:

> As you say, the reason for the undefined behaviour in the standard is
> that different ISAs (instruction set architectures) behave
> differently. The underlying assumption here is that C is a 'bare
> metal' language, and implements high-level operations that map well
> to the machine-language facilities.
>
> As a result, you should not expect any commonality in behaviour
> between different ISAs if there isn't commonality between their
> respective natural binary operations. If the overall fastest
> generated code for a standard-defined case is to mask out the shift
> count, it would be against the spirit of C to do anything else.

I'm sorry if my original mail didn't make it clear, but my whole report
was about the evaluation rules used during compile-time elimination of
constant subexpressions. It's obvious that runtime behaviour should be
as fast as possible, leaving the corner cases to the programmer. But
what would be the most obvious behaviour at compile time? Feel free to
argue that it's not the mathematical definition but some obscure
platform-dependent factor.

> This reminds me of the section of the GCC manual that said something
> like 'because the behaviour in this case is undefined and left to the
> implementer, our choice is to launch a game of life'.

Yep, and then people started to rely on it, then someone compiled their
code for MSP430, saw no game of life launched, and reported the
compiler as broken. The solution?

http://en.wikipedia.org/wiki/Principle_of_least_astonishment

-- 
Best regards,
 Paul                          mailto:pm...@gm...