It is common for bitwise AND operations to have one operand in which many of the bytes are 0xff, e.g.
x &= 0x7fffffff;
I implemented it, but two regression tests fail. Both regression tests are about printf with integer arguments. I have verified that if printf_large.c is compiled without the optimization, no regression tests fail, even when the rest of the library and the regression tests themselves are compiled with the optimization.
Looking at printf_large.asm, I see the optimization mostly results in changed offsets for on-stack variables, and the only substantial difference affects lines 731 to 743 of printf_large.c. I have read through the generated asm code and the diff multiple times, but I really can't see the bug.
Maybe someone else should have a look at this.
As of revision #8998, the optimization can be enabled by uncommenting lines 4894 to 4899 of src/stm8/gen.c.