The previous post was written in jest
Pascal Cuoq - 25th Jul 2012

Just a quick update to provide context for the previous post.
The previous post assumes the reader is familiar with the notion of undefined behavior and how C compilers have started to justify their more aggressive optimizations along the lines of “this program has always been undefined”.
Long ago, a C programmer trying to write a portable C program had to avoid signed overflows because different architectures could exhibit different behaviors depending on whether signed numbers were represented in sign-magnitude, one's complement, or two's complement. Those days are gone. A few dedicated architectures (DSPs) may give you saturating behavior on overflow, but nearly every architecture now uses two's complement.
A modern programmer might think “I know my target architecture uses two's complement representation, and I want wrap-around behavior here, so I will just use signed addition”.
The problem nowadays is that compilers take advantage of undefined behavior for optimization [removed dead link]. An expression such as X+1 > X, with X of type int, may be optimized to 1, because the only case in which it is not true is when X+1 overflows, which is undefined behavior, and the compiler can then do whatever it wants. Incidentally, this means the same compiler compiles X+1 > X into code that produces different results for INT_MAX at different optimization levels.
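To make this concrete, here is a minimal sketch (my own, not from the original post) that exhibits the difference. On a typical two's complement machine, compiling without optimization usually prints 0 for INT_MAX, while an optimizing compiler is entitled to fold the comparison to 1; both outcomes are conforming because the overflow is undefined.

    #include <limits.h>
    #include <stdio.h>

    /* Undefined behavior when x == INT_MAX: x + 1 overflows a signed int.
       An optimizing compiler may therefore fold the comparison to 1. */
    static int plus_one_is_greater(int x)
    {
        return x + 1 > x;
    }

    int main(void)
    {
        /* Without optimization, a two's complement machine typically wraps
           INT_MAX + 1 to INT_MIN and prints 0; with optimization, the
           function is often reduced to "return 1" and prints 1. */
        printf("%d\n", plus_one_is_greater(INT_MAX));
        return 0;
    }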
The previous post suggested using a statement that is undefined when p and q alias, in order to free the compiler from any constraints in those circumstances. The compiler can then do anything it wants, including returning the incorrect result 3 when the same address is passed as both arguments of f2(), because the function's behavior is undefined in that case. This was not a serious suggestion; it should be understood as an argument in the debate about the exploitation of undefined behavior in compiler optimization.
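For readers who have not seen the previous post, here is a hedged reconstruction of the shape of the argument. The body of f2() below is my sketch, not necessarily the exact code from that post, and the undefined-when-aliasing statement is written here with the GCC/Clang extension __builtin_unreachable().

    #include <stdio.h>

    /* Sketch: the behavior of f2() is undefined when p and q alias, which
       frees the compiler to assume they never do, much as restrict would. */
    int f2(int *p, int *q)
    {
        if (p == q)
            __builtin_unreachable(); /* undefined if ever reached */
        *p = 1;
        *q = 2;
        /* Assuming no aliasing, the compiler may fold this to 3; the value
           actually stored is 4 when p == q. */
        return *p + *q;
    }

    int main(void)
    {
        int x;
        /* Passing the same address makes the call undefined: printing 3
           (or anything else) is then a conforming result. */
        printf("%d\n", f2(&x, &x));
        return 0;
    }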
This debate is interesting to me as someone who works on the static analysis of critical embedded C, because embedded code has constraints that make some kinds of undefined behavior unavoidable, critical code shouldn't rely on any of the dangerous undefined behaviors, and the frontier between the unavoidable undefined behaviors and the dangerous ones is not written down anywhere and keeps moving.
The previous post was largely written in jest. I was not suggesting replacing the respected restrict keyword with an awkward substitute. All the arguments I put forward were in bad faith.