Take the following snippet:
#include <stddef.h>
_Bool
foo(char * p) {
return (p - 1) == NULL;
}
Both GCC and LLVM optimize the result to false.
What in the standard allows the compilers to assume p was not 1?