I’m learning about two’s complement and wanted to see what the negation of TMIN (the minimum value for a signed type) is. I wrote the following code for int:
#include <stdio.h>

int main()
{
    int i = -2147483648;
    printf("%d\n", i);
    printf("Size of int: %zu bytes\n", sizeof(int));
    if (i < 0)
    {
        printf("%d\n", -i);
    }
    else
    {
        printf("%d\n", i);
    }
    return 0;
}
The output is as expected:
-2147483648
Size of int: 4 bytes
-2147483648
This indicates that the negation of -2147483648 is still -2147483648.
However, when I tried this with char, the behavior was different:
#include <stdio.h>

int main()
{
    char i = -128;
    printf("%d\n", i);
    printf("Size of int: %zu bytes\n", sizeof(char));
    if (i < 0)
    {
        printf("%d\n", -i);
    }
    else
    {
        printf("%d\n", i);
    }
    return 0;
}
The output was:
-128
Size of int: 1 bytes
128
I’m using Clang on an M1 MacBook.
Why does negating -128 for char give 128, when negating -2147483648 for int gave -2147483648 back? Is this a compiler bug, or something special about char?