On architectures where char defaults to unsigned, 'f' is zero-extended
to int in the expression 'd = ~(f & ~2880764155)', so 'd' becomes -1,
which causes the test case to fail.
The test therefore passes on architectures where char defaults to signed,
such as x86, but fails on architectures where char defaults to unsigned,
such as arm and csky.  Change char to signed char to avoid this problem.
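The effect can be reproduced outside the testsuite with a small
standalone sketch; the program below, its variable names, and the byte
value 0x80 are only illustrative (the sign bit set makes the difference
visible) and are not taken from the original test:

/* Hypothetical standalone sketch, not the testsuite file: evaluate the
   same bitwise expression with an explicitly signed and an explicitly
   unsigned byte to mimic targets where plain char differs.  */
#include <stdio.h>

int main(void)
{
    signed char   sf = -128; /* bit pattern 0x80; like 'char' on x86       */
    unsigned char uf = 0x80; /* same bit pattern; like 'char' on arm, csky */
    int ds, du;

    /* 'sf' is sign-extended before the '&', 'uf' is zero-extended.  */
    ds = ~(sf & ~2880764155);
    du = ~(uf & ~2880764155);

    printf("signed char operand:   d = %d\n", ds); /* a value other than -1      */
    printf("unsigned char operand: d = %d\n", du); /* -1, the failure seen above */
    return 0;
}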
PR testsuite/108604
gcc/testsuite:
* gcc.dg/torture/pr108574-3.c (b, f): Change type from char to
signed char.
/* { dg-do run } */
int a = 3557301289, d;
-char b, f;
+signed char b, f;
unsigned short c = 241;
short e, g;
static void h() {