This retains essentially the same definition for EBCDIC platforms, but
substitutes a simpler one for ASCII platforms. On my system, the new
definition compiles to about half the assembly instructions that the old
one did (non-optimized).
A bomb-proof definition of isASCII() casts the value to the widest
unsigned type on the platform, so there is no possible loss of
information, and then tests that the ordinal is < 128.
#define FITS_IN_8_BITS(c) ((sizeof(c) == 1) \
|| (((WIDEST_UTYPE)(c) & 0xFF) == (WIDEST_UTYPE)(c)))
-#define isASCII(c) (FITS_IN_8_BITS(c) ? NATIVE_TO_UNI((U8) c) <= 127 : 0)
+#ifdef EBCDIC
+# define isASCII(c) (FITS_IN_8_BITS(c) ? NATIVE_TO_UNI((U8) (c)) < 128 : 0)
+#else
+# define isASCII(c) ((WIDEST_UTYPE)(c) < 128)
+#endif
+
#define isASCII_A(c) isASCII(c)
/* ASCII range only */