From: Jesse Brandeburg
Date: Tue, 10 Mar 2020 22:17:46 +0000 (-0700)
Subject: x86: Fix bitops.h warning with a moved cast
X-Git-Tag: v5.15~4176^2
X-Git-Url: http://review.tizen.org/git/?a=commitdiff_plain;h=1651e700664b4597ddf4f8adfe435252a0d11277;p=platform%2Fkernel%2Flinux-starfive.git

x86: Fix bitops.h warning with a moved cast

Fix many sparse warnings when building with C=1. These are useless
noise from the bitops.h file and getting rid of them helps developers
make more use of the tools and possibly find real bugs.

When the kernel is compiled with C=1, there are lots of messages like:

  arch/x86/include/asm/bitops.h:77:37: warning: cast truncates bits from constant value (ffffff7f becomes 7f)

CONST_MASK() is using a signed integer "1" to create the mask which is
later cast to (u8), in order to yield an 8-bit value for the assembly
instructions to use. Simplify the expressions used to clearly indicate
they are working on 8-bit values only, which still keeps sparse happy
without an accidental promotion to a 32-bit integer.

The warning was occurring because certain bitmasks that end with a bit
set next to a natural boundary like 7, 15, 23, 31 end up with a mask
like 0x7f, which then results in sign extension due to the integer type
promotion rules[1]. It was really only clear_bit() that was having
problems, and only on some bit checks that resulted in a mask like
0xffffff7f being generated after the inversion.

Verified with a test module (see next patch) and assembly inspection
that the fix doesn't introduce any change in generated code.

[ bp: Massage. ]

Signed-off-by: Jesse Brandeburg
Signed-off-by: Borislav Petkov
Reviewed-by: Andy Shevchenko
Acked-by: Luc Van Oostenryck
Acked-by: Peter Zijlstra (Intel)
Link: https://stackoverflow.com/questions/46073295/implicit-type-promotion-rules [1]
Link: https://lkml.kernel.org/r/20200310221747.2848474-1-jesse.brandeburg@intel.com
---

diff --git a/arch/x86/include/asm/bitops.h b/arch/x86/include/asm/bitops.h
index 062cdec..53f246e 100644
--- a/arch/x86/include/asm/bitops.h
+++ b/arch/x86/include/asm/bitops.h
@@ -54,7 +54,7 @@ arch_set_bit(long nr, volatile unsigned long *addr)
 	if (__builtin_constant_p(nr)) {
 		asm volatile(LOCK_PREFIX "orb %1,%0"
 			: CONST_MASK_ADDR(nr, addr)
-			: "iq" ((u8)CONST_MASK(nr))
+			: "iq" (CONST_MASK(nr) & 0xff)
 			: "memory");
 	} else {
 		asm volatile(LOCK_PREFIX __ASM_SIZE(bts) " %1,%0"
@@ -74,7 +74,7 @@ arch_clear_bit(long nr, volatile unsigned long *addr)
 	if (__builtin_constant_p(nr)) {
 		asm volatile(LOCK_PREFIX "andb %1,%0"
 			: CONST_MASK_ADDR(nr, addr)
-			: "iq" ((u8)~CONST_MASK(nr)));
+			: "iq" (CONST_MASK(nr) ^ 0xff));
 	} else {
 		asm volatile(LOCK_PREFIX __ASM_SIZE(btr) " %1,%0"
 			: : RLONG_ADDR(addr), "Ir" (nr) : "memory");
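
Editor's note, not part of the commit: a minimal userspace C sketch of the
integer promotion issue the message describes. CONST_MASK() is reproduced
here as (1 << ((nr) & 7)), matching the "signed integer 1" wording above;
treat that definition, and the example program itself, as an assumption
for illustration only.

/* Standalone sketch of the promotion/truncation behaviour. */
#include <stdio.h>

typedef unsigned char u8;

#define CONST_MASK(nr)	(1 << ((nr) & 7))

int main(void)
{
	int nr = 7;	/* bit next to a byte boundary -> mask 0x80 */

	/*
	 * CONST_MASK() is a plain signed int, so ~CONST_MASK(7) is
	 * 0xffffff7f; the old (u8) cast truncated that to 0x7f, which
	 * is exactly what sparse flagged.
	 */
	printf("~CONST_MASK(7)       = 0x%x\n", (unsigned int)~CONST_MASK(nr));
	printf("(u8)~CONST_MASK(7)   = 0x%x\n", (unsigned int)(u8)~CONST_MASK(nr));

	/*
	 * The patch keeps the arithmetic within the low byte instead:
	 * XOR with 0xff inverts only the 8 bits of interest and AND
	 * with 0xff selects them, so no truncating cast is needed.
	 */
	printf("CONST_MASK(7) ^ 0xff = 0x%x\n", (unsigned int)(CONST_MASK(nr) ^ 0xff));
	printf("CONST_MASK(7) & 0xff = 0x%x\n", (unsigned int)(CONST_MASK(nr) & 0xff));

	return 0;
}

Compiled with a hosted C compiler, this prints 0xffffff7f, 0x7f, 0x7f and
0x80: the XOR/AND forms yield the same 8-bit operands as the old casts,
which is consistent with the message's claim that generated code does not
change.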