Re: READ_ONCE() + STACKPROTECTOR_STRONG == :/ (was Re: [GIT PULL] Please pull powerpc/linux.git powerpc-5.5-2 tag (topic/kasan-bitops))

From: Linus Torvalds
Date: Tue Dec 17 2019 - 13:32:59 EST


On Tue, Dec 17, 2019 at 10:04 AM Linus Torvalds
<torvalds@xxxxxxxxxxxxxxxxxxxx> wrote:
>
> Let me think about it.

How about we just get rid of the union entirely, and just use
'unsigned long' or 'unsigned long long' depending on the size?

Something like the attached patch - it still requires that it be an
arithmetic type, but now because of the final cast.

But it might still be a cast to a volatile type, of course. Then the
result will be volatile, but at least now READ_ONCE() won't be taking
the address of a volatile variable on the stack - does that at least
fix some of the horrible code generation? Hmm?

This is untested, because I obviously still have the cases of
structures (page table entries) being accessed once...

Linus
include/linux/compiler.h | 33 +++++++++++++++++----------------
1 file changed, 17 insertions(+), 16 deletions(-)

diff --git a/include/linux/compiler.h b/include/linux/compiler.h
index 5e88e7e33abe..8b4282194f16 100644
--- a/include/linux/compiler.h
+++ b/include/linux/compiler.h
@@ -179,18 +179,18 @@ void ftrace_likely_update(struct ftrace_likely_data *f, int val,

#include <uapi/linux/types.h>

-#define __READ_ONCE_SIZE \
-({ \
- switch (size) { \
- case 1: *(__u8 *)res = *(volatile __u8 *)p; break; \
- case 2: *(__u16 *)res = *(volatile __u16 *)p; break; \
- case 4: *(__u32 *)res = *(volatile __u32 *)p; break; \
- case 8: *(__u64 *)res = *(volatile __u64 *)p; break; \
- default: \
- barrier(); \
- __builtin_memcpy((void *)res, (const void *)p, size); \
- barrier(); \
- } \
+/* "unsigned long" or "unsigned long long" - make it fit in a register if possible */
+#define __READ_ONCE_TYPE(size) \
+ __typeof__(__builtin_choose_expr(size > sizeof(0UL), 0ULL, 0UL))
+
+#define __READ_ONCE_SIZE \
+({ \
+ switch (size) { \
+ case 1: *(unsigned long *)res = *(volatile __u8 *)p; break; \
+ case 2: *(unsigned long *)res = *(volatile __u16 *)p; break; \
+ case 4: *(unsigned long *)res = *(volatile __u32 *)p; break; \
+ case 8: *(unsigned long long *)res = *(volatile __u64 *)p; break; \
+ } \
})

static __always_inline
@@ -258,13 +258,14 @@ static __always_inline void __write_once_size(volatile void *p, void *res, int s

#define __READ_ONCE(x, check) \
({ \
- union { typeof(x) __val; char __c[1]; } __u; \
+ __READ_ONCE_TYPE(sizeof(x)) __u; \
+ compiletime_assert(sizeof(x) <= sizeof(__u), "READ_ONCE type"); \
if (check) \
- __read_once_size(&(x), __u.__c, sizeof(x)); \
+ __read_once_size(&(x), &__u, sizeof(x)); \
else \
- __read_once_size_nocheck(&(x), __u.__c, sizeof(x)); \
+ __read_once_size_nocheck(&(x), &__u, sizeof(x)); \
smp_read_barrier_depends(); /* Enforce dependency ordering from x */ \
- __u.__val; \
+ (__typeof__(x))__u; \
})
#define READ_ONCE(x) __READ_ONCE(x, 1)