Re: include/net/sock.h:2100:16: sparse: sparse: cast to non-scalar

From: Al Viro

Date: Mon Jan 12 2026 - 14:28:56 EST


On Mon, Jan 12, 2026 at 01:32:24PM +0100, Peter Zijlstra wrote:
> On Sat, Jan 10, 2026 at 10:35:48PM +0000, Al Viro wrote:
>
> > #define __READ_ONCE(x)						\
> > ({									\
> > 	__unqual_scalar_typeof(x) __x =					\
> > 		(*(volatile typeof(__x) *)(&(x)));			\
> > 	mb();								\
> > 	(typeof(x))__x;							\
> > })
> > combined with
> > typedef struct {
> > 	uid_t val;
> > } kuid_t;
> >
> > IOW, it complains about a cast from structure to itself, which is fair
> > enough - C is pretty clear about not allowing any typecasts to or from
> > non-scalar types, tautological or not.
> >
> > Why do we even need that cast? Seeing that generic __READ_ONCE() is
> > #define __READ_ONCE(x) (*(const volatile __unqual_scalar_typeof(x) *)&(x))
> > the cast added on alpha seems to be pointless.
>
> The problem was things like test_bit() that take a volatile argument,
> doing READ_ONCE() on them would instantiate a volatile temporary and GCC
> would end up generating shit code.
>
> The __unqual_scalar_typeof() was the result of trying to remove CV
> qualifiers from a type.

Sure, but what does that have to do with the final cast in there? Note
that generic __READ_ONCE() ends up with __unqual_scalar_typeof(x) for the
type of result and it seems to be working fine.

Why would the same result type be a problem for alpha? Sure, we need
a barrier there, so it can't be a literal copy of the generic __READ_ONCE(),
but what's the problem with using

#define __READ_ONCE(x)						\
({								\
	__unqual_scalar_typeof(x) __x =				\
		(*(volatile typeof(__x) *)(&(x)));		\
	mb();							\
	__x;							\
})

there? What could possibly need those qualifiers added back to the result?
It is, after all, an r-value, so...

IDGI...