Re: [PATCH 1/2] uapi: Define GENMASK_U128

From: Anshuman Khandual
Date: Wed Jul 24 2024 - 07:59:18 EST




On 7/24/24 16:33, Arnd Bergmann wrote:
> On Wed, Jul 24, 2024, at 12:31, Anshuman Khandual wrote:
>> --- a/include/uapi/asm-generic/bitsperlong.h
>> +++ b/include/uapi/asm-generic/bitsperlong.h
>> @@ -28,4 +28,8 @@
>> #define __BITS_PER_LONG_LONG 64
>> #endif
>>
>> +#ifndef __BITS_PER_U128
>> +#define __BITS_PER_U128 128
>> +#endif
>
> I would hope we don't need this definition. Not that it
> hurts at all, but __BITS_PER_LONG_LONG was already kind
> of pointless since we don't run on anything else and
> __BITS_PER_U128 clearly can't have any other sensible
> definition than a plain 128.

Agreed, this just followed __BITS_PER_LONG_LONG for consistency. But sure,
__BITS_PER_U128 can be a plain 128.

So would you like the #ifndef __BITS_PER_LONG_LONG guard dropped here as
well? And should that change be folded into this patch or sent separately?
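
Something like the below then, i.e. the tail of bitsperlong.h just ends up
as (a rough sketch of what I think you are suggesting, assuming no
architecture ever needs to override either value):

	#define __BITS_PER_LONG_LONG	64

	#define __BITS_PER_U128		128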

>
>> #define __AC(X,Y) (X##Y)
>> #define _AC(X,Y) __AC(X,Y)
>> #define _AT(T,X) ((T)(X))
>> +#define _AC128(X) ((unsigned __int128)(X))
>
> I just tried using this syntax and it doesn't seem to do
> what you expected. gcc silently truncates the constant

But the numbers passed into _AC128() are small, all within the [128..0]
range. Hence the truncation might not be a problem in this context? Or
could it be?
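
Just to make sure I understand the failure mode, with a made-up value for
illustration (a rough sketch along the lines of your godbolt link):

	#define _AC128(X)	((unsigned __int128)(X))

	/*
	 * The literal is parsed as an (at most) 64-bit constant before the
	 * cast is applied, so gcc silently drops the upper bits and clang
	 * fails the build.
	 */
	unsigned __int128 bad  = _AC128(0x11112222333344445555);

	/* A small value such as a bit position in [0..127] is unaffected. */
	unsigned __int128 good = _AC128(127);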

> to a 64-bit value here, while clang fails the build.

Should this be disabled for CC_IS_CLANG?

> See also https://godbolt.org/z/rzEqra7nY
> https://stackoverflow.com/questions/63328802/unsigned-int128-literal-gcc

So unless the value passed in exceeds 64 bits, it should be fine? Or am
I missing something?
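
If a constant wider than 64 bits were ever needed, IIUC it would have to be
assembled from two 64-bit halves after the conversion, something like this
(hypothetical helper, not part of this patch, just to illustrate):

	/*
	 * Hypothetical: build a 128-bit constant from two 64-bit halves,
	 * since a single literal wider than 64 bits cannot be expressed.
	 */
	#define _U128_CONST(hi, lo) \
		(((unsigned __int128)(hi) << 64) | (unsigned __int128)(lo))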

>
> The __GENMASK_U128() macro however seems to work correctly
> since you start out with a smaller number and then shift
> it after the type conversion.

_U128() never receives anything beyond the [127..0] range. So this should
be fine then?
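
i.e. the only constants the macro ever sees are bit positions, and the
128-bit value is produced by shifting after the conversion - something
along these lines (a sketch of the idea, may not be exactly what lands
in the patch):

	#define _BIT128(x)		((unsigned __int128)(1) << (x))

	/*
	 * Sketch: h and l are plain bit positions in [0..127], so no
	 * 128-bit literal is ever needed; the wide value only appears
	 * after the cast and the shifts.
	 */
	#define __GENMASK_U128(h, l)	((_BIT128(h) << 1) - _BIT128(l))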