Re: [PATCH] drm/fourcc: Add DOC: overview comment

From: Brian Starkey
Date: Thu Aug 23 2018 - 11:40:34 EST

Hi Matthew,

On Thu, Aug 23, 2018 at 07:34:45AM -0700, Matthew Wilcox wrote:
> On Wed, Aug 22, 2018 at 04:57:33PM +0100, Brian Starkey wrote:
> > On Wed, Aug 22, 2018 at 05:11:55PM +0200, Daniel Vetter wrote:
> > > On Wed, Aug 22, 2018 at 4:59 PM, Eric Engestrom
> > > <eric.engestrom@xxxxxxxxx> wrote:
> > > > On Tuesday, 2018-08-21 17:44:17 +0100, Brian Starkey wrote:
> > > > > On Tue, Aug 21, 2018 at 09:26:39AM -0700, Matthew Wilcox wrote:
> > > > > > Can you turn them into enums? This seems to work ok:

> > I'm not sure that swapping out explicit 32-bit unsigned integers for
> > enums (unspecified width, signed integers) is necessarily a good idea;
> > it seems like Bad Things could happen.
> > 
> > The C spec says:
> > 
> > "the value of an enumeration constant shall be an integer constant
> > expression that has a value representable as an int"
> > 
> > Which likely gives us 4 bytes to play with on all machines
> > that run Linux, but if drm_fourcc.h is ever going to be some kind of
> > standard reference, making it non-portable seems like a fail.
> > 
> > And even if you do have 4 bytes in an enum, signed integers act
> > differently from unsigned ones, and compilers do love to exploit
> > undefined behaviour.

> I think you're exaggerating how much latitude C compilers have here.
> Further down in the same section, it says:
> 
> Each enumerated type shall be compatible with char, a signed
> integer type, or an unsigned integer type. The choice of type is
> implementation-defined, but shall be capable of representing the values
> of all the members of the enumeration.
> 
> So if we include an integer which isn't representable in a plain int,
> then the compiler _must_ choose a larger type.

I don't think so... the sentence I pasted says that including a value
which isn't representable in a plain int would be illegal, and so the
compiler doesn't _have_ to do anything (nasal demons, right?).

> It could choose a
> signed-64-bit type rather than an unsigned-32-bit type, but I can't
> imagine any compiler being quite so insane.

The paragraph about the implementation choosing a representation is
separate from the valid range of values - the compiler can pick
whatever storage it likes (smaller or even larger than an int), so
long as that storage can fit all the defined values. However,
providing a value in an enum definition which is not representable as
an int would still be invalid (irrespective of how large the storage
is) - it's a separate restriction.

Anyhow, I'm not dying to replace all the current definitions with
enums, so if someone else wants to pick that up, be my guest.