On 2/23/07, Richard Knutsson <ricknu-0@xxxxxxxxxxxxxx> wrote:

Both because I find the name not as expressive as a simple "BIT(x % something)", but mainly since it only enables wrapping of the long type. But that is just my opinion.

Milind Choudhary wrote:
> On 2/23/07, Richard Knutsson <ricknu-0@xxxxxxxxxxxxxx> wrote:
>> > +#define BITWRAP(nr) (1UL << ((nr) % BITS_PER_LONG))
>> >
>> > & make the whole input subsystem use it
>> > The change is huge, more than 125 files using input.h
>> > & almost all use the BIT macro.
>> It is as big of a change, but have you dismissed the "BIT(nr %
>> BITS_PER_LONG)" approach?
>
> No, but just looking at the number of places it is used,
> it seems that adding a new macro would be good,
> since it keeps things short and sweet.
You have a point there, but I still don't think it should be in bitops.h.
Why should we favor long-wrap over byte-wrap? So what do you think
about doing:
#define BITWRAP(x) BIT((x) % BITS_PER_LONG)
in input.h? Otherwise I think it should be called LBITWRAP (or something),
to both show what kind it is and enable us to add others later.
Why would you not want to have what you call bitwrap as the standard
behavior? Most places do not use a modulus because they know the kind of
data they are working with, but they should still be fine if the generic
implementation did that.
grep -Enr "BIT\(.*\%" *
include/asm-arm/arch-h720x/irqs.h:114:#define IRQ_TO_BIT(irq) (1 << ((irq - NR_GLBL_IRQS) % 32))