Re: __bad_udelay in 2.2.18pre15

From: Andreas Schwab (schwab@suse.de)
Date: Wed Oct 11 2000 - 14:46:57 EST


"Chris Swiedler" <ceswiedler_lk@mindspring.com> writes:

|> > > 2.2.18pre15 defines udelay as (in file include/asm-i386/delay.h) :
|> > > extern void __bad_udelay(void);
|> > >
|> > > #define udelay(n) (__builtin_constant_p(n) ? \
|> > > 	((n) > 20000 ? __bad_udelay() : __const_udelay((n) * 0x10c6ul)) : \
|> > > 	__udelay(n))
|> > >
|> > > ...
|> > > It seems __bad_udelay is not defined anywhere in the kernel source.
|> >
|> > Correct. It's a compile-time error trap.
|>
|> Wouldn't it be better to use an #error directive?

There is no way to test the condition in the preprocessor.
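
For reference, here is a minimal sketch of the same trap outside the kernel
(the names __bad_my_delay, __small_my_delay and my_delay are illustrative,
not the kernel's). The argument is only visible to the compiler through
__builtin_constant_p(), never to the preprocessor, so #error cannot test it.
Instead, the too-large branch calls a function that is declared but never
defined; if the compiler can prove that branch is taken, the reference
survives into the object file and the build fails at link time with
"undefined reference to `__bad_my_delay'". This relies on the compiler
folding the constant condition, which GCC does.

    #include <stdio.h>

    extern void __bad_my_delay(void);      /* deliberately never defined */

    static void __small_my_delay(unsigned long usecs)
    {
            printf("delaying %lu microseconds\n", usecs);
    }

    #define my_delay(n) (__builtin_constant_p(n) ?                     \
            ((n) > 20000 ? __bad_my_delay() : __small_my_delay(n)) :   \
            __small_my_delay(n))

    int main(void)
    {
            my_delay(100);          /* fine: constant, below the limit */
            /* my_delay(50000); */  /* would fail at link time         */
            return 0;
    }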

Andreas.

-- 
Andreas Schwab                                  "And now for something
SuSE Labs                                        completely different."
Andreas.Schwab@suse.de
SuSE GmbH, Schanzäckerstr. 10, D-90443 Nürnberg
