> > I think a "stable" kernel should be tested by some ten people with
> > reasonably average configurations for about 24 hours.
> > A "production" kernel should at least compile.
>
> 10 people. Some of the bugs that show up in 2.0.x show up in configurations
> that maybe 1 in 1000 or 1 in 10,000 users have and perhaps only after a
> week of continual load. On a 10 user sample 2.0.30 is probably rock solid
That objection does not apply to what I said about development kernels. To
test whether a kernel compiles, it is enough to turn on all options that
are not mutually exclusive and build it. Kernels 2.1.29 through 2.1.41 (I
might be off by one or two) did not even compile with the ISDN options
enabled, at least for me and for some other people: ld complained about
"undefined references".
If this were checked by about ten people with reasonable configurations
(recent GCC and binutils), it would not go unnoticed.
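The check described above can be sketched as a small shell helper. This is
my own illustration, not an existing kernel tool: the `compile_test` name,
the tree path, and the log file are assumptions, and I am using the
`make allyesconfig` target (present in modern kernel trees) as the
"all non-exclusive options on" step:

```shell
#!/bin/sh
# Hypothetical smoke test: enable every selectable option, then build,
# so link-time failures ("undefined reference") surface before release.
# compile_test, its arguments, and the log name are assumptions.

compile_test() {
    tree=$1                      # path to the kernel source tree
    log=${2:-compile-test.log}   # where to capture build output

    if [ ! -f "$tree/Makefile" ]; then
        echo "no kernel tree at $tree"
        return 1
    fi

    # allyesconfig turns on all options that are not mutually exclusive;
    # any ld error ends up in $log rather than being missed.
    ( cd "$tree" &&
      make allyesconfig >"$log" 2>&1 &&
      make >>"$log" 2>&1 ) || {
        echo "BUILD FAILED, see $log"
        return 1
    }
    echo "OK: $tree compiled"
}

# Example: compile_test /usr/src/linux
```

Ten people running something like this against each pre-release, each with
their own compiler and binutils versions, is all the sanity check being
asked for here.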
> A lot of people don't seem to realise just how tricky it is building a
> stable across all platforms/configurations system.
I do realize that. I'm saying there should be at least *SOME* sanity
checks before releasing even a "development" kernel.
If there is a reason why non-compiling code has to be in the kernel
sources (I would be hard-pressed to find one), then at least there should
be a file "LATEST_COMPILING_<whatever>_IS_<version number>" in the kernel
source directory.
It does not serve the advancement of Linux well when released kernels do
not compile.
As I said, I volunteer to test-compile 2.1.x kernels.
If kernels have not been tested in any way, they should live in a separate
directory and be called pre-<version>.
- Daniel
Curly says "GO"!
http://cyclone.snafu.de/