actually, Ted, i'm a graduate of the Midvale School for the Gifted ;-)
(the Far Side cartoon where a small child is pushing on a door marked
"pull", next to which is a sign that says "Entrance: Midvale School for
the Gifted")
>> yes, the defect rate of Linux is remarkable considering there are few
>> debugging tools available. but i think the reason it is so low is that
>> the lack of debugging tools *prevents* people from getting involved and
>> fixing problems, so *only* the experts can fix problems. preventing a
>> flood of fixes and modifications helps keep the change rate lower, and
>> limits the amount of parallel work that can proceed on the kernel. but
>> there is still a control issue here, whether overt or not.
>
> Actually, it's more than just letting experts fix problems. More
> importantly, it means that (for the most part) non-experts are
> discouraged from writing new kernel code. This is not necessarily a bad
> thing; I've personally been in projects where new code (to add new
> features) that had been "donated" to the project ended up doing more
> harm than good in the long term, because the donated code was badly
> designed or implemented.
agreed that it's a complicated mix of costs and benefits. there's always
that kind of risk involved in working with volunteers. but leaving out
interactive kernel debugging tools raises the bar too high, IMO. one of
Linux's success stories is the kernel source code review process. it's
not perfect, but it contributes significantly to the overall consistency
of the kernel, more so, i think, than the availability and style of
debugging tools does.
not having debugging tools also means that some code is not as well
tested, since the existing testing methodologies are intrusive and
painful. even though Linux code stays free of unnecessary checks for
NULL pointers, for instance, the problems and bugs that do exist seem to
languish for a very long time, often resurfacing only after significant
changes to the kernel.
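to make "intrusive and painful" concrete: without an interactive
debugger, chasing a bad pointer usually means sprinkling printk() calls
through the suspect path and rebooting between guesses. here's a purely
hypothetical sketch (struct widget and widget_put() are made-up names
for illustration, not real kernel code):

    #include <linux/kernel.h>   /* printk() */
    #include <linux/slab.h>     /* kfree() */

    struct widget {
            int refcount;
    };

    static void widget_put(struct widget *w)
    {
            /*
             * kernel style: no defensive "if (!w) return;" here.
             * a NULL widget is a caller bug, and papering over it
             * would only let the real problem languish.  a buggy
             * caller oopses on the dereference below, and you work
             * backwards from there with printk() breadcrumbs.
             */
            printk(KERN_DEBUG "widget_put: w=%p count=%d\n",
                   w, w->refcount);

            if (--w->refcount == 0)
                    kfree(w);
    }

each round of this is guess, edit, rebuild, reboot. an interactive
debugger would collapse that loop into a breakpoint and a backtrace.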
> Surely quality is more important than quantity in terms of the amount
> of code (or features) added to the kernel. Or do people think that
> Windows 2000, with its 35-40 million lines of code and bloat, is a good
> thing? :-)
i know the entrance bar has always been high for Unix-like systems; it's
kind of a tradition, and the pay-offs are obvious. but not having
debugging tools integrated into every release of the kernel ties
everyone's hands, don't you think?
i'm "mr. change management"... anything that controls the rate of change
is a good idea. i'd like the change management to be more direct and
considered than artificial and arbitrary, though, and it seems like
leaving out debugging tools is an artificial throttle on positive changes.
for example, more folks could be providing fixes and new features of
quality if there was a lower bar and a stronger, more consistent review
process. although, i do realize there is a practical limit on Linus' time
to handle a stream of reviewed fixes/features.
in other words, if what you're trying to accomplish is:
+ change control
+ long, meaningful learning curves for new hackers
+ weeding out ill-designed new features
then you should state those goals overtly rather than pursue them
covertly by leaving out the tools that help hackers do their jobs. i
basically agree with the ends, but not the means.
- Chuck Lever
--
corporate: <chuckl@netscape.com>
personal:  <chucklever@netscape.net> or <cel@monkey.org>
The Linux Scalability project: http://www.citi.umich.edu/projects/linux-scalability/