Not true. There is no need for a magic value, because the size does not need
to be constant. That is, FD_SETSIZE has to be a compile-time constant because
it is used in the __fd_set data type definition, but the size in that
definition constrains nothing except statically sized variable definitions
and the FD_ZERO macro.
If you dynamically allocate a bit vector big enough for the maxfd you are
passing to select, you are fine. This is the way select was originally
intended to be used, since 4.2 BSD.
That is what I think glibc ought to do. The poll interface is much more
wasteful of memory: a whole struct pollfd per descriptor instead of one bit.
-
To unsubscribe from this list: send the line "unsubscribe linux-kernel" in
the body of a message to majordomo@vger.rutgers.edu
Please read the FAQ at http://www.tux.org/lkml/