> In terms of a pure speed web server for static data (where mmap is a big win)
> the right way[TM] is probably to preindex all the files you can serve and
> to mmap the lot into memory at setup time with a one page gap between them
> that contains precomputed HTML headers for each page. You then build an
> index of the pointers to each URL, feed all the URLs through the GNU
> perfect hash program to generate a hash and each web page serve becomes
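The quoted scheme might look something like this rough sketch (all names are invented; the precomputed headers live in a struct here rather than in the one-page gap the poster describes, and gperf's generated perfect hash is stubbed with a linear scan — a serve then just writes header and body back-to-back, e.g. with writev):

```c
#include <assert.h>
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

/* One preindexed entry: the URL, the header block precomputed
 * at setup time, and the mmap'd file contents. */
struct entry {
    const char *url;
    char header[128];
    char *body;
    size_t body_len;
};

/* mmap a file read-only; returns NULL on failure. */
char *map_file(const char *path, size_t *len)
{
    int fd = open(path, O_RDONLY);
    if (fd < 0)
        return NULL;
    struct stat st;
    if (fstat(fd, &st) < 0) {
        close(fd);
        return NULL;
    }
    char *p = mmap(NULL, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
    close(fd);              /* the mapping survives the close */
    if (p == MAP_FAILED)
        return NULL;
    *len = st.st_size;
    return p;
}

/* Build one entry: map the file and precompute its header. */
int setup_entry(struct entry *e, const char *url, const char *path)
{
    e->url = url;
    e->body = map_file(path, &e->body_len);
    if (!e->body)
        return -1;
    snprintf(e->header, sizeof(e->header),
             "HTTP/1.0 200 OK\r\nContent-Length: %zu\r\n\r\n",
             e->body_len);
    return 0;
}

/* Lookup stub: a linear scan standing in for the perfect hash.
 * gperf would turn this into a constant-time hash function. */
struct entry *lookup(struct entry *tab, int n, const char *url)
{
    for (int i = 0; i < n; i++)
        if (strcmp(tab[i].url, url) == 0)
            return &tab[i];
    return NULL;
}
```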
I think you get most of the way to this perfection by using squid to
accelerate an apache server behind it (and configuring everything
correctly). Acceleration is way cool: it lets you keep apache's
configurability, and offloads the "easy stuff" to a lightweight daemon.
Images in particular get offloaded real fast.
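For reference, the squid accelerator setup looks roughly like this (a sketch assuming apache has been moved to port 8080 on the same box; directive spellings are from the squid 2.x config, so double-check against your version):

```
# squid listens on port 80 as an accelerator in front of apache
http_port 80
httpd_accel_host 127.0.0.1     # the backend apache
httpd_accel_port 8080
httpd_accel_with_proxy off     # pure accelerator, not a general proxy
httpd_accel_uses_host_header on
```

With this, cacheable responses (images especially) get served straight out of squid, and only the hard requests ever reach apache.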
> Now that would be THE web server of choice for porn sites 8)
The daemon could be named smutd :)
BTW I think you can build a daemon that's almost as configurable as
apache by using lexical scanning techniques to grok the request,
producing longest-match, merged per_dir_configs. I've tried to figure
out how to do this for Apache, but its language is just too general, and I
kept having to impose restrictions to get at the really good
optimizations. In the end I gave up because there are more fun things to
optimize first.
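The longest-match idea could be sketched like this toy (all names hypothetical; real Apache merges whole per-directory config vectors as it walks the tree, not a single flag):

```c
#include <assert.h>
#include <string.h>

/* A per-directory config fragment: a URL prefix plus one option
 * that deeper prefixes may override (names are invented). */
struct dir_config {
    const char *prefix;
    int allow_override;     /* stand-in for a real directive */
};

/* Scan the table for every prefix of the URL and let the longest
 * (deepest) match win -- the merged result a lexical-scanning
 * daemon could precompute per URL prefix. */
int merged_override(const struct dir_config *tab, int n,
                    const char *url)
{
    int result = 0;
    size_t best = 0;
    for (int i = 0; i < n; i++) {
        size_t plen = strlen(tab[i].prefix);
        if (plen >= best && strncmp(url, tab[i].prefix, plen) == 0) {
            result = tab[i].allow_override;
            best = plen;
        }
    }
    return result;
}
```

A real implementation would sort the prefixes once at config time so each request is a single scan of the URL, which is exactly the restriction Apache's fully general config language makes hard to impose.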
Dean