Is any file system on Linux appropriate for very large directories?

Eric Benson (eb@amazon.com)
Fri, 12 Jul 1996 12:24:28 -0700


We have an application here that uses lots of files in a single
directory. At the time it was set up, it didn't seem to be a problem.
However, due to Amazon.com's 30 percent per month growth rate, it is
now a serious problem because of the time (and kernel lockup)
required for linear searching of directories. (By the way, this
application is currently running on Suns, not on Linux, but moving it to
Linux is an option we are considering.) The "right" solution to this
problem is to reimplement our application using a "real" database, but
it is possible that it could be solved simply by using a file system
that uses some kind of hashing for name lookup! A quick review of the
file systems currently available on Linux suggests that the only one
that uses hashing is the Amiga file system. I don't mean to be
prejudiced, but it's hard to imagine that the Amiga FS is going to
be the best choice for us. Is there any other file system choice that
will solve this problem, or any other approach that you can suggest?
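To illustrate the complaint above: a minimal sketch (in Python, with
made-up file names, not anything from the actual application) of why a
linear directory scan degrades as the directory grows, while a hashed
index keeps lookups roughly constant-time. This is only a model of the
two lookup strategies, not of any particular file system's on-disk
format.

```python
def linear_lookup(entries, name):
    # Models a classic linear directory search: every entry is
    # examined in turn, so cost grows with directory size (O(n)).
    for i, entry in enumerate(entries):
        if entry == name:
            return i
    return -1

def build_index(entries):
    # Models a hashed name index: built once in O(n), after which
    # each lookup is O(1) on average regardless of directory size.
    return {entry: i for i, entry in enumerate(entries)}

# Hypothetical directory of 100,000 files.
entries = ["file%06d.dat" % i for i in range(100_000)]
index = build_index(entries)

# Both strategies find the same entry; the difference is how much
# work the worst case does (full scan vs. one hash probe).
assert linear_lookup(entries, "file099999.dat") == 99999
assert index["file099999.dat"] == 99999
assert linear_lookup(entries, "no-such-file") == -1
```

The same asymmetry is why a directory of a few hundred thousand files
can make each open or create painfully slow on a linear-search file
system, while a hashing scheme barely notices.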
-- 
Eric Benson
eb@amazon.com
http://www.amazon.com/
Earth's biggest bookstore.