On Tue, Feb 29, 2000 at 08:39:10AM -0500, nbecker@fred.net wrote:
> A coworker has an application that produces tens of thousands of small
> files. It seems the performance of even a simple operation, such as
> 'rm -rf' or 'find | xargs rm', is very slow.
>
> What are the mechanisms that limit performance? Is it the type of
> data structure used to represent directories?
Directory lookups are done by linear search: the kernel (ext2, at
least) scans the directory's entries from the front until it finds the
requested name. Removing 10000 files means 10000 such lookups, each
scanning an average of 5000 entries, so on the order of 50 million
entries get touched in total. Quadratic behaviour.
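
A rough userspace sketch of the effect (an illustration only, not the
actual ext2 code): model the directory as a flat list that every
lookup scans from the front, with unlinked entries left behind as
holes that later scans still walk over.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define N 10000

int main(void)
{
	/* dir[i][0] == '\0' marks a hole left by a deleted entry */
	char (*dir)[16] = malloc(N * sizeof *dir);
	long long scanned = 0;

	for (int i = 0; i < N; i++)
		snprintf(dir[i], sizeof dir[i], "f%d", i);

	/* unlink every file by name, as 'rm' asks the kernel to */
	for (int i = 0; i < N; i++) {
		char name[16];

		snprintf(name, sizeof name, "f%d", i);
		for (int j = 0; j < N; j++) {	/* linear directory scan */
			scanned++;
			if (dir[j][0] && strcmp(dir[j], name) == 0) {
				dir[j][0] = '\0';	/* entry becomes a hole */
				break;
			}
		}
	}
	printf("entries scanned for %d unlinks: %lld (~N^2/2 = %lld)\n",
	       N, scanned, (long long)N * N / 2);
	free(dir);
	return 0;
}

This prints roughly 50 million scanned entries for 10000 unlinks. A
hashed or tree-structured directory would make each lookup O(1) or
O(log N) instead of O(N).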