Re: Linux Kernel Dump Summit 2005
From: Pavel Machek
Date: Mon Oct 10 2005 - 03:49:58 EST
> FULL DUMP WITH COMPRESSION
> Those who still want a full dump, including me, are interested
> in dump compression. For example, the LKCD format (at least v7
> format) supports pagewise compression with the deflate
> algorithm. The dump analysis tool "crash" can transparently
> analyze a compressed dump file in this format.
> Compression will reduce the storage space to a certain degree,
> and may also reduce the dump time if the dump process is I/O-bound.
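The pagewise scheme described above can be sketched as follows. This is only an illustration of the idea, not the real LKCD v7 on-disk layout: the page size and the compressed/raw record format here are assumptions.

```python
# Sketch of pagewise deflate compression in the spirit of LKCD v7.
# The real header/record layout is NOT reproduced; storing each page
# as a (compressed?, payload) pair is an illustrative assumption.
import zlib

PAGE_SIZE = 4096  # one x86 page


def compress_pages(data: bytes):
    """Deflate each page independently; keep a page raw if it grows."""
    records = []
    for off in range(0, len(data), PAGE_SIZE):
        page = data[off:off + PAGE_SIZE]
        comp = zlib.compress(page)
        if len(comp) < len(page):
            records.append((True, comp))   # stored compressed
        else:
            records.append((False, page))  # incompressible: store raw
    return records


def decompress_pages(records):
    """Reassemble the original image from the per-page records."""
    return b"".join(zlib.decompress(p) if c else p for c, p in records)
```

Compressing each page independently is what lets a tool like "crash" seek to and decompress a single page without inflating the whole file.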
I'd say that compression does not help much; it can only speed things up when the dump is I/O-bound.
> WHICH IS BETTER?
> I wrote a small compression tool for the LKCD v7 format to see how
> effective the compression is, and it turned out that the time
> and size of compression were, not surprisingly, very similar to
> those of gzip.
> Compressing a 32GB dump file took about 40 minutes on a 3.0GHz
> Pentium 4 Xeon, which is not good enough: the dump without
> compression took only 5 minutes, so compression was eight times
> slower.
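The "eight times slower" figure falls straight out of the numbers quoted above; working the arithmetic through also gives the effective throughputs:

```python
# Back-of-the-envelope throughput for the 32 GB dump quoted above:
# 5 minutes uncompressed vs. 40 minutes with deflate.
size_mb = 32 * 1024          # 32 GB in MB
plain_s, deflate_s = 5 * 60, 40 * 60

plain_mb_s = size_mb / plain_s      # ~109 MB/s: roughly disk speed
deflate_mb_s = size_mb / deflate_s  # ~14 MB/s: clearly CPU-bound
slowdown = plain_mb_s / deflate_mb_s

print(f"uncompressed: {plain_mb_s:.0f} MB/s")
print(f"deflate:      {deflate_mb_s:.0f} MB/s")
print(f"slowdown:     {slowdown:.0f}x")
```

At ~14 MB/s the deflate pass is far below what the disk can sustain, which is why the compressed dump is CPU-bound rather than I/O-bound.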
....you probably want to look at the suspend2.net project. They have a
special compressor aimed at compressing exactly this kind of data,
fast enough to be an improvement.
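The trade suspend2 makes can be demonstrated without its actual compressor (an LZF-family algorithm, which is not in the Python standard library): even within deflate itself, a faster setting gives up some ratio to gain throughput. The sample data below is made up purely for illustration.

```python
# Speed-vs-ratio trade-off, shown with zlib levels 1 and 9 as a
# stand-in for "fast compressor vs. gzip-like compressor".
import time
import zlib

# Synthetic, fairly repetitive "memory image" sample (~2.7 MB).
data = (b"page table entry " * 64 + bytes(range(256))) * 2048

for level in (1, 9):
    t0 = time.perf_counter()
    comp = zlib.compress(data, level)
    dt = time.perf_counter() - t0
    print(f"level {level}: {len(comp) / len(data):.1%} of original size, "
          f"{len(data) / dt / 1e6:.0f} MB/s")
```

On repetitive memory contents the fast setting still shrinks the data substantially while compressing several times faster, which is exactly the property a crash-dump path needs.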
If you have Sharp Zaurus hardware you don't need... you know my address