> Is someone working on a way to overcome the 2GB file limitation in Linux? I
> currently back up several servers using a dedicated hard drive for the
> backups. Recently I saw one backup die saying the file size limit had been
> exceeded. I've never had good luck with tape backups: yes, they back up,
> but whenever I really need a file, it can't be retrieved.
I use dump's -B flag to ensure each archive stays just under 2GB. Together
with -M, this writes multiple volumes, appending a sequence number to the
filename as needed.
For example, on a /home dump of >2GB, the commands
# timestamp so each run gets a unique archive name
ddate=`date +%m%d%y-%H%M%S`
# -M: multi-volume to files; -B: volume size in 1 kB records
dump -0uM -b 32 -B 2097142 -f /mnt/home.$ddate.L0.dump \
     -L "abit:/home" /home
generate
/mnt/home.092801-130439.L0.dump001
/mnt/home.092801-130439.L0.dump002
each guaranteed to be 10 kB under 2GB (2 GiB = 2097152 kB, and 2097142 kB
per volume leaves a 10 kB margin). Keeping each volume under the limit also
avoids portability problems on systems without large-file support.
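
To read the set back, the matching restore should reassemble the volumes.
A minimal sketch, assuming the -M multi-volume flag of the restore(8) that
ships with the dump package (check the man page for your version):

# -M treats the -f argument as a prefix and reads <prefix>001,
# <prefix>002, ... in sequence; -i opens an interactive shell
# so individual files can be picked out of the dump
restore -iMf /mnt/home.092801-130439.L0.dump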
rgds,
tim.