On Thu, Aug 01, 2002 at 11:48:56AM -0600, Andreas Dilger wrote:
> > Problem: The pget -n feature of lftp is very nice if you want to
> > maximize your download bandwidth. However, when getting a large file,
> > such as the one I am getting, once the file has been successfully
> > retrieved, transferring it to another HDD or FTPing it to another
> > computer is very slow (800KB-1600KB/s).
>
> I find it hard to believe that this would actually make a huge
> difference, except in the case where the source is throttling bandwidth
> on a per-connection basis. Either your network is saturated by the
> transfer, or some point in between is saturated. I could be wrong, of
> course, and it would be interesting to hear the reasoning behind the
> speedup.
If some link along the path is saturated by 1000 competing connections,
you will get roughly 1% of the bandwidth with 10 concurrent connections,
instead of roughly 0.1% with a single connection. Right?
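
The arithmetic behind that claim follows from the idealized assumption
that a saturated link is shared roughly equally per TCP connection (real
TCP fairness is only approximate, and per-connection throttling at the
server changes the picture). A minimal sketch, using the connection
counts from the example above:

    # Idealized model: a saturated link is shared equally among all
    # TCP connections crossing it.
    def my_share(my_conns, other_conns):
        total = my_conns + other_conns
        return my_conns / total

    # 1000 competing connections on the bottleneck link:
    print("1 connection:   %.2f%%" % (100 * my_share(1, 1000)))   # ~0.10%
    print("10 connections: %.2f%%" % (100 * my_share(10, 1000)))  # ~0.99%

In lftp terms, that is the difference between a plain "get" and
"pget -n 10".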
--
Ragnar Kjørstad