Alternative to scp, transferring files between Linux machines by opening parallel connections


You could try using split(1) to break the file into pieces and then scp the pieces in parallel. The pieces can then be combined back into a single file on the destination machine with cat(1).

    # on local host
    split -b 1M large.file large.file.          # split into 1 MiB chunks
    for f in large.file.*; do scp "$f" remote_host: & done
    wait                                        # let all background scp jobs finish
    # on remote host
    cat large.file.* > large.file
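Before relying on this, it is worth checking that splitting and reassembling really is lossless. The sketch below exercises the split/cat round trip entirely locally (no remote host needed, so the scp step is omitted); the file name and sizes are just for illustration:

```shell
# Local round-trip check: split a file into 1 MiB chunks, reassemble
# them with cat, and confirm the result is byte-for-byte identical.
tmpdir=$(mktemp -d)
cd "$tmpdir"
head -c 3145728 /dev/urandom > large.file       # 3 MiB of test data
split -b 1M large.file large.file.              # same split as above
cat large.file.* > reassembled.file             # what the remote host would run
cmp large.file reassembled.file && echo "round trip OK"
cd / && rm -rf "$tmpdir"
```

This works because split(1) names the chunks in lexical order (`large.file.aa`, `large.file.ab`, ...), so the shell glob expands them back in the right sequence for cat.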


Take a look at rsync to see if it will meet your needs.
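If rsync fits your needs, a single invocation gives you resumable, compressed transfers without any manual splitting. A minimal sketch, assuming rsync is installed; `remote_host` and the destination path are placeholders:

```shell
# For a remote copy you would run something like:
#   rsync -avz --partial --progress large.file remote_host:/destination/dir/
# -z compresses on the wire; --partial keeps a half-transferred file so an
# interrupted copy can resume instead of starting over.
# The same flags, demonstrated with a local-to-local copy:
src=$(mktemp -d); dst=$(mktemp -d)
head -c 1048576 /dev/urandom > "$src/large.file"
rsync -az --partial "$src/large.file" "$dst/"
cmp "$src/large.file" "$dst/large.file" && echo "copied intact"
rm -rf "$src" "$dst"
```

Note that plain rsync uses a single connection, so it won't parallelize the way the split/scp approach does, but for transfers interrupted mid-way it is usually the simpler tool.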



Similar to Mike K's answer, check out https://code.google.com/p/scp-tsunami/ : it handles splitting the file, starts several scp processes to copy the parts, and then joins them again. It can also copy to multiple hosts.

 ./scpTsunami.py -v -s -t 9 -b 10m -u dan bigfile.tar.gz /tmp -l remote.host

That splits the file into 10 MB chunks and copies them using 9 parallel scp processes.