How can I send single file to multiple remote sites at the same time?


Solution 1

pdcp, from the pdsh package, is one option. pdsh was written to help with the management of HPC clusters; I've used it for that, and also for managing multiple non-clustered machines.

pdsh and pdcp use genders to define hosts and groups of hosts (a "group" is any arbitrary tag you choose to assign to a host, and hosts can have as many tags as you want.)

For example, if you had a group called 'webservers' in /etc/genders that included hostA, hostB, hostC, then pdcp -g webservers myscript.sh /usr/local/bin would copy myscript.sh into /usr/local/bin/ on all three hosts.
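As a rough sketch (the hostnames, tags and paths here are placeholders, not taken from a real setup), the genders file and the copy command might look like this:

$ cat /etc/genders
hostA   webservers,all
hostB   webservers,all
hostC   webservers,all
$ pdcp -g webservers myscript.sh /usr/local/bin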

Similarly, pdsh -g all uname -r would run uname -r on every host tagged with "all" in /etc/genders, with the output from each host prefixed with the host's name.

$ pdsh -g all uname -r
indra: 3.2.0-3-amd64
kali: 3.2.0-3-amd64
ganesh: 3.2.0-3-amd64
hanuman: 3.2.0-2-686-pae

pdsh commands and pdcp copies are executed in parallel (with limits and timeouts to prevent overloading of the originating system).
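Those limits can be tuned on the command line; for example (the values are illustrative, check pdsh(1) for the exact semantics of your version):

$ pdsh -g all -f 8 -t 10 -u 60 uptime

runs uptime on every host tagged "all", with at most 8 connections in flight at a time (-f), a 10-second connect timeout (-t), and a 60-second limit on each remote command (-u).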

When the command being run produces multi-line output, it can get quite confusing to read. Another program in the pdsh package, dshbak, can group the output by hostname for easier reading.
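For instance, you can pipe pdsh output through dshbak; the -c option additionally coalesces hosts whose output is identical:

$ pdsh -g all uname -r | dshbak -c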


After seeing all your comments, it's possible that pdsh and pdcp are overkill for your needs... they're really designed as a system administrator's tools rather than a normal non-root user's tools.

It may be that a simple shell script wrapper around scp is good enough for you. For example, here's an extremely simple, minimalist version of such a wrapper script.

#! /bin/bash

# a better version would use a command line arg (e.g. -h) to get a
# comma-separated list of hostnames, but hard-coding it here illustrates
# the concept well enough (the hostnames below are placeholders).
HOSTS="user@host1 user@host2"

# last argument is the target directory on the remote hosts
# (${!#} expands to the last positional parameter)
target_dir="${!#}"

# all but the last arg are the files to copy
files=("${@:1:$#-1}")

for h in $HOSTS; do
    scp "${files[@]}" "$h:$target_dir"
done
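Saved as, say, scp-to-all.sh (the name is arbitrary) and made executable, it would be invoked with the files first and the remote target directory last, e.g.:

$ ./scp-to-all.sh myscript.sh notes.txt /usr/local/bin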

Solution 2

Instead of sending the file to multiple targets at the same time, you could have the targets read it at the same time.

Transfer your file to an NFS export, mount that filesystem on your targets, and copy it from the NFS mount to the local destination on each target.
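A rough sketch of that approach, assuming a hypothetical NFS server called nfsserver exporting /export/share (all names and paths are illustrative):

$ cp bigfile /export/share/                   # on the machine holding the file
$ mount -t nfs nfsserver:/export/share /mnt   # on each target (as root)
$ cp /mnt/bigfile /usr/local/dest/            # on each target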

To do this manually and concurrently on all targets, you could use cluster-ssh (cssh).
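For example (hostnames are placeholders), running

$ cssh target1 target2 target3

from an admin machine opens one terminal per target plus a small input window; anything typed there is broadcast to all of the sessions, so the mount-and-copy commands only need to be typed once.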


Comments

  • N. F. (over 1 year): Can scp be used to send a single file to multiple remote servers at the same time? If so, how? If not, what's the alternative?