How do I run a local bash script on remote machines via ssh?
Solution 1
You can pass a script and have it execute ephemerally by piping it in and executing a shell.
e.g.
echo "ls -l; echo 'Hello World'" | ssh me@myserver /bin/bash
Naturally, the "ls -l; echo 'Hello World'"
part could be replaced with a bash script stored in a file on the local machine.
e.g.
cat script.sh | ssh me@myserver /bin/bash
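If the script takes arguments, `bash -s` lets you pass them while the script itself still arrives on stdin. A minimal sketch (script path, host, and argument are placeholders), shown with a local bash standing in for the ssh part:

```shell
# Create a stand-in for script.sh that uses a positional argument.
printf 'echo "arg1=$1"\n' > /tmp/demo.sh

# 'bash -s' reads the script from stdin; everything after -- becomes $1, $2, ...
# Over ssh this would be: ssh me@myserver 'bash -s -- hello' < /tmp/demo.sh
bash -s -- hello < /tmp/demo.sh   # prints: arg1=hello
```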
Cheers!
Solution 2
There are several ways to do it.
1:
ssh user@remote_server 'bash -s' < localfile
2:
cat localfile | ssh user@remote_server
3:
ssh user@remote_server "$(< localfile)"
Number 3 is my preferred way; it allows interactive commands, e.g. sudo -S service nginx restart.
(#1 and #2 will consume the rest of the script as input for the password prompt when you use sudo -S.)
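The reason method 3 stays interactive: $(< localfile) is a bash shorthand for $(cat localfile), so the script body is expanded locally and sent as a single command-line argument, leaving the remote shell's stdin attached to your terminal. A local sketch of the mechanism (file name and contents are placeholders):

```shell
# A stand-in for localfile.
printf 'echo hello from script' > /tmp/localfile

# The calling shell expands $(< /tmp/localfile) to the file's contents;
# over ssh this would be: ssh user@remote_server "$(< /tmp/localfile)"
bash -c "$(< /tmp/localfile)"   # prints: hello from script
```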
Solution 3
I would recommend python's Fabric for this purpose:
#!/usr/bin/python
# ~/fabfile.py
from fabric.api import *
env.hosts = ['host1', 'host2']
def deploy_script():
    put('your_script.sh', 'your_script.sh', mode=0755)
    sudo('./your_script.sh')
# from shell
$ fab deploy_script
You should be able to use the above to get started; consult Fabric's excellent documentation for the rest. As an addendum, it's entirely possible to write your script wholly within Fabric, so no copying is needed. Note that to change the script on all machines, you only need to edit the local copy and redeploy. Furthermore, with a little more than basic usage of the API, you can modify the script based on which host it is currently running on and/or other variables. It's a sort of pythonic Expect.
Solution 4
This is exactly what Ansible is used for. There is no agent; you just have to create a text file called:
/etc/ansible/hosts
with content that looks something like:
[webhosts]
web[1-8]
This would specify that machines "web1, web2...web8" are in the group "webhosts". Then you can do things like:
ansible webhosts -m service -a "name=apache2 state=restarted" --sudo
to restart the apache2 service on all your machines, using sudo.
You can do on the fly commands like:
ansible webhosts -m shell -a "df -h"
or you can run a local script on the remote machine:
ansible webhosts -m script -a "./script.sh"
or you can create a playbook (look up the documentation for details) with a complete configuration that you want your servers to conform to and deploy it with:
ansible-playbook webplaybook.yml
Basically you can start using it as a command line tool for running commands on multiple servers and expand its usage out into a complete configuration tool as you see fit.
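For the playbook step, a minimal webplaybook.yml targeting the [webhosts] group above might look like the following sketch (the task name and script path are placeholders; newer Ansible versions use become: yes where older ones used --sudo):

```yaml
# webplaybook.yml -- run a local script on every host in the webhosts group
- hosts: webhosts
  become: yes
  tasks:
    - name: run local script on the remote hosts
      script: ./script.sh
```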
Solution 5
As explained in this answer, you can use a heredoc:
ssh user@host <<'ENDSSH'
#commands to run on remote host
ENDSSH
You have to be careful with heredoc, because it just sends text, but it doesn't really wait for the response. That means it will not wait for your commands to be executed.
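One detail worth knowing: quoting the delimiter (<<'ENDSSH') stops the local shell from expanding variables, so $HOSTNAME and friends are evaluated on the remote side; an unquoted <<ENDSSH would expand them locally before anything is sent. A local sketch with bash standing in for ssh user@host:

```shell
# Quoted delimiter: the body is sent verbatim, so $name is expanded
# by the receiving shell, not by the local one.
bash <<'ENDSSH'
name=world
echo "hello $name"
ENDSSH
# prints: hello world
```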
tremoloqui
Updated on September 17, 2022
Comments
-
tremoloqui over 1 year
I am looking for a way to push configuration from one central machine to several remote machines without the need to install anything on the remote machines.
The aim is to do something like you would find with tools like cfengine, but on a set of machines that don't have agents set up. This might actually be a good technique for setting up cfagent on a set of existing remote machines.
-
AlikElzin-kilaka almost 9 years
Similar on SO: stackoverflow.com/q/305035/435605
-
MoJo over 8 years
The actual question has 23 upvotes where the duplicate on SO has 55 :P
-
tremoloqui over 13 years
The reason I don't want to upload the script is so it doesn't have to be managed and have the risks that you mentioned. Also, it seems to me that it's simpler than the multi-step process of uploading, processing and (optionally) deleting.
-
tremoloqui over 13 years
I don't think this exactly answers the question, but I like the idea and I could see Fabric as a useful tool.
-
Izkata over 12 years
@tremoloqui Fabric is a python wrapper around ssh - nothing needs to be installed on the target machines, save the script being pushed. Which, if rewritten as a series of Fabric commands (using run and sudo), isn't even needed.
-
Sharjeel about 11 years
How do I make this run as sudo on the remote system? e.g. If I was logged on the remote server I would usually run this as sudo -u testuser script.sh
-
Abhimanyu Srivastava about 11 years
What if the script I am calling involves user interactions?
-
ptman almost 10 years
I like ansible as much as the next guy, but if he's asking about a script, then ansible has a really nice script module:
ansible webhosts -m script -a script.sh
-
seumasmac almost 10 years
All the other answers involve a bash script, but it's not what he specifically asked for. He just said pushing config to remote machines. But good mention of the script module :)
-
Admin over 9 years
Regarding running a local script on a remote machine -- is there a way to send a variable as an argument to the remote machine? i.e. along with the script, I want to send a variable (having multiple lines) as an argument to the remote machine. The script then intends to use the variable.
-
chicks over 8 years
Please vote up this answer! This is the way to do it!
-
seumasmac over 8 years
@ptman I have only just noticed that while he doesn't mention a script in the question, he does in the title! Sorry. I've updated.
-
Gqqnbig about 3 years
How would I suppress the banner then?
-
Robin A. Meade about 3 years
What benefit does this have over: ssh vm24 -t "$(< shell-test.sh)"? Is it to ensure that the remote shell is bash instead of ksh or something?
-
Robin A. Meade almost 3 years
The /bin/bash part is not necessary unless you are worried that the user's login shell on the remote machine might be something other than bash and you want to ensure that bash is used, or if you want to supply invocation options to bash, like -s. If that's really the case, you can make it more efficient by changing it to exec /bin/bash.
-
EndlosSchleife about 2 years
I highly doubt "it doesn't really wait" in general. This just passes commands to ssh's stdin like other approaches in other answers. As with other commands/scripts, the (remote) shell exits when those are done (except for background jobs with & or commands that send themselves to the background, but that has nothing to do with either ssh or here documents).