How to run commands in a queue


Solution 1

Append & to the end of your command to send it to the background, then wait before running the next one; wait blocks until all background jobs have finished, and the terminal buffers whatever you type in the meantime. For example:

$ command1 &
$ wait; command2 &
$ wait; command3 &
$ ...

Solution 2

The at utility, best known for running commands at a specified time, also has a queue feature and can be asked to start running commands now. It reads the command to run from standard input.

echo 'command1 --option arg1 arg2' | at -q myqueue now
echo 'command2 ...' | at -q myqueue now

The batch command is equivalent to at -q b -m now (-m meaning that the command output, if any, will be mailed to you, as cron does). Not all Unix variants support queue names (-q myqueue); you may be limited to a single queue called b. Linux's at further restricts queue names to a single letter.
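
As a sketch of how that might look on Linux, using a single-letter queue m and placeholder cp commands (atq -q lists a queue's pending jobs); note the comment thread further down, which suggests that whether same-queue jobs are strictly serialized varies between systems:

echo 'cp -a /src/big1 /dest/' | at -q m now
echo 'cp -a /src/big2 /dest/' | at -q m now
atq -q m    # list the jobs still pending in queue "m"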

Solution 3

Both Brad's and Mankoff's solutions are good suggestions. Another, similar to a combination of the two, is to use GNU Screen to implement your queue. It has the advantage of running in the background, you can check on it whenever you like, and queueing up new commands simply pastes them into the session's input buffer to be executed once the previous commands exit.

First, run:

$ screen -d -m -S queue

(incidentally, now's a good time to play with some awesome .screenrc files)

That will spawn a detached background screen session for you named queue.

Now, queue up as many commands as you like:

screen -S queue -X stuff "echo first; sleep 4; echo second^M"

I'm doing multiple commands in the above just for testing. Your use case would probably look more like:

screen -S queue -X stuff "echo first^M"
screen -S queue -X stuff "echo second^M"

Note that the "^M" in the lines above is a literal carriage return, an embedded newline that will be interpreted later, after screen stuffs the text into your existing bash shell. Type "Ctrl-V, Enter" (or "Ctrl-V, Ctrl-M") to produce that character.

It'd be pretty easy to write a small shell script that automates this and queues up commands; a sketch follows the next command. Then, whenever you want to check the status of your background queue, you re-attach via:

screen -S queue -r
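
Automating the queueing could be as simple as the following sketch, where the script name enqueue is hypothetical and the session is assumed to have been started with screen -d -m -S queue as above:

#!/bin/sh
# enqueue (hypothetical helper): "type" its arguments into the detached
# "queue" session, followed by a carriage return, so the shell in that
# session runs the command as soon as it becomes idle.
screen -S queue -X stuff "$*$(printf '\r')"

You would then queue work with enqueue 'echo first; sleep 4; echo second', quoting the argument whenever it contains shell metacharacters; printf '\r' supplies the carriage return, so no literal ^M needs to be embedded in the script.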

Technically, you don't even need to name your screen session and it will work fine, but once you get hooked on it, you're going to want to leave one running all the time anyway. ;-)

Of course, if you do that, another good approach is to name one of the session's windows "queue" and use:

screen -S queue -p queue -X stuff "command"

Solution 4

There is a utility that I have used with great success for exactly the use case that you are describing. Recently I moved my primary laptop to new hardware, which involved moving some files to a NAS and the rest of them to the new machine.

This is how I did it.

  1. Set up all machines involved with a network connection so that they can reach each other.

  2. On the machine you are moving files from (hereafter called the source machine), install rsync with apt-get install rsync, along with the secret sauce, Task Spooler (website http://vicerveza.homeunix.net/~viric/soft/ts/). If you are on Debian, the package name for ts is task-spooler and the executable is renamed to tsp to avoid a name clash with the ts executable from the moreutils package. The Debian package linked from the website installs perfectly well on Ubuntu with dpkg -i task-spooler_0.7.3-1_amd64.deb or similar.

  3. Also make sure that all machines have SSH installed with apt-get install openssh-server. On the source machine you need to set up SSH to allow passwordless login to the target machines. The method most will use is public-key authentication with ssh-agent (see https://www.google.se/search?q=ssh+public+key+authentication for examples of that), but I sometimes use an easier method which works just as well with password authentication. Add the following to your SSH client configuration (either ~/.ssh/config or /etc/ssh/ssh_config):

    Host *
         ControlMaster auto
         ControlPath ~/.ssh/master-%r@%h:%p
    

    Then open one terminal on the source machine, log in with ssh target1, authenticate as usual, and leave this terminal open. Notice that there is now a socket file named ~/.ssh/master-user@target1:22; this file keeps an authenticated master session open and allows subsequent passwordless connections for user (as long as the connection uses the same target hostname and port).

    At this point you need to verify that you can login without being prompted for authentication to the target machines.

  4. Now run ts rsync -ave ssh bigfile user@target1: for a single file, or ts rsync -ave ssh bigdir user@target1: for a directory. With rsync it is important not to include a trailing slash on the directory (bigdir as opposed to bigdir/); with a trailing slash, rsync copies the directory's contents instead, the equivalent of bigdir/* in most other tools.

    Task Spooler returns the prompt immediately and lets you queue many of these commands in succession (a sketch follows this list). Inspect the run queue by running ts without arguments.
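
As a sketch, queueing several transfers in one go might look like this, with placeholder directory names and target host:

for d in Music Photos Videos; do    # placeholder directories
    ts rsync -ave ssh "$d" user@target1:
done
ts    # no arguments: list job ids, states, and output files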

Task Spooler has many features, such as rearranging the run queue and running a specific job only if another job finished successfully. View the help with ts -h. I sometimes inspect a command's output while it is running with ts -c.

There are other ways of doing this, but for your use case they would all involve Task Spooler. I chose rsync over SSH because it preserves file timestamps, which copying over SMB would not.

Solution 5

I need stuff like this fairly frequently too. I wrote a little utility called after that executes a command whenever some other process has finished. It looks like this:

#!/usr/bin/perl
use strict;
use warnings;

# Wait for an arbitrary PID to exit, then replace this process with
# the queued command.
my $pid = shift;
die "Usage: $0 <pid> <command...>" unless defined $pid && $pid =~ /^\d+$/ && @ARGV;

print STDERR "Queueing process $$ after process $pid\n";

# Poll /proc until the watched process disappears (Linux-specific).
sleep 1 while -e "/proc/$pid";

# exec keeps our PID, so further "after" commands can chain onto this one.
exec @ARGV;

You then run it like so:

% command1 arg1 arg2 ...  # creates pid=2853
% after 2853 command2 arg1 arg2 ... # creates pid=9564
% after 9564 command3 arg1 arg2 ...

The big advantage of this over some other approaches is that the first job doesn't need to be run in any special way; you can follow any process with your new job.
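
Because after execs the queued command, each after process's PID becomes the PID of the command it runs, so chains compose. When every command is launched from the same shell, $! (the PID of the most recent background job) saves copying PIDs by hand; a small sketch with placeholder paths:

% cp -a /src/big1 /dest/ &
% after $! cp -a /src/big2 /dest/ &   # $! here is big1's PID
% after $! cp -a /src/big3 /dest/ &   # and big2's here, since exec preserved it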



Comments

  • Svish
    Svish over 1 year

    I need to do a lot of copying of various files to various folders. I can add all my copy commands to a bash script and then run that, but then I must wait until it finishes if I want to add more commands to that copying "queue".

    Is there a way I can run commands as a queue and sort of add more commands to that queue while things are running?

    Explained in a different way, I want to start a long running task. While that is running I want to start another one that does not actually start until the first one is done. And then add another after that last one and so on. Is this possible somehow?

    • holmb
      holmb about 11 years
      May I suggest that you accept an answer that solves your problem? It seems that there are a couple of them that would work for you.
  • Erik Aronesty
    Erik Aronesty over 11 years
    yay! this syntax is awesome when you've kicked something off and then go looking at stackoverflow for a way to queue up something after it
  • Ken Williams
    Ken Williams over 11 years
    It seems like this basically just "types" the command in the running screen session, so that when the shell gains control again after the first command finishes, this command will start. In other words, it's equivalent to @mankoff's solution, but uses a fancier tool to do the typing.
  • Ken Williams
    Ken Williams over 11 years
    If you change your mind on any of those commands in the meantime, what can be done to abort the wait? It seems impervious to all my killshots.
  • Ken Williams
    Ken Williams over 11 years
    Difficult to use in practice, since it requires that every job you want to run needs to be modified to keep lock files, and needs to know (or accept as parameters) which lock files to use. Whenever possible, it's best to have the queueing external to the job.
  • Jordan
    Jordan over 11 years
    Pretty much -- the main difference is that the screen solution supports better automation. You can do some stuff by hand, other things with scripts, all with a simple interface.
  • holmb
    holmb about 11 years
    Nice script @ken. Though I'd recommend trying out Task Spooler since you seem to require this from time to time. See my answer.
  • Ken Williams
    Ken Williams about 11 years
    Looks neat, thanks. The one thing missing from TS seems to be the ability to follow jobs that weren't started under TS. Though I guess you could start a new TS job whose purpose is just to wait for an existing process to finish.
  • Alex
    Alex over 9 years
    You know, I need to change my glasses; I missed the part of your answer about the existing task-spooler :-) ... Anyway, what's in your comment is a very good suggestion, and I'll do that very soon.
  • Alex
    Alex over 9 years
    Done: I deleted and re-created the repository with progressive commits, making only a couple of mistakes (I should let local commits settle longer before pushing).
  • wds
    wds about 9 years
    On Linux at least this does not seem to wait for the previous command to finish.
  • Gilles 'SO- stop being evil'
    Gilles 'SO- stop being evil' about 9 years
    @wds Works for me on Debian wheezy. Where and how did it break for you? Note that at only waits for the command to finish; if the command launches subprocesses that outlive their parent, at doesn't wait for the subprocesses.
  • wds
    wds about 9 years
    I tried on CentOS 6. I tested with just a simple "sleep x". I'm not sure what at does to run that but I assume it starts a subshell and that subshell should wait for the result (a sleep call does not return immediately).
  • Gert Sønderby
    Gert Sønderby about 9 years
    You just kill the command. The wait simply stops the job until previous ones finish.
  • winwaed
    winwaed over 8 years
    As well as queue names, 'now' doesn't appear to be standard either: bash on Ubuntu is complaining about it...
  • Gilles 'SO- stop being evil'
    Gilles 'SO- stop being evil' over 8 years
    @winwaed Oh, it's now, not -t now, sorry. I hadn't noticed I had it wrong in the sample code.
  • winwaed
    winwaed over 8 years
    @Gilles: Quick reply on an old thread. In the end I implemented a Python workaround for what I wanted, but can come back to this if it turns out a pain in testing! (Bash calling Python calling Bash could be cleaner...)
  • Michael
    Michael about 6 years
    Will this work with nohup?
  • Michael
    Michael about 6 years
    It doesn't seem to respect the order of the queue though...
  • Konstantin
    Konstantin about 5 years
    @Jordan Very good, it works for me. But what is "stuff" for after -X?
  • Jordan
    Jordan about 5 years
    @Konstantin gnu.org/software/screen/manual/html_node/Paste.html#Paste (TL;DR -- stuff whatever follows into the terminal as if you typed it)
  • Konstantin
    Konstantin about 5 years
    @Jordan Oh, I see now. It is a command. I thought it was an arbitrary string.
  • Sam Sirry
    Sam Sirry about 4 years
    And what if I don't care what the return code is?
  • Sam Sirry
    Sam Sirry about 4 years
    Oh, I found it: a semicolon separates commands on a single line.