
How to run given function in Bash in parallel?


sem is part of GNU Parallel and is made for this kind of situation.

for i in "${list[@]}"
do
    for j in "${other[@]}"
    do
        # some processing in here - 20-30 lines of almost pure bash
        sem -j 4 dolong task
    done
done
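
If the rest of the script should only continue after every job queued through sem has finished, sem also accepts a --wait flag; a minimal follow-up after the loops might look like this:

sem --wait    # block until all jobs started with sem have completed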

If you like the function approach better, GNU Parallel can do the dual for loop in one go:

dowork() {
  echo "Starting i=$1, j=$2"
  sleep 5
  echo "Done i=$1, j=$2"
}
export -f dowork

parallel dowork ::: "${list[@]}" ::: "${other[@]}"
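
If you also want to cap concurrency at 4 jobs, as the sem example above does, GNU Parallel's -j option does that; a minimal variant of the last line:

parallel -j 4 dowork ::: "${list[@]}" ::: "${other[@]}"   # at most 4 jobs at a time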


Edit: Please consider Ole's answer instead.

Instead of a separate script, you can put your code in a separate bash function. You can then export it, and run it via xargs:

#!/bin/bash

dowork() {
    sleep $((RANDOM % 10 + 1))
    echo "Processing i=$1, j=$2"
}
export -f dowork

for i in "${list[@]}"
do
    for j in "${other[@]}"
    do
        printf "%s\0%s\0" "$i" "$j"
    done
done | xargs -0 -n 2 -P 4 bash -c 'dowork "$@"' --
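
The snippet assumes list and other are already populated; for a self-contained test you could define placeholder arrays first (values here are purely illustrative):

# Hypothetical input arrays for the snippet above
list=(a b c)
other=(1 2 3)

In the xargs call, -0 reads the NUL-delimited values emitted by printf, -n 2 hands one i/j pair to each dowork invocation, and -P 4 keeps up to four invocations running in parallel.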


Solution to run multi-line commands in parallel:

for ...your_loop...; do
  if test "$(jobs | wc -l)" -ge 8; then
    wait -n
  fi
  {
    any bash commands here
  } &
done
wait

In your case:

for i in "${list[@]}"
do
  for j in "${other[@]}"
  do
    if test "$(jobs | wc -l)" -ge 8; then
      wait -n
    fi
    {
      your
      multi-line
      commands
      here
    } &
  done
done
wait

If there are already 8 bash jobs running, wait -n waits for at least one of them to complete. Once fewer jobs are running, the loop starts new ones asynchronously.
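
As a self-contained sketch of the same pattern (the arrays and the job body are placeholders, not from the original; note that wait -n requires bash 4.3 or newer):

#!/bin/bash
# Placeholder input data, purely illustrative
list=(a b c d)
other=(1 2 3 4)

for i in "${list[@]}"
do
  for j in "${other[@]}"
  do
    # Throttle: with 8 or more background jobs, wait for one to finish
    if test "$(jobs | wc -l)" -ge 8; then
      wait -n
    fi
    {
      echo "working on i=$i, j=$j"
      sleep 1
    } &
  done
done
wait   # wait for the remaining background jobs before exiting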

Benefits of this approach:

  1. It's very easy for multi-line commands. All your variables are automatically "captured" in scope, no need to pass them around as arguments
  2. It's relatively fast. Compare this, for example, to parallel (quoting the official man page):

parallel is slow at starting up - around 250 ms the first time and 150 ms after that.

  3. Only needs bash to work.

Downsides:

  1. There is a possibility that there were 8 jobs when we counted them, but fewer when we started waiting. (This happens if a job finishes in those milliseconds between the two commands.) This can make us wait with fewer jobs running than required. However, the loop resumes as soon as at least one job completes, or immediately if there are 0 jobs running (wait -n exits immediately in this case).
  2. If you already have some commands running asynchronously (&) within the same bash script, you'll have fewer worker processes in the loop.