Run Multiple Bash Tasks Asynchronously to Save Time

Tue Mar 01 2022

I have a CI job consisting of multiple tasks, and it takes a lot of time. If I run all the tasks sequentially, the job takes far too long, so I am thinking of running them in multiple background processes to save time, much like Promise.all() in JavaScript.

Run a Command in a Background Process

To do so, I simply add an & symbol at the end of the command, and it runs in a background process:

```bash
#!/usr/bin/env bash

(npm install) &
```
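As an aside, the jobs builtin lists the shell's current background jobs, which becomes useful later on; here is a minimal sketch, reusing the same npm install example:

```bash
#!/usr/bin/env bash

(npm install) &

# List the background jobs of this shell
jobs

# List only their PIDs
jobs -p
```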

Background Process Control

A background process can be controlled and awaited using its PID:

```bash
#!/usr/bin/env bash

# Run the process in the background
(npm install) &

# Get the PID of the latest background process
pid=$!

# Do something else, like killing the process if needed
# kill $pid

# Wait for the process to finish
wait $pid
```
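Since wait exits with the status of the awaited process, the result can also be checked explicitly afterwards. A minimal sketch, again assuming the npm install example:

```bash
#!/usr/bin/env bash

(npm install) &
pid=$!

# wait returns the exit status of the awaited process
wait "$pid"
status=$?

if [ "$status" -ne 0 ]; then
  echo "npm install failed with status $status"
fi
```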

Waiting for Multiple Processes to Finish

My CI job builds multiple packages, so I need to wait for all of the background processes to finish. To do so, I wrote this script to manage the processes:

```bash
#!/usr/bin/env bash

# Declare a variable to count failures
FAIL=0

(cd ./package0 && npm install && npm run build) &
(cd ./package1 && npm install && npm run build) &
(cd ./package2 && npm install && npm run build) &

# Loop over the PIDs of all background processes
for job in $(jobs -p)
do
  # Wait for each process to finish; if it fails, increment FAIL
  wait "$job" || ((FAIL++))
done

if [ "$FAIL" == "0" ]; then
  echo "Building successfully finished."
else
  # Exit the program with an error code if any of the processes failed
  echo "$FAIL build(s) failed"
  exit 1
fi
```
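For reference, an equivalent variant of this script, assuming the same package0/package1/package2 layout, collects the PIDs with $! into an array instead of reading them back from jobs -p. This is only a sketch, not the script I actually run:

```bash
#!/usr/bin/env bash

FAIL=0
pids=()

for pkg in package0 package1 package2; do
  # Run each build in a background subshell
  (cd "./$pkg" && npm install && npm run build) &
  # Remember the PID of the job we just started
  pids+=("$!")
done

for pid in "${pids[@]}"; do
  # Count every build that exits with a non-zero status
  wait "$pid" || ((FAIL++))
done

if [ "$FAIL" == "0" ]; then
  echo "Building successfully finished."
else
  echo "$FAIL build(s) failed"
  exit 1
fi
```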
