I am trying to run a function in multiple processes in bash (on CentOS).
The function applies a different filter on each call via curl:
STEP=50

# run one filtered curl request; $1 is interpolated into the filter parameter
function filtering() {
    local i="$1"
    curl -s -g -X GET "https://url.com?filter=${i}_filter"
}

for (( i=0; i<=TOTAL; i+=STEP )); do
    filtering "$i"
done
wait
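For comparison I also capture the serial output in a file (serial_combined.out is just a name I picked), so that there is a stable baseline to diff against later:

# baseline: save the output of the serial run for later comparison
for (( i=0; i<=TOTAL; i+=STEP )); do
    filtering "$i"
done > serial_combined.out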
It works when run as is. However, when I run the function in 10 parallel processes it behaves oddly, producing different results every time, mostly unreliable ones (judging by the numbers it produces compared to the serial filtering above). My code is:
PARALLEL_PROCESSES=10
ACTIVE_PROCESSES=0

for (( i=0; i<=TOTAL; i+=STEP )); do
    filtering "$i" > /dev/null 2>&1 &
    (( ACTIVE_PROCESSES++ ))
    if (( ACTIVE_PROCESSES >= PARALLEL_PROCESSES )); then
        wait -n 2> /dev/null
        (( ACTIVE_PROCESSES-- ))
    fi
done
wait
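One debugging idea I had (only a sketch; TMP_RESULTS and parallel_combined.out are made-up names, and the filtering function, TOTAL and STEP are assumed from above) is to keep each job's output in its own file instead of discarding it, then stitch the files together in loop order so the parallel run can be diffed against the serial baseline:

TMP_RESULTS=$(mktemp -d)   # hypothetical scratch directory for per-job output
ACTIVE_PROCESSES=0

for (( i=0; i<=TOTAL; i+=STEP )); do
    # keep each job's stdout and stderr instead of sending them to /dev/null
    filtering "$i" > "$TMP_RESULTS/$i.out" 2> "$TMP_RESULTS/$i.err" &
    (( ACTIVE_PROCESSES++ ))
    if (( ACTIVE_PROCESSES >= PARALLEL_PROCESSES )); then
        wait -n
        (( ACTIVE_PROCESSES-- ))
    fi
done
wait

# reassemble the results in loop order so they are directly comparable
for (( i=0; i<=TOTAL; i+=STEP )); do
    cat "$TMP_RESULTS/$i.out"
done > parallel_combined.out

The idea is that stdout interleaving between background jobs cannot mix up individual results if each job writes to its own file.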
I would greatly appreciate practical advice on how to debug what is going on/wrong inside, and any alternative suggestions for achieving my goal.
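As an alternative I have also been looking at letting xargs do the throttling instead of counting processes by hand (again only a sketch, assuming GNU xargs and that STEP and TOTAL are set as above):

# export the function so the bash -c children spawned by xargs can call it
export -f filtering

# seq emits 0, STEP, 2*STEP, ... up to TOTAL; xargs runs at most 10 jobs at a time
seq 0 "$STEP" "$TOTAL" | xargs -n 1 -P 10 bash -c 'filtering "$1"' _

Would that be a more reliable way to cap the number of concurrent curl calls?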