Speed up bash script with multithreading?

I have a bash script that I put together to merge multiple packet captures based on a common filter. I'm running daemonlogger on the backend, which rolls pcap files based on size, so it's sometimes hard to get the full picture because the data I'm looking for may be in one pcap file and the rest in another. My biggest complaint is that I can't speed this process up: it only processes one pcap at a time. Does anyone have suggestions on how to speed it up with multiple subprocesses or multiple threads?

#!/bin/bash
echo '[+] example tcp dump filters:'
echo '[+] host 1.1.1.1'
echo '[+] host 1.1.1.1 dst port 80'
echo '[+] host 1.1.1.1 and host 2.2.2.2 and dst port 80'
echo 'tcpdump filter:'
read FILTER
cd /var/mycaps/
DATESTAMP=$(date +"%m-%d-%Y-%H:%M")
# make a specific folder to drop the filtered pcaps in
mkdir /var/mycaps/temp/$DATESTAMP
# iterate over all pcaps and check for an instance of your filter
for file in $(ls *.pcap); do
        tcpdump -nn -A -w temp/$DATESTAMP/$file -r $file $FILTER
        # remove empty pcaps that don't match (24 bytes = pcap global header only)
        if [ "`ls -l temp/$DATESTAMP/$file | awk '{print $5}'`" = "24" ]; then
                rm -f "temp/$DATESTAMP/$file"
        fi
done
echo '[+] Merging pcaps'
# cd to your pcap directory 
cd /var/mycaps/temp/${DATESTAMP}
# merge all of the pcaps into one file and remove the separated files
mergecap *.pcap -w merged.pcap
rm -f original.*
echo "[+] Done. your files are in $(pwd)"

Run the loop body in the background, then wait for all the background jobs to finish before continuing.

max_jobs=10   # For example
job_count=0
for file in *.pcap; do   # Don't iterate over the output of ls
    (tcpdump -nn -A -w temp/"$DATESTAMP"/"$file" -r "$file" $FILTER
    # remove empty pcaps that don't match. Use stat to get the file size
    if [ "$(stat -c %s "temp/$DATESTAMP/$file")" = 24 ]; then
            rm -f "temp/$DATESTAMP/$file"
    fi
    ) &
    job_count=$((job_count+1))
    if [ "$job_count" -ge "$max_jobs" ]; then
        wait
        job_count=0
    fi
done
wait
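
One downside of batching with a bare `wait` is that each batch runs only as fast as its slowest job. An alternative is to let `xargs -P` keep a fixed number of workers busy continuously. Here's a sketch under the assumption that `DATESTAMP` and `FILTER` are already set as in your script (the `filter_one` helper name is mine):

```shell
#!/bin/bash
# Assumes DATESTAMP and FILTER are set as in the original script,
# and that we're in /var/mycaps with temp/$DATESTAMP already created.
export DATESTAMP FILTER

filter_one() {
    file=$1
    tcpdump -nn -A -w "temp/$DATESTAMP/$file" -r "$file" $FILTER
    # Drop captures that matched nothing (24 bytes = pcap global header only)
    if [ "$(stat -c %s "temp/$DATESTAMP/$file")" = 24 ]; then
        rm -f "temp/$DATESTAMP/$file"
    fi
}
export -f filter_one   # make the function visible to the child bash below

# -print0 / -0 keeps odd filenames intact; -P 10 caps concurrency at 10,
# and a new worker starts as soon as any previous one finishes.
find . -maxdepth 1 -name '*.pcap' -print0 |
    xargs -0 -n 1 -P 10 bash -c 'filter_one "$1"' _
```

Unlike the batch-and-`wait` loop, this keeps all 10 slots filled until the file list is exhausted. GNU parallel offers the same pattern with nicer output handling if it's available on your box.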