I've put together a simple little bash script that iterates through 100-200+ 2GB packet captures (from daemonlogger). It prompts the user for a tcpdump match filter, then compiles all the matching packets from each individual capture into one merged cap. One thing I've run into: I'd like to make this faster by searching several packet captures at once... but not by simply backgrounding everything with &. (Tried that; it basically brought the system to its knees trying to load tons of 2GB pcaps into memory at once. Ha!) Can anyone tell me how to say, in a for loop, "I want to run only two or three iterations at a time"?
#!/bin/bash
echo '[+] example tcpdump filters:'
echo '[+] host 1.1.1.1'
echo '[+] host 1.1.1.1 dst port 80'
echo '[+] host 1.1.1.1 and host 2.2.2.2 and dst port 80'
echo 'tcpdump filter:'
read -r FILTER
cd /var/packet_recorder/ || exit 1
DATESTAMP=$(date +"%m-%d-%Y-%H:%M")
# make a specific folder to drop the filtered pcaps in
mkdir -p "/var/packet_recorder/temp/$DATESTAMP"
# iterate over all pcaps and check for an instance of your filter
# ($FILTER is left unquoted on purpose: tcpdump joins its
#  arguments into a single filter expression)
for file in *.pcap; do
    tcpdump -nn -A -w "temp/$DATESTAMP/$file" -r "$file" $FILTER
    # remove pcaps that are only the 24-byte pcap file header (no matches)
    if [ "$(wc -c < "temp/$DATESTAMP/$file")" -eq 24 ]; then
        rm -f "temp/$DATESTAMP/$file"
    fi
done
echo '[+] Merging pcaps'
# cd to your pcap directory
cd "/var/packet_recorder/temp/${DATESTAMP}" || exit 1
# merge all of the pcaps into one file and remove the separated files
mergecap -w merged.pcap *.pcap
rm -f InternetBorder.*
echo "[+] Done. Your files are in $(pwd)"
Posted 2015-08-02 03:49:29
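One pure-bash way to run only a few loop iterations at a time is to background each command and `wait` after every batch. A minimal sketch, where the `echo` is a stand-in for the real tcpdump call and the batch size of 3 is arbitrary:

```shell
#!/bin/bash
# Run loop iterations in batches of 3 background jobs,
# waiting for each batch to finish before launching the next.
out=$(mktemp)
batch=3
count=0
for i in 1 2 3 4 5 6 7; do
    # stand-in for the real work, e.g. tcpdump ... -r "$file" ...
    ( echo "processed $i" >> "$out" ) &
    count=$((count + 1))
    if [ $((count % batch)) -eq 0 ]; then
        wait    # block until the current batch of 3 finishes
    fi
done
wait    # catch the final partial batch
wc -l < "$out"    # prints 7
```

This never has more than 3 jobs in flight, at the cost of waiting for the slowest job in each batch before starting the next one.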
I recently learned from this question about using GNU Parallel, or xargs -P from GNU findutils, to solve problems like this.
Using xargs -P (assuming there are no spaces in the paths):
# iterate over all pcaps and check for an instance of your filter
# process up to 3 files at a time
ls *.pcap | xargs -n1 -P3 -I{} tcpdump -nn -A -w temp/$DATESTAMP/{} -r {} $FILTER
# remove empty pcaps that dont match (remove files whose size is 24)
# remove empty pcaps that dont match (remove files whose size is 24)
# (head -n -1 drops the "total" line wc prints at the end)
wc -c temp/$DATESTAMP/*.pcap | head -n -1 |
while read -r size path; do
    if [[ "$size" = 24 ]]; then
        rm -f "$path"
    fi
done
https://stackoverflow.com/questions/31764957
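The 24-byte test works because a pcap that matched nothing still contains the 24-byte global file header. An alternative cleanup is find's -size test (24c means exactly 24 bytes); here is a self-contained demonstration using throwaway files in a temp directory (the file names are made up for the demo):

```shell
# Demonstrate pruning files that are exactly 24 bytes,
# the size of a packet-less pcap (global header only).
dir=$(mktemp -d)
printf '%024d' 0 > "$dir/empty.pcap"                # exactly 24 bytes
printf 'longer fake capture payload data' > "$dir/full.pcap"
find "$dir" -maxdepth 1 -name '*.pcap' -size 24c -delete
ls "$dir"    # only full.pcap remains
```

This also sidesteps the head -n -1 quirk above: wc only prints a "total" line when given two or more files, so with a single pcap the pipeline would drop the one real line.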