
bash pipe limit


I've got a txt list of URLs I want to download:

n=1
end=`cat done1 | wc -l`
while [ $n -lt $end ]
do
    nextUrls=`sed -n "${n}p" < done1`
    wget -N -nH --random-wait -t 3 -a download.log -A"$1" "$nextUrls"
    let "n++"
done

I want to do it faster with pipes, but if I do this

    wget -N -nH --random-wait -t 3 -a download.log -A"$1" "$nextUrls" &

my RAM fills up and my PC locks up completely. Does anyone know how to limit the pipes created to, say, 10 at the same time?


You are not creating pipes (|), you are creating background processes (&). Every time your while loop executes its body, you create a new wget process without waiting for it to exit, which (depending on the value of end) may create a lot of wget processes very fast. Either do it sequentially (remove the &), or execute n processes in parallel and wait for them.
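
If you want to go the parallel route, one common pattern is to start the downloads in batches and wait for each batch to finish before starting the next. A minimal sketch, assuming a batch size of 10 and reusing the wget flags from the question:

batch=10
count=0
while read -r url; do
    wget -N -nH --random-wait -t 3 -a download.log -A"$1" "$url" &
    count=$((count + 1))
    # every 10 background jobs, wait for the whole batch to finish
    if [ $((count % batch)) -eq 0 ]; then
        wait
    fi
done < done1
wait    # wait for the last, possibly partial, batch

This keeps at most 10 wget processes alive at any time, so memory stays bounded instead of growing with the number of URLs.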

BTW, useless use of cat: you can simply do:

end=`wc -l < done1`

(reading from a redirect, so wc prints only the count and not the filename)


I've got a txt list of URLs I want to download... I want to do it faster...

So here's the shortest way to do that. The following command downloads the URLs from the list contained in the file *txt_list_of_urls*, running 10 wget processes in parallel:

xargs -a txt_list_of_urls -P 10 -r -n 1 wget -nv
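
If you still need the options from the original loop (the accept suffix in $1, the log file, the retries), they can be appended to the wget part of the same command; a sketch, assuming the suffix is still passed as the script's first argument:

xargs -a txt_list_of_urls -P 10 -r -n 1 wget -nv -N -nH --random-wait -t 3 -a download.log -A"$1"

xargs appends each URL from the file as the final argument of the wget command, and -P 10 ensures at most 10 wget processes run at any time.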
