I have a website I need to save a large number of pages from. The pages are numbered incrementally: index.php?id=1, index.php?id=2, and so on. Is there a shell script (I'm on a Mac) I could run to loop through all of these pages and save each one into a directory?
In bash:
for i in {1..100}; do wget "http://www.example.com/index.php?id=${i}"; done

(Quote the URL so the shell doesn't try to glob-expand the `?`, and so the command keeps working if the URL later gains an `&`.)
#!/bin/bash
# Fetch index.php?id=1 .. index.php?id=100 and save each page
# to its own file under $dir.
url='http://example.com/index.php?id='
dir='path/to/dir'
filename=file
ext=html
for i in {1..100}
do
    wget "$url$i" -O "$dir/$filename$i.$ext"
done
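One caveat on a Mac: macOS ships with curl but not wget (wget must be installed separately, e.g. via Homebrew). A minimal curl-based sketch of the same loop, wrapped in a hypothetical `save_pages` helper so the URL prefix, output directory, and page count are easy to change (the example URL is an assumption):

```shell
#!/bin/bash
# Hypothetical helper: fetch "$url1" .. "$urlN" and save each page to dir.
# curl is preinstalled on macOS; -f fails on HTTP errors, -s silences progress.
save_pages() {
    local url=$1 dir=$2 count=$3
    mkdir -p "$dir"
    for ((i = 1; i <= count; i++)); do
        curl -fs "$url$i" -o "$dir/page$i.html"
    done
}

# Example invocation (assumed site):
# save_pages 'http://www.example.com/index.php?id=' ./pages 100
```

The `-o` flag names each output file explicitly, which avoids curl writing everything to stdout and gives you clean filenames instead of `index.php?id=1`.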