Using wget to download all the hulkshare/mediafire linked files on a page

So I've been trying to set up wget to download all the mp3s from www.goodmusicallday.com. Unfortunately, rather than the mp3s being hosted by the site, the site puts them up on www.hulkshare.com and then links to the download pages. Is there a way to use the recursive and filtering abilities of wget to make it go to each hulkshare page and download the linked mp3?

Any help is much appreciated
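
For reference, the kind of recursive, host-spanning invocation I have in mind looks roughly like this (the depth, domain list, and flags are just guesses on my part):

    # Guessed flags: recurse two levels, span hosts, restrict the crawl to the
    # two domains, and keep only mp3 files in a single flat directory.
    wget -r -l 2 -H -D goodmusicallday.com,hulkshare.com -A "*.mp3" \
         -nd -P mp3s/ http://www.goodmusicallday.com/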


So, a friend of mine actually figured out an awesome way to do this. Just enter the code below in Terminal:

    # Helper: pull the value of a given URL parameter (soundFile=..., titles=...)
    # out of a line of HTML.
    r() { echo "$1" | sed "s/.*$2=\([^'\"&;]*\).*/\1/"; }

    # Grab the page, keep the lines that mention soundFile, and download
    # each mp3, naming it after its titles parameter.
    wget goodmusicallday.com -O- | grep soundFile | while read -r l; do
        wget -c "$(r "$l" soundFile)" -O "$(r "$l" titles)"
    done


I guess not!

I have tried on several occasions to do scripted downloads from MediaFire, but in vain.

That's exactly why they don't offer a simple download link and attach a timer to the page instead.

If you look carefully, you will see that the actual file hosting server is not www.mediafire.com, but something like download666.com.

So I don't think it is possible with wget.

wget can only save the day if the download links are plain HTML links, i.e. a tags.

Regards,
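
To illustrate that last point: if a page did expose its files as ordinary anchor tags, a plain recursive fetch would be enough (hypothetical URL):

    # Works only when the page links the mp3s directly, e.g. <a href="song.mp3">.
    wget -r -l 1 -A "*.mp3" -nd http://example.com/music/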
