Comments on Plantagenet Penguinista: How to use your unlimited quota 2AM-8AM

2010-07-12 06:58:
Yo, that's what the second cron job does: kills off the downloading process.

One amendment needed for the grabyoutube.sh script: ${RANDOM} fails (resolves to an empty string); it needs to be $RANDOM instead (to resolve to a random number).
— Leon RJ Brooks

2010-07-11 22:10:
Thanks. I have 20 GB of off-peak quota, 95% unused to date. I usually use the "at" command and wget with a text file, which is prone to running past 8 AM if links are slow to download. The best arrangement for me would be: as each line of the text file is done, it is deleted, and at 8 AM something kills wget. Then I could grab the rest on the following nights.
— Anonymous

2010-07-11 20:29:
Brett, $list is a variable name, not a filename.
Within the "for" loop, it represents each text file in the /home/k~p/grabqueue directory, one file at a time.

Each file is read, and each non-empty line in the file is treated as a URL and downloaded.

YouTube URLs will only fetch the page around the video, rather than the video file itself, hence the need for the grabyoutube.sh mucking around.

This process is only useful if your ISP has a no-quota download period (which ExeTel does from 02:00 to 08:00).
— Leon RJ Brooks

2010-07-11 09:42:
Here is my attempt:
#!/bin/sh
for list in /home/k~p/grabqueue/*; do
    N=$(wc -l "$list" | sed -e 's/ .*$//')
    for url in $(seq $N); do
        L=$(tail -n +$url "$list" | head -n 1)
        if [ "$L" != "" ]; then
            wget -c "$L"
        fi
    done
    rm -f "$list"
done
rm -f /tmp/grabqueue.pid

I put a file named "list" in /home/k~p/grabqueue. Do I need

rm -f /home/k~p/grabqueue/list

as the last line so it does not download over and over each night?
— kundip@hotmail.com
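[Editor's sketch] The behaviour the Anonymous commenter asks for — delete each queue line only once its download finishes, so an interrupted run resumes on the following night — could be sketched roughly like this. The function name `drain_queue` and the file layout are my own illustrative assumptions, not part of the original scripts:

```shell
#!/bin/bash
# Sketch: drain a queue file of URLs, deleting each line only after its
# download succeeds, so an interrupted run resumes where it left off.
# drain_queue is an illustrative name, not from the original post.
drain_queue() {
    queue=$1
    while [ -s "$queue" ]; do            # while the queue file is non-empty
        url=$(head -n 1 "$queue")        # take the first line as the next URL
        if [ -z "$url" ] || wget -c "$url"; then
            sed -i '1d' "$queue"         # drop blank lines and finished URLs
        else
            break                        # failed fetch: keep the URL for next night
        fi
    done
}
```

To stop a run that drags past the off-peak window, something like `echo 'pkill -x wget' | at 08:00` could be scheduled before starting, assuming `at` and `pkill` are available on the box.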
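[Editor's note on the ${RANDOM} amendment] RANDOM is a bash/ksh feature rather than POSIX sh, so under a `#!/bin/sh` shebang it can expand to an empty string on systems where /bin/sh is a minimal shell such as dash — in which case running grabyoutube.sh explicitly under bash may be the more robust fix. A minimal illustration (the variable name is mine):

```shell
#!/bin/bash
# Under bash, $RANDOM and ${RANDOM} both expand to a number in 0..32767.
# Under dash (a common /bin/sh), RANDOM is not special and expands empty,
# which would break either spelling in a #!/bin/sh script.
delay=$((RANDOM % 30 + 5))   # a random pause, here 5-34 seconds
echo "$delay"
```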