WGET: Is it possible to automatically RESUME the download of an HTTP URL?

I use cron jobs to execute shell scripts which are very simple invocations of wget. Here's one such example:

#!/bin/sh
fileName="WRTI-"$(date +"%m-%d-%Y-%H%M")
directory=/volume1/multimedia/Internet\ Radio\ Recordings/WRTI/$1/
wget -O "$directory$fileName.mp3" -q&
nWgetPID=$!
sleep 3600
kill $nWgetPID
echo "Done."

This is an internet radio stream. My problem is that while this works fine, the stream very frequently gets "interrupted" (as best as I can tell) and I end up with a truncated copy of the stream. If there were a switch I could pass to wget to say "resume automatically after the interruption", I would be much, much, much (MUCH!) happier.

Any ideas?

2 Answers

As C0deDaedalus wrote, wget -c resumes a partially downloaded file by sending the "Range" header. Since you're dealing with a live internet stream, this flag won't work as intended. At best, it may let you continue appending the stream to the same downloaded file, ignoring the fact that you lost a chunk when the stream got interrupted, and with luck the missing chunk won't cause trouble for your MP3 player.
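That risk is easy to illustrate without any network at all. A minimal sketch (the file names and contents are made up for the demo; `head`/`tail` stand in for the download and the resumed append):

```shell
#!/bin/sh
# Simulate a resumed append that silently drops a chunk of a live stream.
printf 'AAABBBCCC' > stream_full.txt     # what the station actually broadcast
head -c 3 stream_full.txt > local.txt    # we captured 'AAA', then the stream dropped
tail -c 3 stream_full.txt >> local.txt   # resumed later and appended 'CCC'
cat local.txt                            # AAACCC: 'BBB' is gone, with no error anywhere
```

Nothing fails loudly; the gap only shows up when the recording is played back.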

Regardless, you seem to have a separate problem as well: your script needs to actually detect when the wget process ends too soon, so that it can even try to resume the download. One way to detect whether a process has ended is to check the exit status of kill -0. That also means you can't just use sleep 3600, so you'll need a different way to stop after 1 hour. You'd end up with something like this:

#!/usr/bin/bash
directory=/volume1/multimedia/Internet\ Radio\ Recordings/WRTI/$1/
endSeconds=$((SECONDS + 3600))
while [ $SECONDS -lt $endSeconds ]; do
    fileName="WRTI-"$(date +"%m-%d-%Y-%H%M%S")
    wget -O "$directory$fileName.mp3" -q &
    nWgetPID=$!
    while kill -0 "$nWgetPID" >/dev/null 2>&1; do
        if [ $SECONDS -gt $endSeconds ]
        then
            kill "$nWgetPID"
        fi
        sleep 1
    done
done
echo "Done."

As written, this starts a new file after every interruption. You could also probably incorporate wget -c to put everything into the same file, but then you won't be able to easily diagnose how many interruptions there were and when they happened.
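The kill -0 trick used above deserves a note: with signal 0, kill sends nothing at all and merely reports, via its exit status, whether the process still exists. A standalone sketch, using sleep as a stand-in for wget:

```shell
#!/bin/sh
# kill -0 sends no signal; its exit status says whether the PID is still alive.
sleep 1 &
pid=$!
kill -0 "$pid" 2>/dev/null && echo "still running"
wait "$pid"                              # let the background process finish
kill -0 "$pid" 2>/dev/null || echo "gone"
```

This prints "still running" while the child exists and "gone" once it has exited, which is exactly the condition the inner loop above polls once per second.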


Well, to continue interrupted downloads with wget you can use the -c option, which stands for "continue".

wget -c -O "$directory$fileName.mp3" -q

Some points here :

  • It assumes that you have a partially downloaded file on your local system.
  • If the file from the earlier invocation is empty, wget will refuse to continue. In that case you need to remove the empty file first.
  • While using -c, any file that's bigger on the server than locally will be considered an incomplete download, and only

    (length(remote) - length(local)) bytes

    will be downloaded and tacked onto the end of the local file.

  • Important

    -c only works (see the last paragraph under the -c option in the manual) with FTP servers and with HTTP servers that support the Range header, and in your case it's HTTP.
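The (length(remote) - length(local)) arithmetic from the third bullet can be simulated locally, with no server involved. A hypothetical sketch with made-up file names, where `tail -c` plays the role of the ranged request:

```shell
#!/bin/sh
# Simulate what -c computes: fetch only the missing tail of the file.
printf 'ABCDEFGHIJ' > remote.bin         # 10 bytes as stored on the "server"
head -c 4 remote.bin > local.bin         # 4 bytes made it down before the interruption
remaining=$(( $(wc -c < remote.bin) - $(wc -c < local.bin) ))
echo "bytes still needed: $remaining"    # prints 6
tail -c "$remaining" remote.bin >> local.bin
cmp -s remote.bin local.bin && echo "local copy is now complete"
```

This is why -c is safe for a static file but not for a live stream: the math only works when the first length(local) bytes on the server are the same bytes you already have.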

So, good luck with that!

Feel free to add in more details.

