Bash script: Downloading consecutive numbered files with wget
#!/bin/sh

if [ $# -lt 3 ]; then
        echo "Usage: $0 url_format seq_start seq_end [wget_args]"
        exit
fi

url_format=$1
seq_start=$2
seq_end=$3
shift 3

printf "$url_format\\n" `seq $seq_start $seq_end` | wget -i- "$@"
Save the above as seq_wget, give it execution permission (chmod +x seq_wget), and then run, for example:
$ ./seq_wget http://someaddress.com/logs/dbsclog01s%03d.log 1 50
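To see what the script actually feeds wget, you can run its printf/seq pipeline on its own, without downloading anything (using the same hypothetical URL as above):

```shell
# printf reuses its format string for every remaining argument, so one
# %03d format plus the numbers from seq yields one zero-padded URL per
# line -- exactly the list that wget -i- reads from stdin.
printf 'http://someaddress.com/logs/dbsclog01s%03d.log\n' $(seq 1 3)
# → http://someaddress.com/logs/dbsclog01s001.log
#   http://someaddress.com/logs/dbsclog01s002.log
#   http://someaddress.com/logs/dbsclog01s003.log
```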
Or, if you have Bash 4.0, you could just type
$ wget http://someaddress.com/logs/dbsclog01s{001..050}.log
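The brace expansion happens in the shell before wget ever runs, so wget simply receives fifty separate URL arguments. You can preview the expansion with echo (hypothetical URL again, shortened to three files):

```shell
# Zero-padded brace ranges like {001..003} need Bash 4.0+; invoking
# bash explicitly guards against an older login shell or plain sh.
bash -c 'echo http://someaddress.com/logs/dbsclog01s{001..003}.log'
```

This prints the three fully expanded URLs on one line, zero-padding intact.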
Or, if you have curl instead of wget, you could follow Dennis Williamson's answer. curl supports such ranges natively. From the man page:
URL    The URL syntax is protocol dependent. You'll find a detailed
       description in RFC 3986.

       You can specify multiple URLs or parts of URLs by writing part sets
       within braces as in:

        http://site.{one,two,three}.com

       or you can get sequences of alphanumeric series by using [] as in:

        ftp://ftp.numericals.com/file[1-100].txt
        ftp://ftp.numericals.com/file[001-100].txt   (with leading zeros)
        ftp://ftp.letters.com/file[a-z].txt

       No nesting of the sequences is supported at the moment, but you
       can use several ones next to each other:

        http://any.org/archive[1996-1999]/vol[1-4]/part{a,b,c}.html

       You can specify any amount of URLs on the command line. They will
       be fetched in a sequential manner in the specified order.

       Since curl 7.15.1 you can also specify step counter for the
       ranges, so that you can get every Nth number or letter:

        http://www.numericals.com/file[1-100:10].txt
        http://www.letters.com/file[a-z:2].txt
You may have noticed that it says "with leading zeros"!
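So a single curl call with a zero-padded bracket range replaces the whole script. A sketch, using the hypothetical URL from above, and then a self-contained offline demonstration of the same globbing against file:// URLs:

```shell
# With a real server you would simply run (hypothetical URL):
#
#   curl -O 'http://someaddress.com/logs/dbsclog01s[001-050].log'
#
# curl expands [001-050] itself, and -O saves each file under its
# remote name. The globbing is scheme-independent, so we can show it
# offline: create three zero-padded files, then fetch them with one
# bracketed file:// URL from a separate directory.
src=$(mktemp -d) && dst=$(mktemp -d)
for i in 001 002 003; do echo "log $i" > "$src/dbsclog01s$i.log"; done
cd "$dst"
curl -s -O "file://$src/dbsclog01s[001-003].log"   # -O applies to each match
ls   # the three fetched copies, leading zeros preserved
```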