
How to check status of URLs from text file using bash shell script


I created a file "checkurls.sh" and placed it in my home directory, where the urls.txt file is also located. I made the file executable using

$chmod +x checkurls.sh

The contents of checkurls.sh are given below:

#!/bin/bash
while read -r url
do
    urlstatus=$(curl -o /dev/null --silent --head --write-out '%{http_code}' "$url")
    echo "$url  $urlstatus" >> urlstatus.txt
done < "$1"

Finally, I executed it from the command line:

$./checkurls.sh urls.txt

Voila! It works.
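One thing worth knowing about the script above: curl's `--write-out '%{http_code}'` prints `000` when the request never reached a server at all (DNS failure, refused connection, timeout), so not every line in urlstatus.txt is a real HTTP status. As a sketch, a small hypothetical helper (`classify_status` is my name, not part of the original script) can turn those raw codes into coarse labels:

```shell
#!/bin/bash
# classify_status: map the numeric code from curl's --write-out '%{http_code}'
# to a coarse label. 000 means curl never got an HTTP response at all.
classify_status() {
    case "$1" in
        000)            echo "unreachable" ;;
        2[0-9][0-9])    echo "ok" ;;
        3[0-9][0-9])    echo "redirect" ;;
        [45][0-9][0-9]) echo "error" ;;
        *)              echo "unknown" ;;
    esac
}

classify_status 200   # -> ok
classify_status 000   # -> unreachable
classify_status 404   # -> error
```

You could pipe `"$urlstatus"` through this inside the loop to get a human-readable second column.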


#!/bin/bash
while read -ru 4 LINE; do
    read -r REP < <(exec curl -IsS "$LINE" 2>&1)
    echo "$LINE: $REP"
done 4< "$1"

Usage:

bash script.sh urls-list.txt

Sample:

http://not-exist.com/abc.html
https://kernel.org/nothing.html
http://kernel.org/index.html
https://kernel.org/index.html

Output:

http://not-exist.com/abc.html: curl: (6) Couldn't resolve host 'not-exist.com'
https://kernel.org/nothing.html: HTTP/1.1 404 Not Found
http://kernel.org/index.html: HTTP/1.1 301 Moved Permanently
https://kernel.org/index.html: HTTP/1.1 200 OK

For the details, read the Bash manual; see man curl and man bash as well.
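Once you have a urlstatus.txt in the two-column "URL code" format the first script writes, you will often only care about the URLs that did not return 200. A minimal sketch with awk (the sample file here is made up to match that format):

```shell
#!/bin/bash
# Build a sample urlstatus.txt in the "URL  code" format the first
# script writes, then list only entries whose status is not 200.
cat > urlstatus.txt <<'EOF'
https://kernel.org/index.html  200
https://kernel.org/nothing.html  404
http://not-exist.com/abc.html  000
EOF

# $1 is the URL column, $2 the status column.
awk '$2 != 200 {print $1, "->", $2}' urlstatus.txt
```

Note that the second answer's script prints curl's own status line instead of a bare code, so this filter only applies to the first script's output format.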


What about adding some parallelism to the accepted solution? Let's modify the script (call it chkurl.sh) to be a little easier to read and to handle just one request at a time:

#!/bin/bash
URL=${1?Pass URL as parameter!}
curl -o /dev/null --silent --head --write-out "$URL %{http_code} %{redirect_url}\n" "$URL"

And now you check your list using:

cat URL.txt | xargs -P 4 -L1 ./chkurl.sh

This could finish the job up to 4 times faster.
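To see what `-P 4 -L1` actually do without touching the network, you can swap `./chkurl.sh` for a harmless stand-in: xargs still runs one invocation per input line, with up to 4 running concurrently (a demonstration sketch, not part of the original answer):

```shell
#!/bin/bash
# Each input line becomes one invocation (-L1); up to 4 run at once (-P 4).
# With a real URL list, replace `echo checking` with ./chkurl.sh.
printf '%s\n' \
    "https://kernel.org/index.html" \
    "https://kernel.org/nothing.html" \
    "http://not-exist.com/abc.html" \
| xargs -P 4 -L1 echo checking
```

One caveat of `-P`: the output lines can arrive in any order, so sort the result (or write per-URL files) if ordering matters.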