Is there a way to perform a "tail -f" from an url?

linux

Is there a way to perform a "tail -f" from an url?


You can do auto-refresh with the help of watch combined with wget. It won't keep history like tail -f; instead it redraws the screen, like top. For example, this command shows the content of file.txt on the screen and updates the output every five seconds:

watch -n 5 wget -qO-  http://fake.link/file.txt

You can also show only the last n lines instead of the whole file:

watch -n 5 "wget -qO-  http://fake.link/file.txt | tail"

If you still need behaviour like "tail -f" (with history kept), I think you need to write a script that downloads the log file periodically, compares it to the previously downloaded version, and prints only the new lines. That should be quite easy.
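Such a script might look like the following minimal sketch, assuming the remote log is append-only; the URL and the 5-second interval are placeholders:

```shell
#!/usr/bin/env bash
# Minimal sketch of the download-compare-print idea.
# Assumes the remote file is append-only; URL and interval are placeholders.

# Print the lines of file $2 that were added after the snapshot in file $1.
new_lines() {
    tail -n +"$(( $(wc -l < "$1") + 1 ))" "$2"
}

# Poll the URL and print only the freshly appended lines, like tail -f.
follow_url() {
    local url=$1 interval=${2:-5} prev cur
    prev=$(mktemp)
    while true; do
        cur=$(mktemp)
        wget -qO "$cur" "$url"   # fetch the current version of the file
        new_lines "$prev" "$cur" # emit only what was appended since last poll
        mv "$cur" "$prev"
        sleep "$interval"
    done
}

# follow_url http://fake.link/file.txt 5
```

follow_url never returns; interrupt it with Ctrl-C. Note this compares by line count only, which is fine for an append-only log but not for a file that rotates.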


I wrote a simple bash script that fetches the URL content every 2 seconds, compares it with the local file output.txt, and appends the difference to that same file.

I wanted to stream AWS Amplify logs in my Jenkins pipeline.

while true; do comm -13 --nocheck-order <(cat output.txt) <(curl -s "$URL") >> output.txt; sleep 2; done

Don't forget to create the empty output.txt file first:

: > output.txt

View the stream:

tail -f output.txt

Original answer: https://stackoverflow.com/a/62347827/2073339

UPDATE:

I found a better solution using wget: with -c, wget resumes the download, so each iteration fetches only the newly appended bytes and adds them to output.txt.

while true; do wget -c -o /dev/null -O output.txt "$URL"; sleep 2; done

https://superuser.com/a/514078/603774


The proposed solutions periodically download the full file.

To avoid that, I've created a package and published it on npm. It does a HEAD request (to get the size of the file) and then requests only the last bytes.

Check it out and let me know if you need any help.

https://www.npmjs.com/package/@imdt-os/url-tail
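The same HEAD-plus-Range idea can be sketched in plain shell with curl, assuming the server reports Content-Length and honours Range requests; the URL is a placeholder:

```shell
#!/usr/bin/env bash
# Sketch of tailing a URL via a HEAD request plus Range requests.
# Assumes the server sends Content-Length and supports byte ranges.

# Extract the Content-Length value from raw HTTP response headers on stdin.
content_length() {
    tr -d '\r' | awk 'tolower($1) == "content-length:" { print $2; exit }'
}

# Poll the URL; whenever it grows, fetch and print only the new bytes.
tail_url() {
    local url=$1 interval=${2:-2} offset=0 size
    while true; do
        size=$(curl -sI "$url" | content_length)   # HEAD request for the size
        if [ -n "$size" ] && [ "$size" -gt "$offset" ]; then
            curl -s -r "$offset-$((size - 1))" "$url"  # Range: only the new bytes
            offset=$size
        fi
        sleep "$interval"
    done
}

# tail_url http://fake.link/file.txt 2
```

Note that the first poll downloads the whole file; initialise offset to the current size instead if you only want data appended after you start watching.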