
Retrieving the last modified file in a directory over FTP using a bash script with curl


You can sort the filenames in one shot with a multi-key sort command and grab the last line with tail to get the latest file.

You'll need to specify -t- to use a dash as sort's field separator, -n to get a numeric sort, and list each field in the order of its priority. The format for a field specifier is:

-k, --key=POS1[,POS2]    start a key at POS1 (origin 1), end it at POS2
                         (default end of line)

So for the year, field 3, you start and end the key at that field with -k3,3 (POS1 and POS2 are field numbers, not character widths; character positions within a field would be written with a dot, like -k3.1,3.4).

If you sort by the year, month, and day fields in that order, you'll end up with a list that has all the files in date order.
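
For example, here's the multi-key sort in action on some made-up filenames in MM-DD-YYYY-name.zip form (substitute your actual naming pattern):

printf '%s\n' 12-31-2022-report.zip 01-15-2023-report.zip 11-02-2022-report.zip \
    | sort -n -t- -k3,3 -k1,1 -k2,2
# Output, oldest to newest:
# 11-02-2022-report.zip
# 12-31-2022-report.zip
# 01-15-2023-report.zip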

So instead of the for loop above, you can use:

FILE=$(curl -u << SERVER INFO >> 2> /dev/null | grep "${FILEPATTERN}" | awk '{print $9}' | sort -n -t- -k3,3 -k1,1 -k2,2 | tail -1)
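
Spelled out a little more, here's a sketch with placeholder details (ftp.example.com, user:pass, the directory path, and FILEPATTERN are all hypothetical; the awk column also assumes the server returns a Unix-style long listing):

FILEPATTERN='report'
FILE=$(curl -u user:pass 'ftp://ftp.example.com/zips/' 2> /dev/null \
    | grep "${FILEPATTERN}" \
    | awk '{print $9}' \
    | sort -n -t- -k3,3 -k1,1 -k2,2 \
    | tail -1)
# Then fetch just that one file:
curl -u user:pass "ftp://ftp.example.com/zips/${FILE}" -o "${FILE}"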


Edit: Sorry, I just realised that the files you need are on the remote FTP server. I had thought they were local and that you were hoping to upload to FTP, so everything below is irrelevant.

Typically I do something like:

ls -1rt /path/to/zips/*.zip | tail -n1

This is not always robust (spaces in file names, etc.), but it will return the most recent file name in the directory.
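
If the goal really is to upload the newest local zip over FTP, as this answer originally assumed, a minimal sketch looks like this (server, credentials, and paths are hypothetical):

NEWEST=$(ls -1rt /path/to/zips/*.zip | tail -n1)
curl -u user:pass -T "${NEWEST}" 'ftp://ftp.example.com/incoming/'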

There's also find. You can specify a date range and a name pattern. Depending on what you are doing, you might opt to scan a directory every x minutes for files created in the last x minutes. This has the advantage that it will pick up multiple new files.
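
As a sketch, a cron job running every five minutes could pick up everything new since the last run (the path and interval are placeholders, and note that -mmin tests modification time, not creation time):

find /path/to/zips -name '*.zip' -mmin -5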