
How do I capture the output from the ls or find command to store all file names in an array?


To answer your exact question, use the following:

arr=( $(find /path/to/toplevel/dir -type f) )

Example

$ find . -type f
./test1.txt
./test2.txt
./test3.txt
$ arr=( $(find . -type f) )
$ echo ${#arr[@]}
3
$ echo ${arr[@]}
./test1.txt ./test2.txt ./test3.txt
$ echo ${arr[0]}
./test1.txt
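One caveat worth knowing: because the command substitution is unquoted, the shell word-splits find's output on whitespace, so a file name containing a space ends up as two array elements. A minimal sketch of the problem (the file names here are invented for illustration):

```shell
#!/usr/bin/env bash
# Demonstrate that arr=( $(find ...) ) word-splits on whitespace:
# a single file named "with space.txt" becomes two array elements.
tmpdir=$(mktemp -d)
touch "$tmpdir/with space.txt"

arr=( $(find "$tmpdir" -type f) )
echo "${#arr[@]}"   # 2 elements for one file -- the split is the bug

rm -rf "$tmpdir"
```

If your file names are guaranteed to contain no whitespace or glob characters, the simple form is fine; otherwise use one of the NUL-delimited approaches below.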

However, if you just want to process files one at a time, you can either use find's -exec option if the script is somewhat simple, or you can do a loop over what find returns like so:

while IFS= read -r -d $'\0' file; do
  # stuff with "$file" here
done < <(find /path/to/toplevel/dir -type f -print0)
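If you do need the results in an array rather than a loop, a safe variant (assuming bash 4.4 or newer, where mapfile accepts -d) is to read find's NUL-delimited output directly into the array; the directory and file names below are made up for the demo:

```shell
#!/usr/bin/env bash
# mapfile -d '' reads NUL-delimited entries into an array, so names
# with spaces or newlines stay intact. Requires bash >= 4.4.
tmpdir=$(mktemp -d)
touch "$tmpdir/plain.txt" "$tmpdir/with space.txt"

mapfile -d '' arr < <(find "$tmpdir" -type f -print0)

echo "${#arr[@]}"   # both files counted as single elements

rm -rf "$tmpdir"
```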


for i in `ls`; do echo $i; done;

can't get simpler than that!

edit: hmm - as per Dennis Williamson's comment, it seems you can!

edit 2: although the OP specifically asks how to parse the output of ls, I just wanted to point out that, as the commenters below have said, the correct answer is "you don't". Use for i in * or similar instead.


You actually don't need to use ls or find for files in the current directory.

Just use a for loop:

for files in *; do
    if [ -f "$files" ]; then
        # do something
    fi
done

And if you want to process hidden files too, you can set the dotglob shell option:

shopt -s dotglob

This last command works in bash only.
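A quick sketch of the effect (file names invented for the demo): without dotglob, * skips dotfiles; after shopt -s dotglob, the same glob matches them too (but still not . and ..):

```shell
#!/usr/bin/env bash
# Compare how many entries * matches before and after enabling dotglob.
tmpdir=$(mktemp -d)
touch "$tmpdir/visible" "$tmpdir/.hidden"
cd "$tmpdir"

count=0
for f in *; do count=$((count + 1)); done        # .hidden is skipped

shopt -s dotglob
count_all=0
for f in *; do count_all=$((count_all + 1)); done  # .hidden now matches

echo "$count $count_all"
```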