How to get the list of files in a directory in a shell script?
where $search_dir is a relative path. However, $search_dir contains many files with whitespace in their names. In that case, this script does not run as expected. I know I could use for entry in * , but that would only work for my current directory. I know I can change to that directory, use for entry in * , then change back, but my particular situation prevents me from doing that. I have two relative paths, $search_dir and $work_dir , and I have to work on both simultaneously: reading them, creating/deleting files in them, etc. So what do I do now? PS: I use bash.
13 Answers
search_dir=/the/path/to/base/dir
for entry in "$search_dir"/*
do
  echo "$entry"
done
Can you explain why for entry in "$search_dir/*" doesn't work? Why do we need to place /* outside of the quotes?
This doesn’t work (in bash with the default settings) if the folder is empty or if some files start with a period.
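To address the empty-directory case, one option (a sketch, assuming bash) is to enable the nullglob shell option, so an unmatched glob expands to nothing instead of to itself:

```shell
#!/usr/bin/env bash
# Sketch: make the glob loop safe for an empty directory.
search_dir="$(mktemp -d)"   # an empty directory, just for demonstration

shopt -s nullglob           # unmatched globs expand to nothing, not to themselves
count=0
for entry in "$search_dir"/*; do
    echo "$entry"
    count=$((count + 1))
done
echo "files found: $count"  # 0, rather than one literal "$search_dir/*" entry
```

Without nullglob, the loop body would run once with the literal, unexpanded pattern as $entry.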
This is a way to do it where the syntax is simpler for me to understand:
yourfilenames=`ls ./*.txt`

for eachfile in $yourfilenames
do
   echo $eachfile
done
./ is the current working directory but could be replaced with any path
*.txt returns anything.txt
You can check what will be listed easily by typing the ls command straight into the terminal.
Basically, you create a variable yourfilenames containing everything the list command returns as a separate element, and then you loop through it. The loop creates a temporary variable eachfile that contains a single element of the variable it’s looping through, in this case a filename. This isn’t necessarily better than the other answers, but I find it intuitive because I’m already familiar with the ls command and the for loop syntax.
This works OK for a quick, informal script or one-liner, but it will break if a filename contains newlines, unlike the glob-based solutions.
@SorenBjornstad thanks for the advice! I didn’t know newlines were permitted in filenames. What kind of files might have them? Is this something that occurs commonly?
Newlines in filenames are evil for this reason and as far as I know there’s no legitimate reason to use them. I’ve never seen one in the wild myself. That said, it’s totally possible to maliciously construct filenames with newlines in such a way as to exploit this. (For instance, imagine a directory containing files A , B , and C . You create files called B\nC and D , then choose to delete them. Software that doesn’t handle this right could end up deleting preexisting files B and C instead even if you didn’t have permission to do that.)
mywiki.wooledge.org/ParsingLs explains a large number of pitfalls with this approach. You should basically never use ls in scripts. It’s silly anyway; the shell has already expanded the wildcard by the time ls runs.
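The difference is easy to demonstrate (a contrived sketch; the filename with an embedded newline is created on purpose):

```shell
#!/usr/bin/env bash
# Demonstration: a glob keeps a newline-containing filename intact,
# while word-splitting the output of `ls` breaks it apart.
dir="$(mktemp -d)"
touch "$dir/normal.txt" "$dir/bad
name.txt"                        # one filename containing a literal newline

glob_count=0
for f in "$dir"/*; do            # glob: each filename stays one intact word
    glob_count=$((glob_count + 1))
done

ls_count=0
for f in $(ls "$dir"); do        # parsing ls: the embedded newline splits one name in two
    ls_count=$((ls_count + 1))
done

echo "glob saw $glob_count files, ls parsing saw $ls_count"
```

The glob loop counts 2 files; the ls loop counts 3 "files", because the name containing the newline was split into two words.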
The other answers on here are great and answer your question, but this is the top google result for "bash get list of files in directory", (which I was looking for to save a list of files) so I thought I would post an answer to that problem:
ls $search_path > filename.txt
If you want only a certain type (e.g. any .txt files):
ls $search_path | grep *.txt > filename.txt
Note that $search_path is optional; ls > filename.txt will do the current directory.
No need to use grep to get only .txt files: `ls $search_path/*.txt > filename.txt`. But more importantly, one should not use the output of the ls command to parse file names.
@VictorZamanian, can you elaborate why we should not use the output of ls to parse filenames? Haven’t heard of this before.
@samurai_jane There’s a lot of links to provide regarding this topic, but here’s one first search result: mywiki.wooledge.org/ParsingLs. I even saw a question here on SO claiming the reasons for not parsing the output of ls were BS and was very elaborative about it. But the replies/answers still claimed it was a bad idea. Have a look: unix.stackexchange.com/questions/128985/…
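If the goal is simply to save a list of files to a text file, a glob plus printf avoids ls entirely (a sketch; the directory contents here are illustrative, and filenames containing newlines would still corrupt any line-oriented list):

```shell
#!/usr/bin/env bash
# Sketch: write one filename per line without parsing `ls` output.
search_path="$(mktemp -d)"
out_dir="$(mktemp -d)"
touch "$search_path/a.txt" "$search_path/b.txt" "$search_path/notes.md"

# All entries: the shell expands the glob; printf prints one name per line.
printf '%s\n' "$search_path"/* > "$out_dir/all.txt"

# Only .txt files: no grep needed, the glob itself filters.
printf '%s\n' "$search_path"/*.txt > "$out_dir/txt_only.txt"

wc -l < "$out_dir/txt_only.txt"   # one line per .txt file
```

Because printf is a shell builtin and the names come straight from glob expansion, no external command ever re-parses the filenames.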
for entry in "$search_dir"/* "$work_dir"/*
do
  if [ -f "$entry" ];then
    echo "$entry"
  fi
done
$ pwd; ls -l
/home/victoria/test
total 12
-rw-r--r-- 1 victoria victoria    0 Apr 23 11:31  a
-rw-r--r-- 1 victoria victoria    0 Apr 23 11:31  b
-rw-r--r-- 1 victoria victoria    0 Apr 23 11:31  c
-rw-r--r-- 1 victoria victoria    0 Apr 23 11:32 'c d'
-rw-r--r-- 1 victoria victoria    0 Apr 23 11:31  d
drwxr-xr-x 2 victoria victoria 4096 Apr 23 11:32  dir_a
drwxr-xr-x 2 victoria victoria 4096 Apr 23 11:32  dir_b
-rw-r--r-- 1 victoria victoria    0 Apr 23 11:32 'e; f'

$ find . -type f
./c
./b
./a
./d
./c d
./e; f

$ find . -type f | sed 's/^\.\///g' | sort
a
b
c
c d
d
e; f

$ find . -type f | sed 's/^\.\///g' | sort > tmp

$ cat tmp
a
b
c
c d
d
e; f
$ pwd
/home/victoria

$ find $(pwd) -maxdepth 1 -type f -not -path '*/\.*' | sort
/home/victoria/new
/home/victoria/new1
/home/victoria/new2
/home/victoria/new3
/home/victoria/new3.md
/home/victoria/new.md
/home/victoria/package.json
/home/victoria/Untitled Document 1
/home/victoria/Untitled Document 2

$ find . -maxdepth 1 -type f -not -path '*/\.*' | sed 's/^\.\///g' | sort
new
new1
new2
new3
new3.md
new.md
package.json
Untitled Document 1
Untitled Document 2
- . : current folder
- remove -maxdepth 1 to search recursively
- -type f : find files, not directories ( d )
- -not -path '*/\.*' : do not return .hidden_files
- sed 's/^\.\///g' : remove the prepended ./ from the result list
find "$search_dir" "$work_dir" -mindepth 1 -maxdepth 1 -type f -print0 | xargs -0 -I '<>' echo "<>"
I know this is pretty old but I can’t seem to get the last xargs -0 -i echo "<>" command, care to explain me a bit? In particular, what does the -i echo "<>" part do? Also I read from the man page that -i is deprecated now and we should use -I instead.
Thanks! This is useful, also for the slow minded like me I think that the <> is the string that is replaced with the matches by the find command.
why do you use xargs ? by default, find prints what it finds. you could delete everything from -print0 .
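To illustrate what the -I replacement string does (a sketch; the '<>' token in the answer above is arbitrary, and '{}' used here is just the conventional choice):

```shell
#!/usr/bin/env bash
# Sketch: -I defines a placeholder token that xargs substitutes with each input item.
demo_dir="$(mktemp -d)"
touch "$demo_dir/file with spaces.txt"

# -0 pairs with find's -print0, so spaces (and even newlines) in names survive intact;
# -I '{}' makes xargs replace every occurrence of {} with one complete filename.
result="$(find "$demo_dir" -type f -print0 | xargs -0 -I '{}' echo "found: {}")"
echo "$result"
```

With -I, xargs runs the command once per input item rather than batching arguments, which is why the answer above notes that -print0 alone would already print the results; the xargs stage only matters when you want to embed each name inside a larger command line.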
Similar to the accepted answer, but this lists only file names instead of full paths:
This seems to have been answered a while ago, but I guess I want to also contribute an answer that just lists the files in the desired directory, as opposed to the full paths.
#search_dir=/the/path/to/base/dir/
IFS=$'\n'   # "for ... in $(...)" splits based on IFS
search_dir="$(pwd)"
for entry in $(ls $search_dir)
do
  echo $entry
done
If you also wanted to filter for a specific file you would add a grep -q statement.
#search_dir=/the/path/to/base/dir/
IFS=$'\n'   # "for ... in $(...)" splits based on IFS
search_dir="$(pwd)"
for entry in $(ls $search_dir)
do
  if grep -q "File should contain this entire string" "$entry"; then
    echo $entry
  fi
done
More information about IFS can be found here.
More information about finding substrings in shell can be found here.
The accepted answer will not return files prefixed with a '.'. To do that, use
for entry in "$search_dir"/* "$search_dir"/.[!.]* "$search_dir"/..?*
do
  echo "$entry"
done
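An alternative (a sketch, assuming bash) is the dotglob shell option, which makes a plain * also match hidden names, so the extra dot-patterns aren't needed:

```shell
#!/usr/bin/env bash
# Sketch: use dotglob instead of extra .[!.]* patterns to match hidden files.
search_dir="$(mktemp -d)"
touch "$search_dir/visible" "$search_dir/.hidden"

shopt -s dotglob nullglob   # '*' now matches dotfiles too (but never . or ..)
count=0
for entry in "$search_dir"/*; do
    echo "$entry"
    count=$((count + 1))
done
echo "matched $count entries"
```

dotglob never matches the special entries . and .., so no guard against them is needed; nullglob keeps the loop quiet if the directory is empty.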
How to get the list of files in a directory in a shell script?
In addition to the most-upvoted answer by @Ignacio Vazquez-Abrams, consider the following solutions which also all work, depending on what you are trying to do. Note that you can replace "path/to/some/dir" with . in order to search in the current directory.
1. List different types of files using find and ls
Tip: for any of the find examples below, you can pipe the output to sort -V if you'd like it sorted.
find . -maxdepth 1 -type f | sort -V
List only regular files ( -type f ) 1 level deep:
# General form
find "path/to/some/dir" -maxdepth 1 -type f

# In current directory
find . -maxdepth 1 -type f
List only symbolic links ( -type l ) 1 level deep:
# General form
find "path/to/some/dir" -maxdepth 1 -type l

# In current directory
find . -maxdepth 1 -type l
List only directories ( -type d ) 1 level deep:
Note that for the find example here, we also add -mindepth 1 in order to exclude the current directory, . , which would be printed as . at the top of the directory list otherwise. See here: How to exclude this / current / dot folder from find "type d"
# General form
find "path/to/some/dir" -mindepth 1 -maxdepth 1 -type d

# In current directory
find . -mindepth 1 -maxdepth 1 -type d

# OR, using `ls`:
ls -d */
Combine some of the above: list only regular files and symbolic links ( -type f,l ) 1 level deep:
Use a comma ( , ) to separate arguments to -type :
# General form
find "path/to/some/dir" -maxdepth 1 -type f,l

# In current directory
find . -maxdepth 1 -type f,l
2. Capture the output of any command into a bash indexed array, with elements separated by the newline char ( \n )
However, $search_dir contains many files with whitespaces in their names. In that case, this script does not run as expected.
This is solved by telling bash to separate elements in the string based on the newline char \n instead of the space char--which is the default IFS (Internal Field Separator--see The Meaning of IFS in Bash Scripting) variable used by bash. To do this, I recommend using the mapfile command.
The bash script static code analyzer tool named shellcheck recommends using mapfile or read -r whenever you want to read a string into a bash array, separating elements based on the newline char ( \n ). See: https://github.com/koalaman/shellcheck/wiki/SC2206.
Update: to see examples of how to do this with both mapfile and read -r see my answer here: How to read a multi-line string into a regular bash "indexed" array. I now prefer to use read -r instead of mapfile , because mapfile will KEEP any empty lines as elements in the array, if any exist, which I do NOT want, whereas read -r [again, my preference now] will NOT keep empty lines as elements in the array.
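A sketch of that read -r pattern (the input string here is illustrative, and the explicit empty-line guard is my assumption about the preferred behavior described above):

```shell
#!/usr/bin/env bash
# Sketch: read a command's output line by line into a bash indexed array,
# explicitly skipping empty lines (which mapfile would keep as elements).
lines_array=()
while IFS= read -r line; do
    [ -n "$line" ] || continue   # drop empty lines
    lines_array+=("$line")
done < <(printf 'first\n\nsecond\nthird\n')

echo "${#lines_array[@]} elements"   # 3; the empty line was dropped
```

IFS= prevents read from trimming leading/trailing whitespace, and -r prevents it from treating backslashes as escapes, so each line lands in the array byte-for-byte.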
(Back to my original answer:)
Here is how to convert a newline-separated string into a regular bash "indexed" array with the mapfile command.
# Capture the output of `ls -1` into a regular bash "indexed" array.
# - includes both files AND directories!
mapfile -t allfilenames_array < <(ls -1)
- We use ls -1 (that's a "dash numeral_one") in order to put each filename on its own line, thereby separating them all by the newline \n char.
- If you'd like to Google it,
- See mapfile --help , or help mapfile , for help.
Full code example:
echo "Output of 'ls -1'"
echo "-----------------"
ls -1
echo ""

# Capture the output of `ls -1` into a regular bash "indexed" array.
# - includes both files AND directories!
mapfile -t allfilenames_array < <(ls -1)
allfilenames_array_len="${#allfilenames_array[@]}"

# Capture just the directory names (1 level deep) into another array.
mapfile -t dirnames_array < <(find . -mindepth 1 -maxdepth 1 -type d)
dirnames_array_len="${#dirnames_array[@]}"

# 1. Now manually print all elements in each array
echo "All filenames (files AND dirs) (count = $allfilenames_array_len):"
for filename in "${allfilenames_array[@]}"; do
    echo "    $filename"
done
echo "Dirnames ONLY (count = $dirnames_array_len):"
for dirname in "${dirnames_array[@]}"; do
    # remove the `./` from the beginning of each dirname
    dirname="$(basename "$dirname")"
    echo "    $dirname"
done
echo ""

# OR, 2. manually print the index number followed by all elements in the array
echo "All filenames (files AND dirs) (count = $allfilenames_array_len):"
for i in "${!allfilenames_array[@]}"; do
    printf "  %3i: %s\n" "$i" "${allfilenames_array["$i"]}"
done
echo "Dirnames ONLY (count = $dirnames_array_len):"
for i in "${!dirnames_array[@]}"; do
    # remove the `./` from the beginning of each dirname
    dirname="$(basename "${dirnames_array["$i"]}")"
    printf "  %3i: %s\n" "$i" "$dirname"
done
echo ""
Here is the example output of the code block just above being run inside the eRCaGuy_hello_world/python dir of my eRCaGuy_hello_world repo:
eRCaGuy_hello_world/python$ ../bash/array_list_all_files_and_directories.sh
Output of 'ls -1'
-----------------
autogenerate_c_or_cpp_code.py
autogenerated
auto_white_balance_img.py
enum_practice.py
raw_bytes_practice.py
slots_practice
socket_talk_to_ethernet_device.py
textwrap_practice_1.py
yaml_import

All filenames (files AND dirs) (count = 9):
    autogenerate_c_or_cpp_code.py
    autogenerated
    auto_white_balance_img.py
    enum_practice.py
    raw_bytes_practice.py
    slots_practice
    socket_talk_to_ethernet_device.py
    textwrap_practice_1.py
    yaml_import
Dirnames ONLY (count = 3):
    autogenerated
    slots_practice
    yaml_import

All filenames (files AND dirs) (count = 9):
    0: autogenerate_c_or_cpp_code.py
    1: autogenerated
    2: auto_white_balance_img.py
    3: enum_practice.py
    4: raw_bytes_practice.py
    5: slots_practice
    6: socket_talk_to_ethernet_device.py
    7: textwrap_practice_1.py
    8: yaml_import
Dirnames ONLY (count = 3):
    0: autogenerated
    1: slots_practice
    2: yaml_import