How can I get a recursive full-path listing, one line per file?
How can I produce a flat, recursive list of paths, one per line? For example, I just want a flat listing of files with their full paths:
/home/dreftymac/.
/home/dreftymac/foo.txt
/home/dreftymac/bar.txt
/home/dreftymac/stackoverflow
/home/dreftymac/stackoverflow/alpha.txt
/home/dreftymac/stackoverflow/bravo.txt
/home/dreftymac/stackoverflow/charlie.txt
tree -aflix --noreport
26 Answers
If you want files only (omit directories, devices, etc):
find . -type f
find /home/dreftymac -type f
You can even use -printf to display extra contextual info (e.g. find . -type f -printf '%p %u\n').
@Shayan find with the -printf predicate allows you to do everything ls does, and then some. However, it is not standard. You can use find -exec stat {} \; but unfortunately the options to stat are not standardized, either.
If you really want to use ls , then format its output using awk:
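A minimal sketch of such an awk filter, based on the idea described in the comment below (directory headers in ls -R output end in ":", and non-empty lines under a header are entries in that directory):

ls -R /path/to/dir | awk '
  /:$/ { sub(/:$/, ""); dir = $0; next }   # header line: remember the directory, drop the ":"
  NF   { print dir "/" $0 }                # non-blank line: print it with the directory prefix
'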
The other answer should be accepted. This is a good second answer and does indeed answer the question as written, but the best solution is the one using find. The intention of the question is clear (it doesn't have to be done using ls), and this answer is better only for those who can't use find. The claim that this answer is more "right" because it answers the written question instead of the intended one is pretty ridiculous.
Do you mind explaining the awk code? It looks like you are using a regex to catch lines that end in ":" (the "headers" with parent directory paths), but I get lost after that and definitely don't understand the part where the last field NF is being evaluated as true/false. Thanks!
if you want to sort your output by modification time:
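Judging from the replies below, the base command here was ls -ld over find output, so sorting by modification time just adds -t (a sketch; note it breaks on filenames containing spaces):

ls -ldt $(find .)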
Thanks. I wouldn't have thought of that (nice and short) syntax by myself; I would have used find . -name "*" -exec ls -ld '{}' \; (that one works whatever the number of files is), but your command is way shorter to write 😉
ls -ld $(find .) breaks for me if I'm listing an NTFS disk where files have spaces ("ls: cannot access ./System: No such file or directory"); however, find with quotes as suggested by @SRG works.
-f print the full path prefix for each file
-i don’t print indentations
$ tree -fi
.
./README.md
./node_modules
./package.json
./src
./src/datasources
./src/datasources/bookmarks.js
./src/example.json
./src/index.js
./src/resolvers.js
./src/schema.js
In order to keep the files but not the links, you have to filter the > (symlink arrows) out of the output:
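For instance (a sketch; the fuller pipeline below applies the same filter):

tree -fi | grep -v '>'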
If you want to know the nature of each file, (to read only ASCII files for example) try a while loop:
tree -fi | grep -v \> | while read -r first ; do file "$first" ; done | grep ASCII
@Nakilon what’s the closest thing? Does it display output similarly? How would you easily display similar output with a short command?
Oh, really a long list of answers. It helped a lot, and finally I created my own, which is what I was looking for:
To List All the Files in a directory and its sub-directories:
To List All the Directories in a directory and its sub-directories:
To List All the Directories and Files in a directory and its sub-directories:
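A sketch of the three commands, assuming find rooted at "$PWD" as in the comment below:

find "$PWD" -type f     # all files
find "$PWD" -type d     # all directories
find "$PWD"             # both directories and files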
No need for post-processing with grep; use -name in find, like: find "$PWD" -type f -name "*.json". And if you want to delete the files listed: find "$PWD" -type f -name "*.json" -exec rm {} \; Similarly, if you want to copy them, replace rm with cp and add a destination: -exec cp {} destination
Try the following simpler way:
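Going by the reply below, this is presumably just find pointed at an absolute starting path, e.g.:

find "$(pwd)"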
@SalmanPK If you give find an absolute path like pwd to start with, it will print absolute paths. By the way, "How is this any different than find?" 😉
find without an argument is a syntax error on some platforms. Where it isn't, a bare find is equivalent to find .
Handy for some limited appliance shells where find/locate aren’t available.
I don't know about the full path, but you can use -R for recursion. Alternatively, if you're not bent on ls, you can just do find *.
Using no external commands other than ls:
ls -R1 /path | while read l; do case $l in *:) d=${l%:};; "") d=;; *) echo "$d/$l";; esac; done
@ilw That’s weird; I’d think ls -1 is fairly standard; but try just leaving it out if it’s unsupported. The purpose of that option is to force ls to print one line per file but that’s usually its behavior out of the box anyway. (But then of course, don’t use ls in scripts.) (Looking at the POSIX doco, this option was traditionally BSD only, but was introduced in POSIX in 2017.)
Run a bash command with the following format:
find /path -type f -exec ls -l {} \;
Likewise, to trim away -l details and return only the absolute paths:
You don't need the -exec ls -l {} \; part, since the default behavior of find is to print the full path. That is, find /path -type f does the job if you don't need the file attributes from ls -l.
The easiest way for all you future people is simply:
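Judging from the comments below, this was du; as a sketch (the exact flags in the original answer may have differed; add -a if you also want per-file entries):

du /path/to/dir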
This, however, also shows the size of what's contained in each folder. You can use awk to output only the folder name:
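For example (still assuming du as above; field 2 of its output is the path):

du /path/to/dir | awk '{print $2}'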
Edit: Sorry, my bad. I thought it was only folders that were needed. I'll leave this here in case anyone in the future needs it anyway.
Interesting, because it shows me stuff I didn’t know I wanted to know — kind of like Google suggest. It turns out, I like knowing how much space each file takes.
If you want the freedom to use all possible ls options:
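One way to get that effect (a sketch, not necessarily the original answer's exact command) is to hand find's file list to ls via xargs:

find . -type f -print0 | xargs -0 ls -l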
Don’t make it complicated. I just used this and got a beautiful output:
Sorry man, doesn’t fit the bill. If you need a list of full paths, you won’t get it this way. At least not with bash or zsh on BSD or MacOS
I think for a flat list the best way is:
find -D tree /fullpath/to-dir/
(or in order to save it in a txt file)
find -D tree /fullpath/to-dir/ > file.txt
Here is a partial answer that shows the directory names.
ls -mR * lists the full directory names ending in a ':', then lists the files in that directory separately
sed -n 's/://p' finds lines that contain a colon, strips off the colon and prints the line
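Putting the two pieces together as one pipeline (a sketch):

ls -mR * | sed -n 's/://p'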
By iterating over the list of directories, we should be able to find the directories as well. Still working on it. It is a challenge to get the wildcards through xargs.
Adding a wildcard to the end of an ls directory forces full paths. Right now you have this:
$ ls /home/dreftymac/
foo.txt
bar.txt
stackoverflow
stackoverflow/alpha.txt
stackoverflow/bravo.txt
stackoverflow/charlie.txt
You could do this instead:
$ ls /home/dreftymac/*
/home/dreftymac/.  /home/dreftymac/foo.txt  /home/dreftymac/bar.txt

/home/dreftymac/stackoverflow:
alpha.txt  bravo.txt  charlie.txt
Unfortunately this does not print the full path for directories recursed into, so it may not be the full solution you’re looking for.
Also unfortunately you can’t sudo ls with a wildcard (because the wildcard is expanded as the normal user).
This is slow but works recursively and prints both directories and files. You can pipe it with awk/grep if you just want the file names without all the other info/directories:
tar cf - "$PWD" | tar tvf - | awk '{print $6}' | grep -v "/$"
A lot of answers I see. This is mine, and I think quite useful if you are working on Mac.
I'm sure you know there are some "bundle" files (.app, .rtfd, .workflow, and so on). Looking at a Finder window they seem to be single files, but they are not: ls or find see them as directories. So, unless you need to list their contents as well, this works for me:
find . -not -name ".*" -not -name "." | egrep -v "\.rtfd/|\.app/|\.lpdf/|\.workflow/"
Of course this is for the working directory, and you could add other bundle extensions (but always with a / after them), or any other non-bundle extension without the /.
The ".lpdf/" case (multilingual PDF) is rather interesting: it shows a normal ".pdf" extension (!!) or none at all in Finder. This way you get (or count) a single file for such a PDF rather than a bunch of internal entries…
Recursively iterating over files in a directory in Bash

Often, in the terminal or in a shell script, you may need to recursively walk the files in a directory. In this article we will learn how to recursively iterate over the files in a directory on Linux. You can use these steps in almost every Linux shell.
find is one of the best commands for locating files that match specific criteria. Here is the command to find all the files in the current working directory:
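Going by the options explained below (the dot, -type f and -print0), that command is presumably:

find . -type f -print0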
In the command above, we use a dot (.) to indicate that we want to find files in the current working directory. You can specify a different directory if you want to search in another folder:
find /home/data -type f -print0
We also use -type f to indicate that we want to search only for files, not folders.
The command above will print a list of all relative file paths. We feed its output into a loop to iterate over these files and work with them. In the following code, you can fill in the part between do ... done with whatever you need to run for each file. We use the -print0 option so that all file names are handled correctly, even if they contain spaces or other special characters; with only the -print option, file names containing spaces or special characters are not handled properly.
find . -type f -print0 | while IFS= read -r -d '' i; do
  # code to run for each file
  echo "$i"
done
If you want to find files of a specific type, for example PDF files, you can use the -name option of find, as shown below. The -name option accepts file-extension patterns using wildcards.
find . -type f -name "*.pdf" -print0 | while IFS= read -r -d '' i; do
  # code to run for each file
  echo "$i"
done
Similarly, here is an example that finds files in both .pdf and .doc formats in your folder.
find . -type f \( -name "*.pdf" -o -name "*.doc" \) -print0 | while IFS= read -r -d '' i; do
  # code to run for each file
  echo "$i"
done
You can run the above command directly in the terminal, or add it as part of a shell script.
Recursively List all directories and files
Suppose the directory structure on the file system is like this:

-dir1
   -dir2
      -file1
      -file2
      -dir3
         -file3
         -file4
      -dir4
         -file5
   -dir5
      -dir6
      -dir7

I would like to receive the following output, one path per line. For the directories:

/dir1
/dir1/dir2
/dir1/dir2/dir3
/dir1/dir2/dir4
/dir1/dir5
/dir1/dir5/dir6
/dir1/dir5/dir7

and:

/dir1
/dir1/dir2/file1
/dir1/dir2/file2
/dir1/dir2/dir3/file3
/dir1/dir2/dir3/file4
/dir1/dir2/dir4/file5
/dir1/dir5/dir6
/dir1/dir5/dir7
7 Answers
On Windows, you can use dir to list only directories, to list all files (and no directories), and to redirect the output to a file; a sketch of all three follows.
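Assuming cmd.exe's dir with /s (recurse), /b (bare format, full paths) and the /ad and /a-d attribute filters (the original answer's exact switches may have differed):

REM directories only
dir /s /b /ad

REM files only (no directories)
dir /s /b /a-d

REM redirect the output to a file
dir /s /b /a-d > C:\temp\filelist.txt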
Is it possible to list file sizes without grouping by folder? I’d like to import into Excel and do some reports.
Bash/Linux Shell

find ./ -type d

gives directories from the current working directory, and:

find ./ -type f

gives files from the current working directory.

Bash/Shell, into a file

find ./ -type d > somefile.txt
find ./ -type f > somefile.txt

Replace . by your directory of interest.
On Windows, the following is the most flexible solution, as it additionally allows you to process the directory names.
You use FOR /R to recursively execute batch commands.
Check out this batch file.
@echo off
SETLOCAL EnableDelayedExpansion
SET N=0
for /R %%i in (.) do (
    SET DIR=%%i
    REM put anything here; for instance, the following code adds directory numbers.
    SET /A N=!N!+1
    echo !N! !DIR!
)
Similarly, for files you can pass a pattern as the set instead of the dot; in your case:
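For example (a sketch; substitute whatever glob you need for *):

@echo off
for /R %%i in (*) do echo %%i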
This will give you a list of all the contained items, with directories and files mixed. You can save this output to a temporary file, then extract all lines that start with 'd'; those will be the directories. Lines that start with an 'f' are files.
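A sketch of that approach, assuming the listing was produced with find's %y format specifier (which prints d for directories and f for regular files, matching the description above):

find . -printf '%y %p\n' > /tmp/listing.txt
grep '^d' /tmp/listing.txt     # directories
grep '^f' /tmp/listing.txt     # regular files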
This is an old question, but I thought I’d add something anyhow.
DIR doesn't correctly traverse all the directory trees you want, in particular not the ones on C:. It simply gives up in places because of different protections.

ATTRIB works much better, because it finds more. (Why this difference? Why would MS make one utility work one way and another work differently in this respect? Damned if I know.) In my experience the most effective way to handle this, although it's a kludge, is to get two listings:
attrib /s /d C:\ >%TEMP%\C-with-directories.txt
attrib /s C:\ >%TEMP%\C-without-directories.txt
and get the difference between them. That difference is the directories on C: (except the ones that are too well hidden). For C:, I’d usually do this running as administrator.