Print list of files in a directory to a text file (but not the text file itself) from terminal
I would like to print all the filenames of every file in a directory to a .txt file. Let’s assume that I had a directory with 3 files:
file1.txt file2.txt file3.txt
If I then run ls > output.txt, the resulting file contains:
file1.txt file2.txt file3.txt output.txt
Is there a way to avoid printing the name of the file where I’m redirecting the output? Or better is there a command able to print all the filenames of files in a directory except one?
@Jeremy It’s just easier for me to work with it. But yes, I could create the file in another directory completely avoiding the problem. I didn’t think about that
6 Answers
Note that this assumes that there’s no preexisting output.txt file — if so, delete it first.
- printf '%s\n' * uses globbing (filename expansion) to robustly print the names of all files and subdirectories in the current directory, one per line.
- Globbing happens before output.txt is created by the output redirection > output.txt (the shell sets up the redirection before the command runs, which is what causes your problem), so its name is not included in the output.
- Globbing also avoids the use of ls , whose use in scripting is generally discouraged.
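Putting the pieces together, a quick demonstration (the directory and file names are just examples, matching the question):

```shell
# Throwaway demo directory with the three files from the question.
mkdir -p /tmp/printf_demo && cd /tmp/printf_demo
touch file1.txt file2.txt file3.txt
rm -f output.txt                 # no preexisting output.txt (see note above)

# The glob * is expanded before the redirection creates output.txt,
# so output.txt does not appear in its own contents.
printf '%s\n' * > output.txt
cat output.txt
```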
In general, it is not good to parse the output of ls , especially while writing production quality scripts that need to be in good standing for a long time. See this page to find out why: Don’t parse ls output
In your example, output.txt is part of the output of ls > output.txt because the shell sets up the redirection (creating output.txt) before running ls.
The simplest way to get the right behavior for your case would be:
ls file*txt > output.txt # as long as you are looking for files named that way
or, store the output in a hidden file (or in a normal file in some other directory) and then move it to the final place:
ls > .output.txt && mv .output.txt output.txt
A more generic solution would be using grep -v :
ls | grep -vFx output.txt > output.txt
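To see why this works, a small demonstration (paths are just examples): here ls does see output.txt, because the redirection creates it before ls runs, but grep -vFx filters that exact name back out (-F literal match, -x whole line).

```shell
mkdir -p /tmp/grepv_demo && cd /tmp/grepv_demo
touch file1.txt file2.txt file3.txt
rm -f output.txt

# output.txt exists by the time ls runs, but grep removes its line.
ls | grep -vFx output.txt > output.txt
cat output.txt
```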
files=( "$(ls)" )
printf '%s\n' "${files[@]}" > output.txt
I think that still runs afoul of multiline filenames where one of the lines is output.txt (admittedly an exotic case). As for the other solution: why not just use a string variable instead of a single-element array? files=$(ls); printf '%s\n' "$files" > output.txt
ls has an ignore option, and we can also use the find command.
- Using ls with ignore option
ls -I "output.txt" > output.txt
ls --ignore "output.txt" > output.txt
-I and --ignore are the same option. As the man page says, it does not list implied entries matching shell PATTERN.
find \! -name "output.txt" > output.txt
The -name option in find matches files/directories whose names match the pattern; ! -name excludes those whose names match it.
find \! -name "output.txt" -printf '%P\n' > output.txt
%P strips the path and gives only names.
Your commands assume the use of GNU utilities (Linux) — please make that prerequisite clear in your answer. For the find commands to be equivalent, you (a) need -maxdepth 1 to prevent recursion, and (b) you must sort the list of filenames.
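A sketch applying both of those fixes (GNU find assumed; the demo directory and file names are invented for illustration):

```shell
mkdir -p /tmp/findfix_demo/subdir && cd /tmp/findfix_demo
touch file1.txt file2.txt file3.txt subdir/nested.txt
rm -f output.txt

# -mindepth 1 drops the "." entry itself, -maxdepth 1 prevents recursion,
# %P prints names without the leading ./, and sort matches ls's ordering.
find . -mindepth 1 -maxdepth 1 \! -name output.txt -printf '%P\n' | sort > output.txt
cat output.txt
```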
The safest way, without assuming anything about the file names, is to use a bash array (in memory) or a temporary file. A temporary file does not need memory, so it may be even safer. Something like:
#!/bin/bash
tmp=$(tempfile)
ls > "$tmp"
mv "$tmp" output.txt
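The same idea works with mktemp, which is more widely available than the Debian-specific (and deprecated) tempfile utility; the demo directory here is just an example:

```shell
mkdir -p /tmp/mktemp_demo && cd /tmp/mktemp_demo
touch file1.txt file2.txt
rm -f output.txt

tmp=$(mktemp) || exit 1
ls > "$tmp"          # listing is taken while output.txt does not exist yet
mv "$tmp" output.txt # then the temporary file is moved into place
cat output.txt
```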
Using ls and awk commands you can get the correct output.
ls -ltr | awk '/txt/ {print $NF}' > output.txt
This prints only the filenames (the last field of each long-listing line) whose names contain "txt".
Note that the shell always expands all globs before running the command. In your specific case, the glob expansion goes like this:
# "ls *.txt > output.txt" is expanded to:
ls file1.txt file2.txt file3.txt > output.txt
The reason you get "output.txt" in your final output file is that the shell sets up the redirection before it runs ls.
That means output.txt is created (empty) before ls ever reads the directory, so by the time ls lists the entries, output.txt already exists and is included in the listing.
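A short demonstration of the ordering (the demo directory is an example): with a bare ls the redirection target exists before ls reads the directory, so it lists itself; with a glob, expansion happens first, so it does not.

```shell
mkdir -p /tmp/order_demo && cd /tmp/order_demo
touch file1.txt file2.txt file3.txt
rm -f plain.txt glob.txt

ls > plain.txt        # plain.txt is created first, then listed by ls
ls *.txt > glob.txt   # *.txt was expanded before glob.txt existed

grep -cx plain.txt plain.txt        # prints 1: plain.txt lists itself
grep -cx glob.txt glob.txt || true  # prints 0: glob.txt is absent from its own listing
```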
Get a list of all files in folder and sub-folder in a file
How do I get a list of all files in a folder, including all the files within all the subfolders and put the output in a file?
7 Answers
You can do this on the command line, using the -R switch (recursive) and then redirecting the output to a file, thus:
ls -R > filename1
This will make a file called filename1 in the current directory, containing a full directory listing of the current directory and all of the sub-directories under it.
You can list directories other than the current one by specifying the full path, e.g.:
ls -R /var > filename2
This will list everything in and under /var and put the results in a file in the current directory called filename2. This works on directories owned by another user, including root, as long as you have read access to the directories.
You can also list directories you don’t have access to such as /root with the use of the sudo command. eg:
sudo ls -R /root > filename3
This would list everything in /root, putting the results in a file called filename3 in the current directory. Since most Ubuntu systems have nothing in this directory, filename3 will not contain anything, but it would work if it did.
Maybe telling the person to cd into the directory first could be added to the answer. Also, this works fine if I own the directory, but when I tried it in a directory owned by, say, root, it didn't: I got the usual permission denied, and sudo followed by your command also gave permission denied. Is there a workaround without logging in as root?
Well, I did say "current" directory. The correct use of cd might be the subject of another question, and I'm sure it has been. You can list directories owned by root as long as you have read access to them. It's hard to imagine why you'd want to list directories owned by root to which you don't have read access, but sudo does indeed work if you give the full path. I'm adding examples for both of these, but excluding the use of cd.
Just use the find command with the directory name. For example, to see the files and all files within folders in your home directory, use
find ~
Also check find's GNU info page by running info find in a terminal.
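A sketch writing the recursive listing to a file (the directory tree here is invented for the demo); adding -type f restricts the list to regular files:

```shell
mkdir -p /tmp/recur_demo/sub
touch /tmp/recur_demo/top.txt /tmp/recur_demo/sub/nested.txt

# find descends into every subdirectory; sort makes the order predictable.
# Redirect outside the tree so the listing file does not list itself.
find /tmp/recur_demo -type f | sort > /tmp/recur_listing.txt
cat /tmp/recur_listing.txt
```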
This is the most powerful approach. find has many parameters to customize output format and file selection.
That’s the best approach in my opinion. Simple and practical. Could also do $ find . > output if there’s many directories.
tree
An alternative to recursive ls is the command line tool tree, which comes with quite a lot of options to customize the format of the output displayed. See the manpage for tree for all options.
tree --charset=ascii
will give you the same as tree using other characters for the lines.
tree -a
will display hidden files too.
- Go to the folder you want to get a content list from.
- Select the files you want in your list ( Ctrl + A if you want the entire folder).
- Copy the content with Ctrl + C .
- Open gedit and paste the content using Ctrl + V . It will be pasted as a list and you can then save the file.
This method will not include subfolder content, though.
You could also use the GUI counterpart to Takkat’s tree suggestion which is Baobab. It is used to view folders and subfolders, often for the purpose of analysing disk usage. You may have it installed already if you are using a GNOME desktop (it is often called disk usage analyser).
sudo apt-get install baobab
You can select a folder and also view all its subfolders, while also getting the sizes of the folders and their contents as the screenshot below shows. You just click the small down arrow to view a subfolder within a folder. It is very useful for gaining a quick insight into what you’ve got in your folders and can produce viewable lists, but at the present moment it cannot export them to file. It has been requested as a feature, however, at Launchpad. You can even use it to view the root filesystem if you use gksudo baobab .
(You can also get a list of files with their sizes by using ls -shR ~/myfolder and then export that to file.)
How to display contents of all files under a directory on the screen using unix commands
Using the cat command as follows, we can display the content of multiple files on the screen: cat file1 file2 file3. But if a directory contains more than 20 files and I want the content of all of them displayed, how can I do this without naming every file in the cat command as above?
7 Answers
You can use the * character to match all the files in your current directory.
cat * will display the content of all the files.
If you want to display only files with a .txt extension, you can use cat *.txt, or if you want to display all the files whose filenames start with "file" like your example, you can use cat file*
It works for the current directory, but what if I want to display the content of files under a subdirectory of the current directory?
Otherwise do find . -type f -exec cat {} \; Be careful you don't have any non-text (binary) files, as you might mess up your display if you cat those.
@bvb that wasn’t in your question, but yes, if you want to go to one subdirectory below too, you can do a cat * */* .
If it’s just one level of subdirectory, use cat * */* Otherwise,
which means: run the find command to search the current directory (.) for all ordinary files (-type f). For each file found, run the application (-exec) cat, with the current file name as a parameter (the {} is a placeholder for the filename). The escaped semicolon is required to terminate the -exec clause.
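Extending the find/-exec idea, it helps to print each filename as a header before its contents so the combined output stays readable. The sh -c wrapper is one common way to run two commands per file; the paths below are demo examples.

```shell
mkdir -p /tmp/catall_demo/sub
printf 'alpha\n' > /tmp/catall_demo/a.txt
printf 'beta\n'  > /tmp/catall_demo/sub/b.txt

# For each regular file, sh receives its path as $1, prints a header line,
# then cats the file itself.
find /tmp/catall_demo -type f -exec sh -c 'echo "== $1 =="; cat "$1"' _ {} \;
```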