Linux exec all files in directory

Recursively iterate through files in a directory

However, the above does not work for more complex tasks, where a lot of conditional branching, looping, etc. needs to be done. I used to use this for that:

while read line; do [...]; done < <(find . -type f)

$ touch $'a\nb'
$ find . -type f
./a?b

You can solve this, and keep your original design, by changing your read line to IFS= read -r line. The only character that will break it then is a newline.
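For reference, a minimal sketch of that fixed loop (the printf is only a placeholder for whatever per-file work you need):

while IFS= read -r line; do
    # IFS= preserves leading/trailing whitespace, -r stops backslash mangling;
    # only a newline inside a file name can still break this.
    printf 'Found: %s\n' "$line"
done < <(find . -type f)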

4 Answers

while IFS= read -r -d '' -u 9
do
    [Do something with "$REPLY"]
done 9< <( find . -type f -exec printf '%s\0' {} + )

(This works with any POSIX find, but the shell part requires bash. With *BSD and GNU find, you can use -print0 instead of -exec printf '%s\0' {} +; it will be slightly faster.)

This makes it possible to use standard input within the loop, and it works with any path.
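As a sketch, here is the same loop using GNU/*BSD find's -print0 instead of the -exec printf trick; -d '' makes read split on NUL bytes, and file descriptor 9 keeps stdin free for commands inside the loop (the printf is a placeholder):

while IFS= read -r -d '' -u 9 file; do
    # "$file" is safe here even if the path contains spaces, tabs or newlines
    printf 'Processing %s\n' "$file"
done 9< <(find . -type f -print0)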

Because I had to look it up: "read: If no names are supplied, the line read is assigned to the variable REPLY." So do echo "Filename is '$REPLY'".

Doing this is as simple as:

find -exec sh -c 'inline script "$0"' {} \;
find -exec executable_script {} \;

The first example gave a whole bunch of inline: not found errors for me, but this does what I expect: find -exec sh -c 'echo inline script "$0"' {} \;
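A hedged sketch of the -exec sh -c idiom with an actual inline script: the underscore fills $0, the found paths arrive as positional parameters, and + batches many files per shell invocation (use \; if you need exactly one file per run). The printf stands in for real work:

find . -type f -exec sh -c '
    for path in "$@"; do
        printf "processing %s\n" "$path"
    done
' _ {} +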

The simplest (yet safe) approach is to use shell globbing:

$ for f in *; do printf ":%s:\n" "$f"; done
:a b:
:c
d:
:-e:
:e f:
h:

To make the above recurse into subdirectories (in bash), you can use the globstar option; also set dotglob to match files whose name begins with . :

$ shopt -s globstar dotglob
$ for f in **/*; do printf ":%s:\n" "$f"; done
:a b:
:c
d:
:-e:
:e f:
:foo:
:foo/file1:
:foo/file two:
h:

Beware that up to bash 4.2, **/ recurses into symbolic links to directories. Since bash 4.3, **/ recurses only into directories, like find .
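Putting the globbing pieces together, a minimal sketch assuming bash 4 or later; nullglob makes the loop simply not run when nothing matches, and the -f test skips the directories that **/* also expands to:

shopt -s globstar dotglob nullglob
for f in **/*; do
    [ -f "$f" ] || continue     # only regular files
    printf 'file: %s\n' "$f"    # placeholder for the real per-file work
done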


Another common solution is to use find -print0 with xargs -0 :

$ touch -- 'a b' $'c\nd' $'e\tf' $'g\rh' '-e'
$ find . -type f -print0 | xargs -0 -I{} printf ":%s:\n" {}
h:/g
:./e f:
:./a b:
:./-e:
:./c
d:

Note that the h:/g is actually correct since the file name contains a \r .
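When the command you run accepts several file arguments at once, you can drop -I and let xargs batch the NUL-delimited paths; wc -l here is just a stand-in for the real command:

find . -type f -print0 | xargs -0 wc -l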


bash script read all the files in directory

How do I loop through a directory? I know there is for f in /var/files; do echo $f; done. The problem with that is it will spit out all the files inside the directory all at once. I want to go one by one and be able to do something with the $f variable. I think a while loop would be best suited for that, but I cannot figure out how to actually write it. Any help would be appreciated.

The for loop is exactly right, but you are looping over a single item, the literal directory name /var/files . Your problem description is incorrect; the program you posted will simply echo /var/files . I suspect you may want for f in /var/files/* . Take care to use double quotes around "$f" everywhere.
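A minimal sketch of that suggestion, assuming the files really live under /var/files:

for f in /var/files/*; do
    echo "processing: $f"    # one entry per iteration; the quotes keep names with spaces intact
done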

3 Answers

A simple loop should be working:

for file in /var/*
do
    # whatever you need with "$file"
done

@Mu_Qiao - I have two commands in the shell script after the do: the first echoes $file and the second echoes "hi". I have 10 files in the directory, but I'm getting the ten filenames and then the "hi"s, rather than 1, hi, 2, hi, 3, hi, etc.

To write it with a while loop you can do:

ls -f /var | while read -r file; do cmd "$file"; done

The primary disadvantage of this is that cmd is run in a subshell, which causes some difficulty if you are trying to set variables. The main advantages are that the shell does not need to load all of the filenames into memory, and there is no globbing. When you have a lot of files in the directory, those advantages are important (that's why I use -f on ls; in a large directory ls itself can take several tens of seconds to run and -f speeds that up appreciably. In such cases 'for file in /var/*' will likely fail with a glob error.)
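If you need variables set inside the loop to survive, a bash-specific sketch that feeds the same ls -f output through process substitution instead of a pipe, so the loop runs in the current shell; cmd is a placeholder:

count=0
while IFS= read -r file; do
    cmd "$file"
    count=$((count + 1))
done < <(ls -f /var)     # note: ls -f also lists . and ..
echo "processed $count entries"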



How to perform grep operation on all files in a directory?

I'm working with xenserver, and I want to perform a command on each file that is in a directory, grepping some stuff out of the output of the command and appending it to a file. I'm clear on the command I want to use and how to grep out the string(s) I need. What I'm not clear on is how to have it perform the command on each file, going on to the next, until no more files are found.

5 Answers

In Linux, I normally use this command to recursively grep for a particular text within a directory:

grep -rni "pattern" .

  • r = recursive, i.e., search subdirectories within the current directory
  • n = print the line numbers to stdout
  • i = case-insensitive search

grep $PATTERN * would be sufficient. By default, grep skips subdirectories. If you want to grep through them as well, use grep -r $PATTERN *.

@Tomáš Zato, just supply all your file patterns instead of *: grep $PATTERN *.cpp *.h . If you need more specific rules for what files should be grepped, use find command (check Rob's answer).

@Chris it's possible you don't have *.scss files in current directory but somewhere deeper in subdirs so grep does not look in all the files you wanted. You should use --include option to tell grep to look recursively for files that matches specific patterns: grep -r x --include '*.scss' . (note the quotes, they prevent the pattern from being expanded by the shell). Or just use find (see Rob's answer).

You want grep -s so you don't get a warning for each subdirectory that grep skips. You should probably double-quote "$PATTERN" here.
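To tie these suggestions together, two hedged variants; "pattern" and the *.conf filter are placeholder values:

grep -rni --include='*.conf' 'pattern' .                  # GNU grep: recurse, filter by file name
find . -type f -exec grep -ni 'pattern' /dev/null {} +    # portable: find feeds the files to grep

The /dev/null argument just forces grep to print the file name even when it ends up being called with a single file.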



Recursively read folders and execute a command on each of them

I am trying to recurse into folders and then run commands on them, using a bash script. Any suggestions?

The problem is unclear. Do you simply want to use find to execute a command on all the files beneath a given directory?

7 Answers

If you want to recurse into directories, executing a command on each file found in those, I would use the find command, instead of writing anything using shell-script, I think.

That command can take lots of parameters, like -type to filter the types of files returned, or -exec to execute a command on each result.

For instance, to find directories that are under the one I'm currently in:

find . -type d -exec echo "Hello, '{}'" \;

Which will get me something like:

Hello, '.'
Hello, './.libs'
Hello, './include'
Hello, './autom4te.cache'
Hello, './build'
Hello, './modules'

Same thing to find the files under the current directory:

find . -type f -exec echo "Hello, '{}'" \;

which will get me something like this:

Hello, './config.guess'
Hello, './config.sub'
Hello, './.libs/memcache_session.o'
Hello, './.libs/memcache_standard_hash.o'
Hello, './.libs/memcache_consistent_hash.o'
Hello, './.libs/memcache.so'
Hello, './.libs/memcache.lai'
Hello, './.libs/memcache.o'
Hello, './.libs/memcache_queue.o'
Hello, './install-sh'
Hello, './config.h.in'
Hello, './php_memcache.h'
...

Some would say "it's not shell". But why re-invent the wheel ?
(And, in a way, it is shell ^^ )
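For instance, a sketch of actually running a command from inside each directory found, rather than just echoing its name (the inner ls is only a placeholder):

find . -type d -exec sh -c '
    for dir in "$@"; do
        ( cd "$dir" && ls )    # subshell so the cd does not leak between directories
    done
' _ {} +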

For more information, you can take a look at:

