Linux: run all scripts in a directory

Run all Bash scripts in a folder at the same time

Let’s suppose I have five Bash (.sh) scripts in a folder (my_folder), and they are named as follows:

script_1.sh script_2.sh script_3.sh script_4.sh script_5.sh 

How can I write a sixth Bash script or just a one liner that will start running all these scripts together? I need the five scripts to start running all together at the same time and not one after the other.

4 Answers

GNU parallel is perfect for this sort of thing. Install with sudo apt install parallel and then:
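The actual command did not survive the copy; with GNU parallel, a minimal sketch (assuming the five scripts live in my_folder) would be:

parallel -j0 bash ::: my_folder/*.sh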

The -j option controls how many processes to run in parallel, and -j0 means "all of them". Set it to another value (e.g. -j20) to run the scripts in batches instead.

Note: there is also a parallel command from the moreutils package. Example usage: parallel -j 20 -- my_folder/*.sh (unlike GNU parallel, -j0 there would run the scripts one at a time).

@jfs good point. But note that sudo apt install parallel will replace /usr/bin/parallel with the one from GNU parallel. So as long as you install it as my answer suggests, it should work as expected.

Yes, I found that out the hard way when unsuccessfully trying parallel commands from the manual while writing my answer to suggest GNU parallel.

To run all scripts at the same time (in parallel) use:

script_1.sh & script_2.sh & script_3.sh & script_4.sh & script_5.sh & 

To run them one after the other (sequentially) use:

script_1.sh && script_2.sh && script_3.sh && script_4.sh && script_5.sh 
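(An aside, not from the original answer: && stops the chain as soon as one script fails; use ; between them to run each one regardless. Also, if you put the five backgrounded scripts into a wrapper script, the shell builtin wait makes the wrapper block until all of them have finished. A minimal sketch:)

#!/bin/bash
# Start all five scripts in the background, then block until
# every background job has exited.
script_1.sh & script_2.sh & script_3.sh & script_4.sh & script_5.sh &
wait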

Enhancement for comments

If you have 200 scripts you want to run at the same time (which might bog down the machine BTW) use this script:

#!/bin/bash
for Script in my_folder/*.sh ; do
    echo bash "$Script" &
done

Set the script attributes to executable with the command:
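The command itself was lost in the copy; presumably something along these lines, where run_all.sh is a hypothetical name for the wrapper script above:

chmod a+x run_all.sh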

The first time you run the script it will only echo the names of the 200 scripts it will be executing. When you are happy the right names are being selected edit the script and change this line:
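The line to change was also lost; the obvious edit is to drop the echo, so the dry-run line

echo bash "$Script" &

becomes

bash "$Script" &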

There are three ways you can call a Bash script from another, as answered here (see the sketch after this list):

  1. Make the other script executable, add the #!/bin/bash line at the top, and add the directory where the file lives to the $PATH environment variable. Then you can call it as a normal command;
  2. Or call it with the source command (alias is .) like this: source /path/to/script ;
  3. Or use the bash command to execute it: /bin/bash /path/to/script ;
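A minimal illustration of the three styles (the script name and path are hypothetical):

# 1. As a normal command: executable, shebang at the top, directory on $PATH
my_script.sh

# 2. Sourced into the current shell (runs in this shell's environment)
source /path/to/my_script.sh   # equivalently: . /path/to/my_script.sh

# 3. With an explicit interpreter (works without a shebang or execute bit)
/bin/bash /path/to/my_script.sh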

In the OP’s case one or more of the 200 scripts did not contain the shebang #!/bin/bash as the first line of the file. As such, option 3 had to be used.

200 Scripts running at the same time

A comment has been raised about whether they are "running at the same time". On a typical 8-CPU system, 25 scripts will be sharing each CPU at the same time, but only one script will execute at any instant until its time slice (measured in milliseconds) runs out. Then the next job receives its fair share of milliseconds, then the next job, and so on.

In loose terms we can say the 200 jobs are running "concurrently" but not "simultaneously"; across 8 CPUs this equates to 25 jobs per CPU:

[Image: thread states diagram]

The image above and the comments below it are from Linux kernel scheduler.


Sanely run all scripts in a directory

I find this a common pattern in split-configurations (on my Debian system), and I quite like it.

Now I’ve created a simple startup script that seems to do all this:

#!/bin/sh
SCRIPTDIR=/etc/scripts/up.d
for SCRIPT in "${SCRIPTDIR}/"*
do
  case "${SCRIPT}" in
    *~|*.bak)
      continue ;;          # skip editor backups
    *)
      if [ -f "${SCRIPT}" ] && [ -x "${SCRIPT}" ]; then
        "${SCRIPT}" "$@"   # run each regular, executable file
      fi
      ;;
  esac
done

Since the pattern is so common, I wonder whether there is already such a script installed on my system, one that has presumably seen more testing and bug-fixing than mine. However, I cannot find one.

Do you know of such a starter-script on Debian systems?

You might want to rename all the scripts to *.sh, *.bash, or something similar. Then you can use *.sh to match your files and be sure nothing else in the directory will be executed. Dumb, but it works.

No, you can only pass a common argument to all the scripts (not separate arguments to separate scripts).

1 Answer

run-parts runs all the executable files named within constraints described below, found in directory directory. Other files and directories are silently ignored. If neither the --lsbsysinit option nor the --regex option is given then the names must consist entirely of ASCII upper- and lower-case letters, ASCII digits, ASCII underscores, and ASCII minus-hyphens. 

The default restrictions ignore files with extensions, tildes, etc. You can pass multiple arguments by passing multiple --arg options to run-parts:

-a, --arg=argument pass argument to the scripts. Use --arg once for each argument you want passed. 

You can build the argument list to be passed:

for i; do args+=(-a "$i"); done
run-parts "${args[@]}" .
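For illustration (run_all is a hypothetical name for a wrapper containing the loop above), calling it with two arguments expands like this:

./run_all foo bar
# effectively runs: run-parts -a foo -a bar .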

It is used, for example, by the default crontab to execute the scripts in the various cron.* directories.
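For instance, Debian's default /etc/crontab contains lines roughly like these (exact times and paths vary between releases):

17 * * * *  root  cd / && run-parts --report /etc/cron.hourly
25 6 * * *  root  test -x /usr/sbin/anacron || ( cd / && run-parts --report /etc/cron.daily )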



Run all shell scripts in folder

I have many .sh scripts in a single folder and would like to run them one after another. A single script can be executed as:

bash wget-some_long_number.sh -H 

Assume my directory is /dat/dat1/files. How can I run bash wget-some_long_number.sh -H for each script, one after another? I understand something along these lines should work: for i in *.sh; do ... ; done

Just put the command after do; your loop variable will be the filename (including the .sh extension). Obviously, this is missing any kind of error handling. What happens if one of the scripts fails?

3 Answers

for f in *.sh; do
    bash "$f"
done

If you want to stop the whole execution when a script fails:

for f in *.sh; do
    bash "$f" || break  # execute successfully or break
    # Or more explicitly: if this execution fails, then stop the `for`:
    # if ! bash "$f"; then break; fi
done

If you want to run, e.g., x1.sh, x2.sh, ..., x10.sh:

for i in `seq 1 10`; do
    bash "x$i.sh"
done

To preserve the exit code of a failed script (responding to @VespaQQ):

#!/bin/bash
set -e
for f in *.sh; do
    bash "$f"
done

@Kirril can you explain what the break does? I do not want the job to stop at any point; I want to skip any failed script and continue to the next.
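(A sketch, not from the original thread, answering the comment above: to skip failures and keep going, simply don't break; optionally log the failure:)

for f in *.sh; do
    bash "$f" || echo "warning: $f failed, continuing" >&2
done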

Why -H? Is it to "Enable ! style history substitution" from man bash, or is it an option not listed in man bash?

@alchemy This is an option not of bash but rather of the command used in the original question ( wget-some_long_number.sh -H ).

There is a much simpler way: you can use the run-parts command, which will execute all scripts in the folder:
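The command itself was lost in the copy; presumably something like:

run-parts /dat/dat1/files

Note that, as the next comment points out, run-parts' default name rules exclude files with a .sh extension; on Debian you can override them, e.g.:

run-parts --regex '\.sh$' /dat/dat1/files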

It won’t execute all scripts. From man run-parts on Debian: ...the names must consist entirely of ASCII upper- and lower-case letters, ASCII digits, ASCII underscores, and ASCII minus-hyphens (note that .sh or . is not allowed); other files and directories are silently ignored. run-parts is primarily used by cron and is not reliable for manual usage.

run-parts was designed for a specific purpose; if you are trying to use it for something sufficiently different, it will not be a very good solution. And of course, if you are not on a Debian-based platform, this particular tool doesn't exist there. printf '%s\n' ./* | sh does much the same thing anyway.


I ran into this problem in a situation where I couldn't use loops, and run-parts works with cron.

Answer:

foo () {
    bash -H "$1"
    #echo "$1"
    #cat "$1"
}
cd /dat/dat1/files      # change directory
export -f foo           # export foo so the subshells that parallel spawns can see it
parallel foo ::: *.sh   # equivalent to putting a & in between each script

This uses GNU parallel, which executes everything in the directory, with the added benefit of it happening at a much faster rate. And it isn't just for script execution; you could put any command in the function and it'll work.


Running a directory full of .sh files with one command

I’ve got a directory with lots of .sh files in it, all of which, when run, open a terminal and begin a youtube-dl download. Since there are so many, I was hoping to find a way to activate them all at once, opening several different terminals immediately, as opposed to executing them all separately. I’m still very new to programming, so I’m not exactly sure how to do this and whether or not I could create a script to run them all, use a command, etc. Any help is appreciated, thanks.

Please note that you can pass many URLs to Youtube-dl and they will be downloaded sequentially. Or you could add all the commands one after the other.

5 Answers

While you can indeed run all .sh files in a directory with a for loop as suggested by Yaron, this is really an overly complex approach to take. Don't write many scripts if the only difference between them is the URL they will download! And there's absolutely no reason to spawn a terminal window for each of them either!

Instead, write your YouTube URLs into a file, one per line:

http://youtube.com/foo
http://youtube.com/bar
http://youtube.com/baz

Then, to download them, use the following (file is the name of the file with the URLs):

while read url; do
    youtube-dl "$url"
done < file

That will download each video in the file, one after the other. If you want to download them all at the same time (not a good idea if you have too many), you can run each download command in the background by adding & :

while read url; do
    youtube-dl "$url" &
done < file
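(An aside, not from the original answer: to run downloads in parallel while capping how many run at once, xargs -P can replace the bare &. A sketch, assuming one URL per line in file:)

xargs -P 4 -n 1 youtube-dl < file   # at most 4 downloads at a time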

