Redirect all output to file in Bash [duplicate]
I know that in Linux, to redirect output from the screen to a file, I can use either > or tee. However, I'm not sure why part of the output is still printed to the screen and not written to the file. Is there a way to redirect all output to a file?
That part is written to stderr, use 2> to redirect it. For example:
foo > stdout.txt 2> stderr.txt
or, if you want both streams in the same file:
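For example (a sketch; foo is replaced here by a command group that writes to both streams, and the file name is a placeholder):

```shell
# The order matters: redirect stdout to the file first, then 2>&1
# duplicates stderr onto the (already redirected) stdout.
{ echo "to stdout"; echo "to stderr" >&2; } > allout.txt 2>&1
```

With the order reversed (2>&1 > allout.txt), stderr would still go to the terminal.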
Note: this works in (ba)sh, check your shell for proper syntax
Well, I found the reference and have deleted my post for having incorrect information. From the bash manual: "ls 2>&1 > dirlist directs only the standard output to dirlist, because the standard error was duplicated from the standard output before the standard output was redirected to dirlist." :)
Also from the bash manual: "There are two formats for redirecting standard output and standard error: &>word and >&word. Of the two forms, the first is preferred. This is semantically equivalent to >word 2>&1."
Two important addenda:

If you want to pipe both stdout and stderr, you have to write the redirections in the opposite order from what works for files: cmd1 2>&1 | cmd2. Putting the 2>&1 after the | would redirect stderr for cmd2 instead.

If both stdout and stderr are redirected, a program can still access the terminal (if any) by opening /dev/tty; this is normally done only for password prompts (e.g. by ssh). If you need to redirect that too, the shell cannot help you, but expect can.
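The ordering difference can be sketched with a command group standing in for cmd1:

```shell
# Correct: merge stderr into stdout *before* the pipe; wc sees both lines.
{ echo out; echo err >&2; } 2>&1 | wc -l    # counts 2 lines

# Misplaced: 2>&1 after the pipe applies to wc, not to the first command;
# "err" bypasses the pipe and goes straight to the terminal, so wc sees 1 line.
{ echo out; echo err >&2; } | wc -l 2>&1
```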
All POSIX operating systems have three standard streams: stdin, stdout, and stderr. stdin is the input stream, which can be fed from a file or from another command's output. stdout is the primary output, which is redirected with >, >>, or |. stderr is the error output, which is handled separately so that errors do not get piped into a command or mixed into a file that expects clean output; normally it is sent to a log of some kind, or dumped directly to the terminal, even when stdout is redirected. To redirect both to the same place, use:
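A sketch of the bash shorthand form (the command group and file name are placeholders):

```shell
# bash shorthand: &> sends both stdout and stderr to the same file
{ echo ok; echo oops >&2; } &> both.txt
```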
EDIT: thanks to Zack for pointing out that the above solution is not portable. Use instead:
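The portable form is presumably the classic ordering (a sketch; the command group and file name are placeholders):

```shell
# POSIX-portable: redirect stdout to the file first,
# then duplicate stderr onto stdout.
{ echo ok; echo oops >&2; } > both.txt 2>&1
```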
If you want to silence the error, do:
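A minimal sketch (the command group is a placeholder for any program):

```shell
# Discard stderr entirely; stdout still comes through.
{ echo wanted; echo noise >&2; } 2> /dev/null
```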
Save all the terminal output to a file
Many GUI terminal emulators allow you to save the scrollback buffer, but this is not accessible to commands (leaving aside xdotool and that sort of black art).
You can use script . It will basically save everything printed on the terminal in that script session.
script makes a typescript of everything printed on your terminal. It is useful for students who need a hardcopy record of an interactive session as proof of an assignment, as the typescript file can be printed out later with lpr(1).
You can start a script session by just typing script in the terminal; all subsequent commands and their output will be saved in a file named typescript in the current directory. You can also save the result to a different file by starting script with a file name:
To logout of the script session (stop saving the contents), just type exit .
$ script output.txt
Script started, file is output.txt
$ ls
output.txt  testfile.txt  foo.txt
$ exit
exit
Script done, file is output.txt
$ cat output.txt
Script started on Mon 20 Apr 2015 08:00:14 AM BDT
$ ls
output.txt  testfile.txt  foo.txt
$ exit
exit
Script done on Mon 20 Apr 2015 08:00:21 AM BDT
script also has many options, e.g. running quietly with -q (--quiet) without showing/saving program messages, or running a specific command with -c (--command) rather than an interactive session. Check man script for more ideas.
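For instance, with util-linux script the two options mentioned can be combined (the command and log-file name here are placeholders; other implementations, such as the BSD one on macOS, use a different syntax):

```shell
# Record a single command instead of an interactive session;
# -q suppresses the "Script started/done" banners.
script -q -c "echo hello from script" session.log
```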
Can it be invoked after the fact? (i.e. At the end of a session) Or does it have to be invoked before the content you want logged?
To export it retroactively, try Terminal menu -> Shell -> Export text as, like here: mactricksandtips.com/2013/04/save-terminals-text-output.html
@Magne you should note in your comment that it is specific to the Terminal utility on Mac. gnome-terminal, for one, does not have that menu, so we should assume that other Linux terminal windows won't have it either.
I too faced the same problem and after some searching came up with this solution:
Add to your .bash_aliases this:
# Execute "script" command just once
smart_script() {
    # if there's no SCRIPT_LOG_FILE exported yet
    if [ -z "$SCRIPT_LOG_FILE" ]; then
        # make folder paths
        logdirparent=~/Terminal_typescripts
        logdirraw=raw/$(date +%F)
        logdir=$logdirparent/$logdirraw
        logfile=$logdir/$(date +%F_%T).$$.rawlog

        # if no folder exists - make one
        if [ ! -d "$logdir" ]; then
            mkdir -p "$logdir"
        fi

        export SCRIPT_LOG_FILE=$logfile
        export SCRIPT_LOG_PARENT_FOLDER=$logdirparent

        # quiet output if no args are passed
        if [ ! -z "$1" ]; then
            script -f "$logfile"
        else
            script -f -q "$logfile"
        fi
        exit
    fi
}

# Start logging into new file
alias startnewlog='unset SCRIPT_LOG_FILE && smart_script -v'

# Manually saves current log file: $ savelog logname
savelog() {
    # make folder path
    manualdir=$SCRIPT_LOG_PARENT_FOLDER/manual

    # if no folder exists - make one
    if [ ! -d "$manualdir" ]; then
        mkdir -p "$manualdir"
    fi

    # make log name from the raw log's file name (strip path and extension)
    logname=${SCRIPT_LOG_FILE##*/}
    logname=${logname%.rawlog}

    # add user logname if passed as argument
    if [ ! -z "$1" ]; then
        logname=$logname'_'$1
    fi

    # make filepaths
    txtfile=$manualdir/$logname'.txt'
    rawfile=$manualdir/$logname'.rawlog'

    # make .rawlog readable and save it to .txt file
    cat "$SCRIPT_LOG_FILE" | perl -pe 's/\e([^\[\]]|\[.*?[a-zA-Z]|\].*?\a)//g' | col -b > "$txtfile"

    # copy corresponding .rawlog file
    cp "$SCRIPT_LOG_FILE" "$rawfile"

    printf 'Saved logs:\n  %s\n  %s\n' "$txtfile" "$rawfile"
}
And to the end of your .bashrc file add this:
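Judging from how the function above is written, the line to add is presumably just a call to it (an assumption; this requires that .bashrc sources .bash_aliases before this point):

```shell
# run the logger once per terminal session (smart_script is defined in .bash_aliases)
smart_script
```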
After you've done this, the script command will be executed once in every terminal session, logging everything to ~/Terminal_typescripts/raw .
If you want, you can save the current session's log after the fact (at the end of the session) by typing savelog or savelog logname - this will copy the current raw log to ~/Terminal_typescripts/manual and also create a readable .txt log in this folder. (If you forget to do so, the raw log files will still be in their folder; you'll just have to find them.) You may also start recording to a new log file by typing startnewlog .
There will be a lot of junk log files, but you can clean old ones from time to time, so it’s not a big problem.
How to redirect output to a file and stdout
In bash, calling foo would display any output from that command on stdout. Calling foo > output would redirect any output from that command to the file specified (in this case 'output'). Is there a way to redirect output to a file and have it display on stdout?
If someone just ended up here looking for capturing error output to file, take a look at — unix.stackexchange.com/questions/132511/…
A note on terminology: when you execute foo > output the data is written to stdout and stdout is the file named output . That is, writing to the file is writing to stdout. You are asking if it is possible to write both to stdout and to the terminal.
@WilliamPursell I’m not sure your clarification improves things 🙂 How about this: OP is asking if it’s possible to direct the called program’s stdout to both a file and the calling program’s stdout (the latter being the stdout that the called program would inherit if nothing special were done; i.e. the terminal, if the calling program is an interactive bash session). And maybe they also want to direct the called program’s stderr similarly («any output from that command» might be reasonably interpreted to mean including stderr).
If we have multiple commands whose output we want to pipe, use ( ) . For example: (echo hello; echo world) | tee output.txt
The command you want is named tee :
For example, if you only care about stdout:
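A minimal sketch (the echo stands in for any program; outfile is a placeholder):

```shell
# tee duplicates its stdin: one copy goes to the file, one to its own stdout
echo "hello" | tee outfile    # prints "hello" and also writes it to outfile
```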
If you want to include stderr, do:
program [arguments ...] 2>&1 | tee outfile
2>&1 redirects channel 2 (stderr/standard error) into channel 1 (stdout/standard output), so that both are written as stdout. tee then writes that combined stream to the given output file as well as to the screen.
Furthermore, if you want to append to the log file, use tee -a as:
program [arguments ...] 2>&1 | tee -a outfile
If OP wants «all output» to be redirected, you’ll also need to grab stderr: «ls -lR / 2>&1 | tee output.file»
@evanmcdonnal The answer is not wrong, it just may not be specific enough, or complete depending on your requirements. There certainly are conditions where you might not want stderr as part of the output being saved to a file. When I answered this 5 years ago I assumed that the OP only wanted stdout, since he mentioned stdout in the subject of the post.
Ah sorry, I might have been a little confused. When I tried it I just got no output, perhaps it was all going to stderr.
Use -a argument on tee to append content to output.file , instead of overwriting it: ls -lR / | tee -a output.file
If you're using $? afterwards it will return the status code of tee, which is probably not what you want. Instead, you can use ${PIPESTATUS[0]} to get the exit status of the first command in the pipeline.
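In bash that looks like the following sketch (PIPESTATUS must be read immediately, since any subsequent command resets it):

```shell
# false exits 1, but $? after the pipeline would report tee's (successful) status
false | tee out.log
status=${PIPESTATUS[0]}    # exit status of the first pipeline stage
echo "$status"             # prints 1
```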
$ program [arguments ...] 2>&1 | tee outfile
2>&1 dumps the stderr and stdout streams. tee outfile takes the stream it gets and writes it to the screen and to the file «outfile».
This is probably what most people are looking for. The likely situation is some program or script is working hard for a long time and producing a lot of output. The user wants to check it periodically for progress, but also wants the output written to a file.
The problem (especially when mixing stdout and stderr streams) is that there is reliance on the streams being flushed by the program. If, for example, all the writes to stdout are not flushed, but all the writes to stderr are flushed, then they’ll end up out of chronological order in the output file and on the screen.
It’s also bad if the program only outputs 1 or 2 lines every few minutes to report progress. In such a case, if the output was not flushed by the program, the user wouldn’t even see any output on the screen for hours, because none of it would get pushed through the pipe for hours.
Update: The program unbuffer, part of the expect package, will solve the buffering problem. It causes stdout and stderr to be written to the screen and the file immediately, and keeps them in sync when they are combined and redirected to tee. E.g.:
$ unbuffer program [arguments ...] 2>&1 | tee outfile