Redirecting STDERR to file and screen, STDOUT only to file


 
# 15  
Old 09-07-2012
Quote:
Originally Posted by Lem
Just for the sake of curiosity, I was wondering whether we could try an ugly workaround (I know it's ugly): if GNU ls splits its error messages into too many small writes, what about a little pause between ls and tee?

It seems to work.

... <snip> ....

Code:
for i in {1..100}; do { exec 3>&1; ls file none 2>&1 1>&3 | { sleep 1; tee -a /dev/null; }; } |grep nonefile; done |wc -l
0


I realize that you are aware your suggestion isn't elegant, but beyond that, its utility is very limited. It may work for that small sample, but it won't scale. In this case, the delay is long enough for the system to copy the ls stderr writes into a kernel pipe buffer, and for ls to move on and flush its stdout stream, before tee gets a chance to write to its stdout. But if there were a lot of error messages -- e.g. from many non-existent files -- the delay wouldn't be long enough. Nor could you simply extend the delay without limit: once the pipe buffer fills, an ls stderr write() blocks, and no matter how long you waited, the ls stdout stream would not be flushed before ls stderr messages flowed downstream through tee to grep.
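For reference, the limit at which those write()s start to block is the pipe's capacity; the related PIPE_BUF constant (the largest write a pipe is guaranteed to accept atomically) can be queried with getconf -- a quick sketch, any accessible path will do as the argument:

```shell
# POSIX guarantees PIPE_BUF >= 512; Linux typically reports 4096.
# The total pipe capacity (bytes buffered before write() blocks) is
# usually larger still (64 KiB on modern Linux), but it is finite.
getconf PIPE_BUF /
```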

The following is a much better workaround:
Code:
{
    exec 3>&1                     # fd 3 duplicates this group's stdout (the pipe to cat)
    ls iexist idont 2>&1 1>&3 |   # ls stderr into the pipe; ls stdout straight to fd 3
    tee -a /dev/null |            # tee the stderr stream (a real log would go here)
    perl -lpe 'BEGIN {$|=1;}'     # read whole lines; write each line at once, unbuffered
} | cat > logfile

The trace confirms that it's not just luck or timing; the writes are indeed coalesced into full lines:
Code:
# 4271: tee (since tee does not buffer, it appears to mimic GNU ls's write behavior)
# 4272: perl
4271  write(1, "ls: ", 4) = 4
4271  write(1, "cannot access idont", 19) = 19
4271  write(1, ": No such file or directory", 27) = 27
4271  write(1, "\n", 1)                 = 1
4272  write(1, "ls: cannot access idont: No such"..., 51) = 51

There are undoubtedly small utilities out there to absorb a stream and increase the level of buffering, but I have never bothered to search for them.
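For the record, such a stage can also be improvised with awk (a sketch, assuming a POSIX awk): awk assembles complete records from its input no matter how the writer fragmented its write()s, and fflush() defeats awk's full buffering when its stdout is a pipe.

```shell
{
    exec 3>&1
    ls iexist idont 2>&1 1>&3 |
    tee -a /dev/null |
    awk '{ print; fflush() }'    # re-emit each complete line in one write
} | cat > logfile
```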

From the ugly-solution-dept: The following works, but it depends on too many unreliable implementation details to be a reassuring solution:
Code:
# ls (GNU coreutils) 8.13
# GNU bash, version 4.2.24(1)-release (x86_64-pc-linux-gnu)

{
    exec 3>&1
    ls iexist idont 2>&1 1>&3 | tee -a /dev/null | 
    {
        while IFS= read -r linebuf; do
            printf '%s\n' "$linebuf"
        done
    }
} | cat > logfile

I definitely recommend the perl solution over a sh/bash hack, because there's nothing preventing the printf builtin from being fully buffered when writing to a pipe. The correct solution to this problem requires line-oriented buffering (with lines no longer than PIPE_BUF, of course, so each write is atomic).
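Putting it together for this thread's original goal (stdout to the file only, stderr to both screen and file) -- a sketch along the same lines; logfile and the sh -c stand-in for the real script are placeholders:

```shell
# stdout goes straight to logfile via fd 3; stderr is coalesced into
# whole lines by perl, then tee shows each line on the screen and
# appends it to the same logfile.
exec 3>>logfile
sh -c 'echo "to stdout"; echo "to stderr" >&2' 2>&1 1>&3 |
perl -lpe 'BEGIN { $| = 1; }' |
tee -a logfile
exec 3>&-
```

Both writers open logfile in append mode, so as long as each write is a single line no longer than PIPE_BUF, lines from the two streams will not be intermixed.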

Regards,
Alister

Last edited by alister; 09-07-2012 at 10:12 PM..
# 16  
Old 09-08-2012
Quote:
Originally Posted by alister
In this case, the delay is long enough for the system to copy the ls stderr writes to a kernel buffer and for ls to move on and flush its stdout stream before tee gets a chance to write to its stdout. But, if there were a lot of error messages -- e.g. from many non-existent files -- the delay wouldn't be long enough.
Of course, I saw this. But I thought: "you work more, you just need to sleep a bit more..." Ha ha, LOL.

Quote:
Also, you couldn't simply extend the delay without limit because eventually an ls stderr write() would block and no matter how long you waited, the ls stdout stream would not be flushed before ls stderr messages flow downstream through tee to grep.
This is what I didn't think of. I see it now. Thanks.
--
Bye
# 17  
Old 09-08-2012
Fantastic! This is one of the threads that keep me coming here again and again.

When I wrote my first post I had the following script as data source for tests (AIX 6.1, latest TL):

Code:
#! /bin/ksh

print -u1 "This goes to stdout."
print -u2 "This goes to stderr."

exit 0

I understand now why that worked with my redirections but might fail in other, more real-life situations.
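By the way, a test source closer to GNU ls's real behaviour would emit its stderr in several small write()s rather than one whole line per print -- a sketch in plain sh (the message text is made up):

```shell
#!/bin/sh
# These three printf calls reach stderr as three separate write()s,
# so a reader on the other end of a pipe can see the fragments
# interleaved with other output -- just as with GNU ls.
printf 'test.sh: ' >&2
printf 'cannot access nofile' >&2
printf ': No such file or directory\n' >&2
printf 'This goes to stdout.\n'
```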

My focus is on writing scripts to accomplish tasks, and I want my own messages (error and warning/info) to be as robust as possible, so here is what I suggest as a solution for writing custom scripts:

- Put a time stamp in every line of output. Even if the lines in the output file end up out of order, a simple "sort" will put them back in order. I use a certain "output format" for my lines, which is consistent across scripts:

[PID TIMESTAMP MSGCLASS message]

where MSGCLASS is either "Info", "Warning" or "Error".

- Work similar to "syslog": all messages (error/warning/info) go to an info log, and error messages also go to a separate error log. This way you can avoid the elaborate redirection gymnastics for the usual things you want to achieve.
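A minimal sketch of such a convention (the function and log file names are my own invention, not a standard):

```shell
#!/bin/sh
# Every line gets [PID TIMESTAMP MSGCLASS message]; everything goes to
# the info log, Error lines additionally to the error log.
INFOLOG=./script.info.log
ERRLOG=./script.err.log

logmsg () {
    _class=$1 ; shift
    _line="[$$ $(date '+%Y-%m-%d_%H:%M:%S') $_class $*]"
    printf '%s\n' "$_line" >> "$INFOLOG"
    if [ "$_class" = "Error" ] ; then
        printf '%s\n' "$_line" >> "$ERRLOG"
    fi
}

logmsg Info    "backup started"
logmsg Warning "target nearly full"
logmsg Error   "target not mounted"
```

Because the timestamp is the second field of every line, a `sort -k2` over a merged file restores chronological order even across processes.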

I hope this helps.

bakunin
# 18  
Old 09-10-2012
Quote:
Originally Posted by bakunin

...

Code:
script 2>&1 1>/some/file | tee -a /some/file

This finally does what we want: output to <stdout> is put into "/some/file", output to <stderr> is being displayed before being appended to /some/file" too.

The only uncertainty left is that I am not sure if the exact sequence of the messages will be preserved, especially if there is high load and many messages. You will have to try that. I'd be thankful if you could post a follow-up telling us this.
Hi bakunin,
thank you for your notes! I tried your adjustments, but STDOUT is always displayed on screen.

I need this for a cronjob. Currently everything (STDOUT & STDERR) is written to a log, but I want to be informed when an error occurs.
# 19  
Old 09-10-2012
As it is, alister did most of the work and you should thank him.

Quote:
Originally Posted by thuranga
I tried your adjustments, but STDOUT is always displayed on screen.
hmm, that is unexpected. With the example script I gave (see earlier in this thread) I was able to get the result you wanted, albeit we learned from alister that it was more luck than design and won't work at larger scales.

Quote:
I need this for a cronjob. Actually everything (STDOUT & STDERR) is written to a log. But I want to get informed, when an error occurs.
A cron job has no terminal attached to it at all and output to <stdout> is usually mailed to the owner of the cron job. This is the reason why <stderr> and <stdout> in cronjobs are always redirected - you don't want to get all these mails.
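In your case that mail behaviour can work for you instead of against you: redirect stdout to the log and route stderr through tee, so it lands in the log and still reaches cron's mail. A sketch (path and schedule are placeholders, and the interleaving caveats alister discussed earlier apply):

```shell
# crontab entry: routine stdout is appended to the log; stderr is
# appended to the same log by tee AND passed on to cron's stdout, so
# cron mails it to the job's owner -- you get mail only on errors.
0 2 * * * /path/to/script.sh 2>&1 >>/var/log/script.log | tee -a /var/log/script.log
```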

If you want output to go to the "system console" (don't confuse this with a terminal - the console can be any terminal, but not every terminal is the console), use syslog's facilities instead of plain output. Syslog messages can be configured to go to the system console or to every terminal. An example of this is the "shutdown" command, which usually prints a "The system is about to go down" message on every terminal. This is done via a syslog facility.

I hope this helps.

bakunin