Just out of curiosity, I was wondering whether we could try an ugly workaround (I know it's ugly, at the very least): if GNU ls splits its error messages across too many writes, what about a little pause between ls and tee?
It seems to work.
... <snip> ....
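The snipped command isn't preserved above, but the idea was presumably a pipeline of this shape (a hypothetical reconstruction; the file names are made up):
Code:
# stdout goes straight to the file; stderr goes down the pipe, where a
# short sleep delays tee in the hope that ls finishes all of its
# writes before tee starts reading
ls file1 nofile1 nofile2 2>&1 >/some/file | { sleep 1; tee -a /some/file; }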
I realize that you are aware your suggestion isn't elegant, but aside from that, its utility is very limited. It may work for that small sample, but it won't scale. In this case, the delay is long enough for the system to copy the ls stderr writes to a kernel buffer and for ls to move on and flush its stdout stream before tee gets a chance to write to its own stdout. But if there were a lot of error messages -- e.g. from many non-existent files -- the delay wouldn't be long enough. Also, you couldn't simply extend the delay without limit: eventually an ls stderr write() would block, and then no matter how long you waited, the ls stdout stream would not be flushed before ls stderr messages flowed downstream through tee to grep.
The following is a much better workaround:
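The code itself is snipped in this excerpt; judging from the discussion below, the workaround is a line-buffered relay in front of tee. A sketch of that idea (my reconstruction; /some/file is a placeholder):
Code:
# perl -p reads its input line by line; $| = 1 flushes after every
# print, so each error message reaches tee as one atomic, full-line
# write (atomic because each line is shorter than PIPE_BUF)
ls file1 nofile1 nofile2 2>&1 >/some/file |
    perl -pe 'BEGIN { $| = 1 }' |
    tee -a /some/file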
The trace confirms that it's not just luck or timing; the writes are indeed coalesced into full lines:
There are undoubtedly small utilities out there to absorb a stream and increase the level of buffering, but I have never bothered to search for them.
From the ugly-solution-dept: The following works, but it depends on too many unreliable implementation details to be a reassuring solution:
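Again, the code is snipped here; the kind of shell hack meant would look something like this (a hypothetical sketch):
Code:
# re-assemble stderr into whole lines with the shell itself; whether
# printf then writes them unbuffered is implementation-dependent
ls file1 nofile1 nofile2 2>&1 >/some/file |
while IFS= read -r line; do
    printf '%s\n' "$line"
done | tee -a /some/file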
I definitely recommend the perl solution over a sh/bash hack, because there's nothing preventing the printf builtin from being fully buffered when writing to a pipe. The correct solution to this problem requires line-oriented buffering (with lines no longer than PIPE_BUF, of course).
Regards,
Alister
Quote:
In this case, the delay is long enough for the system to copy the ls stderr writes to a kernel buffer and for ls to move on and flush its stdout stream before tee gets a chance to write to its own stdout. But if there were a lot of error messages -- e.g. from many non-existent files -- the delay wouldn't be long enough.
Of course, I saw this. But I thought: "more work? Then you just need to sleep a bit more..." Ha ha, LOL.
Quote:
Also, you couldn't simply extend the delay without limit: eventually an ls stderr write() would block, and then no matter how long you waited, the ls stdout stream would not be flushed before ls stderr messages flowed downstream through tee to grep.
This is what I didn't think of. I see it now. Thanks.
--
Bye
Fantastic! This is one of the threads that keep me coming here again and again.
When I wrote my first post I had the following script as a data source for tests (AIX 6.1, latest TL):
I understand now why that worked with my redirections, but might fail in other, more real-life situations.
My focus is on writing scripts to accomplish tasks, and I want my own messages (error and warning/info) to be as robust as possible, so here is what I suggest as a solution for writing custom scripts:
- Put a time stamp in every line of output. Even if the lines in the output file become disorganized, a simple "sort" will put them back in order. I use a fixed output format for my lines, consistent across scripts:
[PID TIMESTAMP MSGCLASS message]
where MSGCLASS is either "Info", "Warning" or "Error".
- Work similar to "syslog": send all messages (error/warning/info) to an info log, and error messages additionally to a separate error log. This way you can avoid the elaborate redirection gymnastics for the usual things you want to achieve; a sketch of such a logging function follows below.
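To make that concrete, here is a minimal sketch of such a logging function (the function name and log paths are mine, not from the original post):
Code:
INFOLOG=/some/path/script.info.log   # receives all messages
ERRLOG=/some/path/script.err.log     # receives error messages only

f_msg () {
    # usage: f_msg <Info|Warning|Error> <message ...>
    typeset class="$1" line
    shift
    line="[$$ $(date '+%Y%m%d%H%M%S') $class $*]"
    printf '%s\n' "$line" >>"$INFOLOG"
    [ "$class" = "Error" ] && printf '%s\n' "$line" >>"$ERRLOG"
    return 0
}

f_msg Info "backup started"
f_msg Error "backup of /home failed"
Because every line begins with the PID and a sortable time stamp, a plain sort on the time stamp field restores chronological order even if writes were interleaved.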
The redirection below finally does what we want: output to <stdout> goes into "/some/file", while output to <stderr> is displayed before being appended to "/some/file" too.
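A sketch of that pipeline (some_command and /some/file are placeholders; a reconstruction, not necessarily the original posting):
Code:
# 2>&1 first points stderr at the pipe (where stdout currently goes),
# then >>/some/file moves stdout to the file; tee shows stderr on the
# screen and appends it to the same file
some_command 2>&1 >>/some/file | tee -a /some/file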
The only uncertainty left is that I am not sure whether the exact sequence of the messages will be preserved, especially under high load with many messages. You will have to try that. I'll be thankful if you could post a follow-up telling us.
Hi bakunin,
thank you for your notes! I tried your adjustments, but STDOUT is always displayed on screen.
I need this for a cron job. At the moment everything (STDOUT & STDERR) is written to a log, but I want to be informed when an error occurs.
As it is, alister did most of the work and you should thank him.
Quote:
Originally Posted by thuranga
I tried your adjustments, but STDOUT is always displayed on screen.
Hmm, that is unexpected. With the example script I gave (see earlier in this thread) I was able to get the result you wanted, although we learned from alister that it worked more by chance and won't scale.
Quote:
I need this for a cron job. At the moment everything (STDOUT & STDERR) is written to a log, but I want to be informed when an error occurs.
A cron job has no terminal attached to it at all, and output to <stdout> is usually mailed to the owner of the cron job. This is the reason why <stderr> and <stdout> in cron jobs are typically redirected - you don't want to get all these mails.
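That mailing behavior can be turned into exactly what you want: redirect only <stdout> to your log and leave <stderr> alone, so that cron mails you precisely when an error message is produced. A crontab sketch (the paths are placeholders):
Code:
# stdout is appended to the log; anything written to stderr is mailed
# to the job owner by cron
0 2 * * * /path/to/yourscript.sh >>/var/log/yourscript.log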
If you want output to go to the "system console" (don't confuse this with a terminal - the console can be any terminal, but not every terminal is the console), use syslog's facilities instead of plain output. Syslog messages can be configured to go either to the system console or to every terminal. An example of this is the "shutdown" command, which usually prints a "The system is about to go down" message on every terminal. This is done via a syslog facility.
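From a shell script, the usual entry point to syslog is the logger(1) utility; a minimal example (the tag and message text are made up):
Code:
# send a message with facility "user" and severity "err"; whether it
# lands on the console, in a file or on terminals depends on the
# syslog configuration (e.g. a "user.err /dev/console" line in
# /etc/syslog.conf on a classic syslogd)
logger -p user.err -t yourscript "backup failed, see the log for details"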