Capturing Output?


 
# 1  
Old 03-01-2013
Capturing Output?

Hello All,

I'm writing a Bash script that runs a piped command inside a function I wrote, and I can't seem to redirect the
stderr of the first command in the pipe to stdout.

I'm capturing the output in an array "COMMAND_OUTPUT" and splitting it on newlines using "( $(...) )". The extra
( ) around the $( ) splits the output on the IFS and sets the array elements. The command after the semi-colon
then echoes BOTH return codes from the pipeline to stdout using the PIPESTATUS array variable, which works
perfectly for getting the return codes.

Here's the Command:
Code:
IFS="
"

COMMAND_OUTPUT=( $(cat "$SEND_FILE" 2>&1 | send_nsca $IPADDR -p 5667 -to 10 -d , -c $SEND_NSCA_CFG ; echo "EXIT_CODES=${PIPESTATUS[@]}") )

I've tried adding "2>&1" at every point in the command inside the "$(...)", but none of them send the 'cat' command's
stderr to stdout so that the error gets captured in the array "COMMAND_OUTPUT".

If I set $SEND_FILE to a file that doesn't exist, I can see that everything gets stored in the COMMAND_OUTPUT array
except for "cat: : No such file or directory", which gets printed to the screen immediately when that line is executed.


Any ideas on how to redirect the stderr from the 1st pipe command to stdout? Any thoughts would be much appreciated!

Thanks in Advance,
Matt
# 2  
Old 03-01-2013
Run the cat command in a subshell...
# 3  
Old 03-01-2013
Hey Shamrock, thanks for the reply!

Could you elaborate on that? I think there are a few different ways to run something in a subshell, aren't there? Or could you just show what you mean?


Thanks Again,
Matt
# 4  
Old 03-01-2013
A subshell is great for low-effort logging, "(...) >>log_file 2>&1", as it lets you concatenate the stdout and/or stderr of multiple commands, and you can even split stdin between the commands if each one reads whole lines through the fd, like 'line' in the example below. Using 'tail -1' in its place would not work: tail reads through a stdio FILE* buffer, so it would swallow the second line and parts of the following ones, which would be lost when tail exited. 'line', on the other hand, does read(0, buf, 1) over and over until it finds a linefeed, so the rest of the input is still sitting behind the fd, and it is the fd that the next command inherits, not the FILE* buffer.
Code:
$ ps -el |(
line
sort -nr +9
)|pg 
  F S        UID   PID  PPID  C PRI NI             ADDR   SZ            WCHAN TTY       TIME COMD
  1 R          0 25915     1  0 152 20         c0b8ef00 15155                - ?        18:17 mad
401 R          0  2043  2041  0 152 20         a49f4000 13600                - ?        24:59 java
  1 S       6068 10960     1  0 154 20         a6c0ed00 12881           c0e470 ?         1:07 xterm
  1 S       6068 12264     1  1 154 20         a4603f00 10437           c0e470 ?         0:24 xterm
  1 S       6068 11882     1  0 154 20         a733d300 10437           c0e470 ?         0:17 xterm
  1 S          0  1600     1  0 127 20         a2a2a300 5754         b9006f00 ?        20:38 scopeux
401 R      37288  7220  7219  0 152 20         a724d000 5391                - ?        26:51 java
141 R          0  1566     1  0 -16 20         a2a54700 4827                - ?        138:40 midaemon
401 R          0  1274     1  0 152 20         a18f7f00 4362                - ?        27:32 java
  1 S       6068 11490     1  0 154 20         a7f97600 4185           c0e470 ?         0:00 xterm
  1 S       6068 11145     1  0 154 20         a8feba00 4185           c0e470 ?         0:01 xterm

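A minimal sketch of that low-effort logging pattern (the log file name and the commands grouped here are only placeholders):
Code:
# Group several commands in a subshell and send stdout and stderr of the
# whole group to one log file:
(
    date
    cat "$SEND_FILE"
    echo "done"
) >> log_file 2>&1
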
# 5  
Old 03-01-2013
If I move the command into a new function, run only that command inside the function, and then capture the function call
into a variable with the redirection placed on the function call itself, stderr does get redirected to stdout.

Like this:
Code:
execute_command()
{
    local FILE="$1"

    cat "$FILE" | send_nsca $IPADDR -p 5667 -to 10 -d , -c $SEND_NSCA_CFG
    EXIT_CODES="${PIPESTATUS[@]}"

    echo "EXIT_CODES=$EXIT_CODES"
}

IFS="
"

# Redirecting HERE, at Function call will send STDERR to STDOUT and array now holds all messages from the 
# command, including the "EXIT_CODES=1 0"
COMMAND_OUTPUT=( $(execute_command "$SEND_FILE" 2>&1) )

echo -ne "COMMAND_OUTOUT:\n"
echo -ne "${COMMAND_OUTPUT[*]}\n\n"

Done this way, stderr is sent to stdout and the array holds all of the data (including stderr)...
Is this a bad way to do it, or is it OK?


Thanks Again,
Matt

---------- Post updated at 05:11 PM ---------- Previous update was at 05:09 PM ----------

Hey DGPickett, thanks for the reply!

Cool, thanks for the example... Much appreciated!
I was about to head out for the day, I'll probably pick back up on this tomorrow or Monday... Need to clear my head a bit!


Thanks guys for ALL your replies!
I'll be sure to give DGPickett's example a try.


Thanks AGAIN,
Matt
# 6  
Old 03-02-2013
Redirect all stderr to stdout inside the $(...) so that you capture the errors and output of every command inside it; the exec 2>&1 at the start points fd 2 of everything that follows at the stdout being captured by the command substitution.
Code:
COMMAND_OUTPUT=$(exec 2>&1; cat "$SEND_FILE" 2>&1 | send_nsca $IPADDR -p 5667 -to 10 -d , -c $SEND_NSCA_CFG ; echo "EXIT_CODES=${PIPESTATUS[@]}")

# 7  
Old 03-04-2013
Hey Shamrock, thanks for your reply!

Awesome, that did it!

I had to remove the 2nd "2>&1", the one right after the 'cat' command, in order to get its stderr captured by the array
"COMMAND_OUTPUT"... That inner 2>&1 was pointing cat's stderr into the pipe to send_nsca instead of at the captured
stdout, so with it removed the exec 2>&1 routes cat's errors into the command substitution. But other than that it worked PERFECTLY!

Final code to capture both Command's STDERR & STDOUT into an Array:
Code:
IFS="
"

declare -a COMMAND_OUTPUT=( $(exec 2>&1; cat "$SEND_FILE" | send_nsca $IPADDR -p 5667 -to 10 -d , -c $SEND_NSCA_CFG ; echo "EXIT_CODES=${PIPESTATUS[@]}") )
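
As a follow-up, here's one way to pull the two return codes back out of the array afterwards (this part is just a sketch, not from the working script):
Code:
# The last element of COMMAND_OUTPUT is the "EXIT_CODES=..." line that the
# echo appended, e.g. "EXIT_CODES=0 0".
last=$(( ${#COMMAND_OUTPUT[@]} - 1 ))
codes=${COMMAND_OUTPUT[$last]#EXIT_CODES=}

cat_rc=${codes%% *}      # exit status of cat
nsca_rc=${codes##* }     # exit status of send_nsca

if [ "$cat_rc" -ne 0 ] || [ "$nsca_rc" -ne 0 ]; then
    echo "Pipeline failed: cat=$cat_rc send_nsca=$nsca_rc" >&2
fi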

Thanks again everybody for all your suggestions!


Thanks Again Shamrock,
Matt