Capture stdout from multiple processes


 
# 1  
Old 08-12-2011
Capture stdout from multiple processes

I have a number of binaries which I currently have no control over. They write data to stdout. I would like to kick off any number of these binaries and capture and process their stdout. Doing this for one process is straightforward, for example (comments and error checking removed for simplicity):

Code:
pid_t pid;
int   commpipe[2];
char  buf[BUFSIZE];

pipe(commpipe);

pid = fork();

if (pid) {                                   /* parent */
    close(commpipe[1]);                      /* close unused write end */
    dup2(commpipe[0], 0);
    close(commpipe[0]);
    while (fgets(buf, BUFSIZE, stdin))
        processBuffer(buf);
} else {                                     /* child */
    close(commpipe[0]);                      /* close unused read end */
    dup2(commpipe[1], 1);
    setvbuf(stdout, (char *)NULL, _IONBF, 0);
    execl("child", "child", NULL);
}

I would run the above from a separate thread. This works fine for one process, but if I repeat it for a second process, the output of the two processes is intermingled. I can solve the stdin side by removing the dup2() call in the parent and reading directly from the read side of each pipe instead of stdin, but the data is still intermingled since both children share the same stdout.
I can close stdout before forking, but then I only get the output from one process.

The goal is to have a separate thread read the stdout of each process. Any help will be appreciated.

Thanks in advance.

I'm sure there is a way to do it.

Last edited by zxmaus; 08-12-2011 at 11:40 PM.. Reason: added code tags
# 2  
Old 08-12-2011
You didn't say whether or not you need to write to the child's stdin. If not, then use popen() and you'll be able to get what you want. Here's a sample:


Code:
#include <unistd.h>
#include <stdio.h>
#include <errno.h>

int main( int argc, char **argv )
{
    FILE* pipe1;
    FILE* pipe2;
    char    buf[4096];

    pipe1 = popen( "df -h", "r" );   /* execute commands in parallel */
    pipe2 = popen( "ls -al", "r" );

    printf( "output from command 2:\n" );     /* read and print all data from one command */
    while( fgets( buf, sizeof( buf ), pipe2 ) )
        printf( "%s", buf );

    pclose( pipe2 );

    printf( "output from command 1:\n" );      /* read and print from other command */
    while( fgets( buf, sizeof( buf ), pipe1 ) )
        printf( "%s", buf );

    pclose( pipe1 );


    return 0; 
}

The popen() library call creates a one-way pipe, but if you need to write to the child's stdin all may not be lost. If the writes do not depend on reading from stdout and responding to the data, then you can write your input to a file and run the command via popen() with redirection in from the file you created in your programme. Ensure you close the file before invoking popen().


Hope this helps a bit.

Last edited by agama; 08-13-2011 at 02:49 PM.. Reason: fixed typo
# 3  
Old 08-13-2011
Agama

Thanks for the quick reply. No, I do not need to write to the child's stdin. I will give popen() a try; I assumed it would behave the same way. Your code executes the children serially; I will run them in parallel, would that make a difference? In any case I will give this a shot. I probably will not get to it until Monday; I did not expect such a quick response.

Thanks
Dave.
# 4  
Old 08-13-2011
You are most welcome.

The sample actually executes the commands in parallel, but reads the output serially to demonstrate that the output was not intermixed.

Here's a small example of a main that starts a thread for each filename (up to 10) given on the command line. Each thread uses popen() to cat the file (please, no flames about useless uses of cat!) and reads the data, counting the number of times the character 'e' appears in the file. It's a silly (um, useless) programme, because the thread could open and read the file directly, but rather than invoking a ps or netstat command, I wanted something I could repeat to verify that the proper number of e's from each file were being counted (to prove no mixing of output between the commands).

The function invoked as the thread sleeps to demonstrate that the threads are indeed created in parallel -- all "create" messages from the main should print before any thread actually announces that it's running the command.

Hope this helps to get you going.

Code:
#include <unistd.h>
#include <stdlib.h>
#include <stdio.h>
#include <errno.h>
#include <pthread.h>
#include <string.h>



/*
    started as a thread. creates a child process via popen() and reads
    the child's output counting the number of target characters (global)
    that were in the output. When done, the count is printed on stdout.
    A delay of 2s is imposed to demonstrate that the main has started
    all threads in parallel, otherwise it is unnecessary.
*/
void *reader( void *data )
{
    char    target = 'e';       /* count these from the command's output */
    char    *cmd;               /* command to execute */
    FILE    *rpipe;
    char    buf[4096];          /* read buffer */
    int     tcount = 0;         /* count of target */
    char    *cp;                /* pointer as we walk the buffer looking for target */

    cmd = (char *) data;

    sleep( 2);
    fprintf( stderr, "starting the command: %s\n", cmd );

    rpipe = popen( cmd, "r" );                      /* start the child */
    if( rpipe == NULL )                             /* bail out if popen failed */
    {
        fprintf( stderr, "popen failed for: %s\n", cmd );
        free( data );
        pthread_exit( 0 );
    }

    while( fgets( buf, sizeof( buf ), rpipe ) )     /* read all of the child's stdout */
    {
        cp = buf;
        while( (cp = strchr( cp, target )) )        /* count number of target characters in this buffer */
        {
            tcount++;
            cp++;
        }
    }

    pclose( rpipe );

    printf( "output from `%s` contained %d '%c' characters\n", cmd, tcount, target );

    free( data );
    pthread_exit( 0 );

    return NULL;            /* shouldn't matter, but keeps compilers happy */
}

int main( int argc, char **argv )
{
    pthread_t tids[10];
    char    cbuf[1024];
    int     i;

    if( argc < 2 || argc > 11 )
    {
        fprintf( stderr, "usage: %s filename1 [filename2... filename10]\n", argv[0] );
        exit( 1 );
    }


    for( i = 1; i < argc; i++ )         /* start all threads to process all files in parallel */
    {
        snprintf( cbuf, sizeof( cbuf ), "cat %s", argv[i] );
        fprintf( stderr, "starting thread: %s\n", cbuf );
        pthread_create( &tids[i-1], NULL, reader, (void *) strdup( cbuf ) );        
                                                  /* thread must free dup to avoid leak */
    }

    fprintf( stderr, "all threads started, main thread waits....\n" );

    for( i = 0; i < argc-1; i++ )                       /* wait for all threads to finish */
        pthread_join( tids[i], NULL );

    fprintf( stderr, "all threads finished\n" );

    return 0;
}

# 5  
Old 08-13-2011
Thanks again,

Yes, this is exactly what I'm looking for. I'll give it a shot Monday; if I can do it sooner, I will.

Thanks again
# 6  
Old 08-15-2011
Agama

I gave it a try and it seems to be working as you said it would. I tested it with two binaries and none of the data was intermingled. I will try with more once I get the remaining binaries.

Thanks again
# 7  
Old 09-04-2011
Your example is in C. If that is not a requirement, have a look at GNU Parallel which is made for running programs in parallel without having the output mix.

Watch the intro video: Part 1: GNU Parallel script processing and execution - YouTube