Full Discussion: gerp data from several files
Post 302080568 by reborg in Shell Programming and Scripting, Wednesday 19th of July 2006, 04:09:20 PM
Jim actually explained the reasons, which were to deal with filenames containing spaces; he used the while loop in order to be able to put quotes around the filename.

Try your example with filenames that contain spaces and you will understand; when there are no spaces, the xargs method is much more efficient.
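
For illustration, here is a minimal sketch of the two approaches being contrasted ("pattern" is a placeholder, not the OP's actual search string):

Code:
# xargs builds long argument lists, so grep is started only a few times,
# but it splits its input on whitespace: "my file.txt" becomes two arguments.
find /onedirectory/somewhere -name "v*" | xargs grep "pattern"

# The while loop starts one grep per file, but quoting "$file" keeps
# names containing spaces intact.
find /onedirectory/somewhere -name "v*" | while read -r file
do
    grep "pattern" "$file"
done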

However, I believe the OP was asking for something more like this:

Code:
# For each file matching v*, emit the filename, the /users/test/db line,
# the memory-leak line and the error line as one tab-separated record.
find /onedirectory/somewhere -name "v*" -exec \
    awk '/\/users\/test\/db/ { printf "%s\t%s\t", FILENAME, $0 }
         /memory leak =/     { printf "%s\t", $0 }
         /error =/           { print }' {} \; > reportFile
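
If the directory can contain filenames with spaces and GNU find/xargs are available (an assumption; the OP's platform is not stated), the same awk program can be fed many files per invocation instead of running one -exec per file; a sketch:

Code:
# -print0 and -0 pass NUL-terminated names, so spaces in filenames are safe,
# and awk is started only a few times rather than once per file.
find /onedirectory/somewhere -name "v*" -print0 | \
    xargs -0 awk '/\/users\/test\/db/ { printf "%s\t%s\t", FILENAME, $0 }
                  /memory leak =/     { printf "%s\t", $0 }
                  /error =/           { print }' > reportFile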

 
