Hi,
I want to pull all the files that match a pattern like the input file given. In a directory there are, say, 5000 files, of which 1000 have text starting with 'submit file',
and from those files I want to get the output I mentioned earlier.
First filter the files whose text starts with 'submit file' (out of the 5000, I would have 1000 such input files), then read only those files as input and produce the output in the required format I mentioned earlier. So basically, I want to filter the files, read only those as input, and get the required output. Thanks.
This is very convoluted logic. Using awk to read all 5000 of your files just to get a list of files containing a certain string, and then using that list as arguments to another awk script, is grossly inefficient: you have to read each of your 5000 files once and then read the selected 1000 files a second time. There is very seldom a need to invoke awk twice, but if you must, you have to actually invoke awk twice, using the file names produced by the first awk as operands to the second, as in:
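The code blocks from the original post were not preserved here. A minimal sketch of the two-invocation shape being described (the 'submit file' first-line test and the stand-in processing program are assumptions; `nextfile` is an awk extension supported by gawk, mawk, and BSD awk):

```shell
# First awk: print the name of each file whose first line starts with
# 'submit file' (nextfile then skips the rest of that file).
# Its output becomes the operand list of the second awk, which stands
# in for the real processing script and just counts the lines it sees.
awk '{ lines++ } END { print lines+0 }' $(
    awk 'FNR == 1 && /^submit file/ { print FILENAME }
         { nextfile }' ./*
)
```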
or:
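An equivalent variant (again a sketch, not the original code) feeds the file list through a pipe to `xargs` instead of command substitution; the second awk is still only a stand-in processor:

```shell
# Filter pass prints matching file names; xargs hands them to the
# second awk as operands.
awk 'FNR == 1 && /^submit file/ { print FILENAME }
     { nextfile }' ./* |
xargs awk '{ lines++ } END { print lines+0 }'
```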
But, as I said before, putting it all in a single awk script would be much more efficient:
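The single-script version was also lost; the idea it describes can be sketched like this (the per-line rule is a placeholder for whatever processing the selected files actually need):

```shell
# Single pass over all files: on the first line of each file, decide
# whether to skip the whole file; matching files fall through to the
# processing rule below.
awk '
FNR == 1 && !/^submit file/ { nextfile }  # not a "submit file": skip it
{ print FILENAME ": " $0 }                # placeholder processing
' ./*
```

Each file is read at most once, and non-matching files are abandoned after their first line, which is the efficiency gain over the two-pass approach.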
This User Gave Thanks to Don Cragun For This Post:
Hello All,
I need some assistance to extract a piece of information from a huge file.
The file is like this one :
database information
ccccccccccccccccc
ccccccccccccccccc
ccccccccccccccccc
ccccccccccccccccc
os information
cccccccccccccccccc
cccccccccccccccccc... (2 Replies)
Hi,
Following is sample portion of the file;
<JDBCConnectionPool DriverName="oracle.jdbc.OracleDriver"
MaxCapacity="10" Name="MyApp_DevPool"
PasswordEncrypted="{3DES}7tXFH69Xg1c="
Properties="user=MYAPP_ADMIN" ShrinkingEnabled="false"
... (12 Replies)
Good evening! I am trying to make a shell script to parse a log file and show only the required information.
The log file has 44 fields and a lot of lines, with the columns separated by ":".
log file is like:
first_1:3:4:5:6:1:3:4:5:something:notinterested
second_2:3:4:3:4:2
first_1:3:4:6:6:7:8
I am interested... (3 Replies)
Hi to all,
I got this content/pattern from file http.log.20110808.gz
mail1 httpd: Account Notice: close igchung@abc.com 2011/8/7 7:37:36 0:00:03 0 0 1
mail1 httpd: Account Information: login sastria9@abc.com proxy sid=gFp4DLm5HnU
mail1 httpd: Account Notice: close sastria9@abc.com... (16 Replies)
I'm still new to bash scripting. I have a log file and I want to extract the items within the last 5 days, and also within the last 10 hours.
The log file is like this: it has 14000 items, from March 2002 to January 2003.
awk '{print $4}' *.log | sort | uniq -c | sort -g | tail -10
but... (14 Replies)
Hey ShamRock,
If you can help me with this difficult task, it will save my day.
Logs :
==================================================================================================================
... (4 Replies)
Hi, I have a file like this:
<Iteration>
<Iteration_iter-num>3</Iteration_iter-num>
<Iteration_query-ID>lcl|3_0</Iteration_query-ID>
<Iteration_query-def>G383C4U01EQA0A length=197</Iteration_query-def>
<Iteration_query-len>197</Iteration_query-len>
... (9 Replies)
Hello!
I need help :) I have a file like this:
AA BC FG
RF TT GH
DD FF HH
(a number of rows and three columns) and I want to put the letters of each column into a variable, step by step, in order to give them as input to another script. So I would like to obtain:
for the 1st loop:... (11 Replies)
Gents,
If possible, please help.
I have a big file (example attached) that contains exactly the same value in one column, but from columns 2 to 6 these values differ. I would like to compile all columns for all records, like the example attached, in .csv format (output.rar). The last column in the... (11 Replies)
I need help extracting transcript information from a GFF3 file.
Here is the input
Chr01 JGI gene 82773 86941 . - . ID=Potri.001G000900;Name=Potri.001G000900
Chr01 JGI mRNA 82793 86530 . - . ID=PAC:27047814;Name=Potri.001G000900.1;pacid=27047814;longest=1;Parent=Potri.001G000900... (6 Replies)