[Solved] Find Specific records from file and add totals into variables


 
# 1  
Old 08-05-2009
[Solved] Find Specific records from file and add totals into variables

Hi Everyone,

I am working on a shell script that finds specific records in a data file, adds their totals into variables, and prints them. You can find a sample data file below for clarification.

Sample Data File:

PXSTYL00__20090803USA
CHCART00__20090803IND
SSTRAN00__20090803USA
Records Sent: 100 Sum1: 10 20090803 1234567890 1234567890 123456789 123456789 123456789
Records Sent: 200 Sum1: 20 20090803 1234567890 1234567890 123456789 123456789 123456789
Records Sent: 300 Sum1: 30 20090803 1234567890 1234567890 123456789 123456789 123456789

In the above sample data file, I want to read all of the records containing the words "Records Sent" and then compute the totals and sums across all of those records together.

The output should look like this:

Total Records Sent: 100+200+300 = 600
Total Sum : 10+20+30 = 60

I tried the following code:

Code:
total = 0
sum = 0
grep "Records Sent:" filename > filename.tmp
cat filename.tmp |\
while read line
do
tot = `$line | cut -f3 -d " "`
su = `$line | cut -f5 -d " "`
total ='expr $total + $total`
sum =`expr $sum +su`
done
echo $total
echo $sum

I am having some issues while running this code:
When I use while read line, the loop goes line by line, but it really needs to go record by record, because a record that contains "Records Sent:" can span two lines.
I also got an error saying "cannot create", and I have no idea why it appears.
Can anybody write the above script in a different way? Please advise.

Best Regards
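
For reference, here is one possible repair of that loop. This is only a minimal sketch, assuming the record count is always field 3 and the sum always field 5 of the matching lines, exactly as in the sample data above.

Code:
total=0
summ=0
# keep only the lines we care about
grep "Records Sent:" filename > filename.tmp
# redirect into the loop instead of piping; in many shells a piped
# while loop runs in a subshell and the totals would be lost after done
while read line
do
    tot=`echo "$line" | cut -f3 -d " "`    # field 3: record count
    suu=`echo "$line" | cut -f5 -d " "`    # field 5: sum
    total=`expr $total + $tot`
    summ=`expr $summ + $suu`               # expr handles whole numbers only
done < filename.tmp
echo "Total Records Sent: $total"
echo "Total Sum: $summ"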
# 2  
Old 08-05-2009
Code:
total=0
summ=0
grep "Records Sent:" filename | while read line
do
tot=`$line | cut -f3 -d " "`
suu=`$line | cut -f5 -d " "`
total=`expr $total + $tot`
summ=`expr $summ + suu`
done
echo "Total = $total"
echo "Sum = $summ"

1. Removed the spaces before and after every "="
2. Renamed "su" to "suu" and "sum" to "summ" (su and sum are existing command names, so the new names avoid any confusion)
3. Fixed the typo in total ='expr ... ; it should be total=`expr
4. Fixed the typo in the addition; it should be $total + $tot`

Last edited by edidataguy; 08-05-2009 at 01:26 AM..
# 3  
Old 08-05-2009
Code:
nawk '$1=="Records" && $2=="Sent:" && $4="Sum1:" {sent+=$3;sum+=$5} END{print sent" "sum}'

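With the sample data from the first post, that one-liner should print the two totals on one line, roughly like this (on systems without nawk, plain awk or gawk should behave the same here):

Code:
$ nawk '$1=="Records" && $2=="Sent:" && $4=="Sum1:" {sent+=$3; sum+=$5} END {print sent" "sum}' filename
600 60
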
# 4  
Old 08-05-2009
I tried the following code, but I am getting the errors below.

Code:
total=0
summ=0
grep "Records Sent:" filename | while read line
do
tot=`$line | cut -f3 -d " "`
suu=`$line | cut -f5 -d " "`
total=`expr $total + $tot`
summ=`expr $summ + suu`
done
echo "Total = $total"
echo "Sum = $summ"

Errors:
expr: non-numeric argument
expr: syntax error
expr: syntax error
# 5  
Old 08-05-2009
Quote:
Originally Posted by veeru
Out Put Should be like this

Total Records Sent: 100+200+300 = 600
Total Sum : 10+20+30 = 60
Based on your data sample:
Code:
# awk '/Records Sent/{a[$1FS$2]+=$3;b[$4]+=$5}END{for(i in a)print v,i,a[i];for(i in b)print v,i,b[i]}' v="Total" file
Total Records Sent: 600
Total Sum1: 60
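
If the output really has to follow the "100+200+300 = 600" shape asked for in the first post, a hedged variation of the same awk idea could build the expression strings as it goes; this is only a sketch against the sample data shown above:

Code:
awk '/Records Sent:/ {
        sent += $3; sum += $5                  # running totals
        se = (se == "" ? $3 : se "+" $3)       # builds "100+200+300"
        su = (su == "" ? $5 : su "+" $5)       # builds "10+20+30"
    }
    END {
        print "Total Records Sent: " se " = " sent
        print "Total Sum : " su " = " sum
    }' file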

# 6  
Old 08-05-2009
I don't want to rewrite everything, because I have a lot of other validations in the same script. The code below works for the total but not for the sum, because some of my sums contain decimals.

Code:
total=0
summ=0
grep "Records Sent:" filename | while read line
do
tot=`$line | cut -f3 -d " "`
suu=`$line | cut -f5 -d " "`
total=`expr $total + $tot`
summ=`expr $summ + $suu`
done
echo "Total = $total"
echo "Sum = $summ"

The above code gets the total right but not the sum, because of the decimal values. Please let me know how to get rid of the errors below.

Errors:

expr: non-numeric argument
expr: syntax error
expr: syntax error
# 7  
Old 08-05-2009
Hi.

Try using bc instead of expr:

Code:
summ=$(echo "$summ + $suu" | bc)

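Putting that suggestion together with the loop from the earlier posts, a sketch of the whole thing might look like the following. It assumes the decimals only appear in the Sum1 field, the read variable names are just placeholders, and it redirects into the loop so the totals survive past done:

Code:
total=0
summ=0
grep "Records Sent:" filename > filename.tmp
while read f1 f2 tot f4 suu rest      # field 3 -> tot, field 5 -> suu
do
    total=`expr $total + $tot`              # whole-number record counts
    summ=$(echo "$summ + $suu" | bc)        # bc copes with decimal sums
done < filename.tmp
echo "Total = $total"
echo "Sum = $summ"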