Report from Log file using awk (Post 302898569 by vgersh99, 22 April 2014)
Here's something to start with:
awk -f vasu.awk myFile, where vasu.awk is:
Code:
BEGIN {
  FS=";"                     # input fields are separated by semicolons
  OFS="\t"

  # field positions in the log
  FLDserv=5                  # service name
  FLDpart=6                  # partner
  FLDtime=7                  # response time
  FLDerr=8                   # error code

  # per-service output row: service, partner list, bad count, good count, error
  fmt="%-15s%-40s%-10d%-10d%s\n"
}
# grab the date (everything before the first space in field 1) from the first data line
FNR==2 {date=substr($1, 1, index($1, " ")-1)}
# skip the header line, process every data line
FNR>1 {
  # build a comma-separated list of partners seen for each service
  servA[$FLDserv]=(servA[$FLDserv])? servA[$FLDserv] "," $FLDpart : $FLDpart

  # count responses below 1000 as good, the rest as bad
  if ($FLDtime < 1000)
     goodA[$FLDserv]++
  else
     badA[$FLDserv]++

  # remember the last non-numeric error code for the service, tagged with its partner
  if ($FLDerr ~ /[^0-9]/)
     errorA[$FLDserv]= $FLDerr "(" $FLDpart ")"
}
END {
   print date
   printf("%-15s%-40s%-10s%-10s%s\n", "Service;", "Partner;", "badCount;", "goodCount;", "ERROR;")
   for (i in servA) {
     printf(fmt, i, servA[i], badA[i], goodA[i], errorA[i])
   }
}
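
For illustration only, here is a made-up input in the layout the script assumes: semicolon-separated fields, a header on line 1, the date at the start of field 1, service in field 5, partner in field 6, response time in field 7, and error code in field 8. The service/partner names and column layout are invented for this sketch; the real log from the original thread may differ.
Code:
Date Time;TxnNo;RecID;Status;Service;Partner;Time;ErrorCode
2014-04-22 10:01:02;Txn1;rec1;OK;PaymentSvc;PartnerA;850;0
2014-04-22 10:01:05;Txn2;rec2;OK;PaymentSvc;PartnerB;1200;TIMEOUT
2014-04-22 10:01:09;Txn3;rec3;OK;BillingSvc;PartnerC;300;0

Running awk -f vasu.awk myFile on that sample should print something like the following (the order of the service rows depends on your awk, since for (i in servA) iterates in no particular order, and the exact spacing comes from the printf formats):
Code:
2014-04-22
Service;       Partner;                                badCount; goodCount;ERROR;
PaymentSvc     PartnerA,PartnerB                       1         1         TIMEOUT(PartnerB)
BillingSvc     PartnerC                                0         1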
