Script (ksh) to get data in every 30 mins interval for the given date


 
# 1  
Old 10-04-2013
Script (ksh) to get data in every 30 mins interval for the given date

Hello,

Since I'm new to shell scripting, I had a hard time sorting out this problem.

I have a log file from a utility that records, with a timestamp, when batch jobs complete successfully. Part of the log file is shown below.

Code:
2013/03/07 00:13:50 [notice] Apache/1.3.29 (Unix) configured -- resuming normal operations
2013/03/07 00:23:50 [info] Server built: Feb 27 2004 13:56:37
2013/03/07 00:39:40 [notice] Accept mutex: sysvsem (Default: sysvsem)
2013/03/07 01:05:49 [info] [client 64.242.88.10] Batch 126 was successful
2013/03/07 01:15:19 [info] [client 64.242.88.10] Batch 272 was successful
2013/03/07 01:55:29 [info] [client 64.242.88.10] Batch 353 was successful
2013/03/07 02:05:09 [info] [client 64.242.88.10] Batch 241 was successful
2013/03/07 02:35:49 statistics: Use of uninitialized value in concatenation (.) or string at /home/httpd/twiki/lib/TWiki.pm line 528.
2013/03/07 04:45:41 statistics: Can't create file /home/httpd/twiki/data/Main/WebStatistics.txt - Permission denied
2013/03/07 05:05:49 [info] [client 64.242.88.10] Batch 671 was successful
2013/03/07 05:46:41 [info] [client 64.242.88.10] Batch 251 was successful
2013/03/07 06:35:26 [info] [client 64.242.88.10] Batch 181 was successful
2013/03/07 08:05:49 [info] [client 64.242.88.10] Batch 389 was successful
2013/03/07 10:05:29 [info] [client 64.242.88.10] Batch 911 was successful
2013/03/07 10:13:42 [info] [client 64.242.88.10] Batch 681 was successful
2013/03/07 10:45:33 [info] [client 64.242.88.10] Batch 451 was successful
2013/03/07 10:49:51 [info] [client 64.242.88.10] Batch 675 was successful
2013/03/07 11:05:29 [info] [client 64.242.88.10] Batch 439 was successful
2013/03/07 12:55:19 [info] [client 64.242.88.10] Batch 678 was successful
2013/03/07 13:05:33 [info] [client 64.242.88.10] Batch 557 was successful
2013/03/07 13:47:12 [info] [client 64.242.88.10] Batch 881 was successful
2013/03/07 14:09:16 [info] [client 64.242.88.10] Batch 115 was successful
2013/03/07 14:15:31 [info] [client 64.242.88.10] Batch 612 was successful
2013/03/07 14:29:19 [info] [client 64.242.88.10] Batch 111 was successful
2013/03/07 14:35:50 [info] [client 64.242.88.10] Batch 971 was successful
2013/03/07 14:57:49 [info] [client 64.242.88.10] Batch 347 was successful
2013/03/07 15:19:55 [info] [client 64.242.88.10] Batch 824 was successful
2013/03/07 15:28:51 [info] [client 64.242.88.10] Batch 908 was successful
2013/03/07 15:31:44 [info] [client 64.242.88.10] Batch 113 was successful
2013/03/07 15:47:41 [info] [client 64.242.88.10] Batch 990 was successful
2013/03/07 15:57:41 [info] [client 64.242.88.10] Batch 290 was successful
2013/03/07 16:05:49 [info] [client 64.242.88.10] Batch 120 was successful
2013/03/07 16:22:18 [error] [client 24.70.56.49] File does not exist: /home/httpd/twiki/view/Main/WebHome
2013/03/07 16:25:39 [info] [client 64.242.88.10] Batch 150 was successful
2013/03/07 16:29:49 [info] [client 64.242.88.10] Batch 145 was successful
2013/03/07 16:47:41 [info] [client 64.242.88.10] Batch 131 was successful
2013/03/07 16:51:19 [info] [client 64.242.88.10] Batch 481 was successful
2013/03/07 17:01:42 [info] [client 64.242.88.10] Batch 676 was successful
2013/03/07 17:23:14 [info] [client 64.242.88.10] Batch 121 was successful
2013/03/07 17:37:12 [info] [client 64.242.88.10] Batch 439 was successful
2013/03/07 17:43:39 [info] [client 64.242.88.10] Batch 336 was successful
2013/03/07 18:21:42 [info] [client 64.242.88.10] Batch 772 was successful
2013/03/07 18:25:11 [info] [client 64.242.88.10] Batch 154 was successful
2013/03/07 18:26:26 [info] [client 64.242.88.10] Batch 189 was successful
2013/03/07 19:01:09 [info] [client 64.242.88.10] Batch 346 was successful
2013/03/07 19:11:28 [info] [client 64.242.88.10] Batch 678 was successful
2013/03/07 19:19:29 [info] [client 64.242.88.10] Batch 814 was successful
2013/03/07 19:31:41 [info] [client 64.242.88.10] Batch 114 was successful
2013/03/07 19:33:16 [info] [client 64.242.88.10] Batch 561 was successful
2013/03/07 19:41:19 [info] [client 64.242.88.10] Batch 881 was successful
2013/03/07 19:53:17 [info] [client 64.242.88.10] Batch 445 was successful
2013/03/07 19:57:49 [info] [client 64.242.88.10] Batch 321 was successful
2013/03/07 20:01:56 [info] [client 64.242.88.10] Batch 890 was successful
2013/03/07 21:11:17 [info] [client 64.242.88.10] Batch 665 was successful
2013/03/07 21:19:55 [info] [client 64.242.88.10] Batch 340 was successful
2013/03/07 21:29:29 [info] [client 64.242.88.10] Batch 149 was successful
2013/03/07 21:31:49 [info] [client 64.242.88.10] Batch 213 was successful
2013/03/07 21:43:19 [info] [client 64.242.88.10] Batch 522 was successful
2013/03/07 21:49:46 [info] [client 64.242.88.10] Batch 450 was successful
2013/03/07 22:37:31 [info] [client 64.242.88.10] Batch 661 was successful
2013/03/07 22:39:11 [info] [client 64.242.88.10] Batch 542 was successful
2013/03/07 23:05:49 [info] [client 64.242.88.10] Batch 598 was successful
2013/03/07 23:41:14 [info] [client 64.242.88.10] Batch 811 was successful
2013/03/07 24:12:42 [info] [client 64.242.88.10] Batch 429 was successful
2013/03/07 24:22:09 [info] [client 64.242.88.10] Batch 238 was successful
2013/03/07 24:29:01 [info] [client 64.242.88.10] Batch 987 was successful
2013/03/07 24:44:43 [info] [client 64.242.88.10] Batch 144 was successful
.
.
.
2013/03/08 ------------------------------
2013/03/09 ------------------------------
2013/03/10 ------------------------------


Kindly help me with a ksh script to read this log file and, for a given date, report how many batches were successful in each 30-minute interval.

Desired output:
Code:
Enter the date:
2013/03/07

Batches that were successful on 2013/03/07 between 00:00:00 and 00:29:59 : 4
Batches that were successful on 2013/03/07 between 00:30:00 and 00:59:59 : 2
.
.
.
.
.
Batches that were successful on 2013/03/07 between 23:29:59 and 23:59:59 :4


Last edited by Don Cragun; 10-04-2013 at 04:51 AM.. Reason: Fix ending CODE tag
# 2  
Old 10-04-2013
Quote:
Originally Posted by rpm120
I have a log file from a utility that records, with a timestamp, when batch jobs complete successfully. [...] Kindly help me with a ksh script to read this log file and, for a given date, report how many batches were successful in each 30-minute interval.


Code:
#!/bin/ksh
# Count the successful batches in the log and mail a one-line summary every 30 minutes.
RECIPIENTS="your email"
while true
do
        count=$(grep -c "was successful" filename)
        Date=$(awk '{print $1; exit}' filename)
        start_time=$(sed -n '1p' filename | awk '{print $2}')
        End_time=$(sed -n '$p' filename | awk '{print $2}')
        echo "Batches that were successful on $Date between $start_time and $End_time : $count" > file.txt
        mailx -s "Current status of successful batches" $RECIPIENTS < file.txt
        sleep 1800
done

Run the script in the background, e.g. nohup ksh scriptname &, and it will repeat every 30 minutes and send you an email.
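For example (a minimal sketch; batch_report.ksh, report.out, and report.pid are placeholder names):
Code:
nohup ksh batch_report.ksh > report.out 2>&1 &   # start in the background; survives logout
echo $! > report.pid                             # remember its process id
ps -p "$(cat report.pid)"                        # check that it is still running
kill "$(cat report.pid)"                         # stop it when it is no longer needed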
# 3  
Old 10-04-2013
Could you please check the timestamps in your file?
Code:
2013/03/07 24:12:42 [info] [client 64.242.88.10] Batch 429 was successful
2013/03/07 24:22:09 [info] [client 64.242.88.10] Batch 238 was successful
2013/03/07 24:29:01 [info] [client 64.242.88.10] Batch 987 was successful
2013/03/07 24:44:43 [info] [client 64.242.88.10] Batch 144 was successful

I think hour 24 is invalid, since your hours start at 00.
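If you want to spot such lines, something like this should work (a minimal sketch, assuming the log layout shown above and a file named log):
Code:
# print every line whose hour is outside 00-23 or whose minute/second is outside 00-59
awk -F '[ :]' 'NF > 2 && ($2 + 0 > 23 || $3 + 0 > 59 || $4 + 0 > 59)' log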
# 4  
Old 10-04-2013
I agree with pravin27 that your input data has out-of-range timestamps. It is also weird that you want the final entry in your output to have timestamps 23:29:59 and 23:59:59 rather than 23:30:00 and 23:59:59. But, following your general pattern and ignoring the out-of-range data, the following script seems to do what you want:
Code:
#!/bin/ksh
printf "Enter the date (YYYY/MM/DD): "
read dd
awk -F '[ :]' -v dd="$dd" '
BEGIN { fmt = "Batches that were successful on %s between %02d:%02d:00 " \
                "and %02d:%02d:59 : %d\n"
}
$1 == dd && $NF == "successful" { s[$2 + 0, $3 > 29]++ }
END {   for(h = 0; h < 24; h++)
                 for(m = 0; m < 2; m++)
                        printf(fmt, dd, h, m * 30, h, m * 30 + 29, s[h, m])
}' log

producing the output:
Code:
Enter the date (YYYY/MM/DD): 2013/03/07
Batches that were successful on 2013/03/07 between 00:00:00 and 00:29:59 : 0
Batches that were successful on 2013/03/07 between 00:30:00 and 00:59:59 : 0
Batches that were successful on 2013/03/07 between 01:00:00 and 01:29:59 : 2
Batches that were successful on 2013/03/07 between 01:30:00 and 01:59:59 : 1
Batches that were successful on 2013/03/07 between 02:00:00 and 02:29:59 : 1
Batches that were successful on 2013/03/07 between 02:30:00 and 02:59:59 : 0
Batches that were successful on 2013/03/07 between 03:00:00 and 03:29:59 : 0
Batches that were successful on 2013/03/07 between 03:30:00 and 03:59:59 : 0
Batches that were successful on 2013/03/07 between 04:00:00 and 04:29:59 : 0
Batches that were successful on 2013/03/07 between 04:30:00 and 04:59:59 : 0
Batches that were successful on 2013/03/07 between 05:00:00 and 05:29:59 : 1
Batches that were successful on 2013/03/07 between 05:30:00 and 05:59:59 : 1
Batches that were successful on 2013/03/07 between 06:00:00 and 06:29:59 : 0
Batches that were successful on 2013/03/07 between 06:30:00 and 06:59:59 : 1
Batches that were successful on 2013/03/07 between 07:00:00 and 07:29:59 : 0
Batches that were successful on 2013/03/07 between 07:30:00 and 07:59:59 : 0
Batches that were successful on 2013/03/07 between 08:00:00 and 08:29:59 : 1
Batches that were successful on 2013/03/07 between 08:30:00 and 08:59:59 : 0
Batches that were successful on 2013/03/07 between 09:00:00 and 09:29:59 : 0
Batches that were successful on 2013/03/07 between 09:30:00 and 09:59:59 : 0
Batches that were successful on 2013/03/07 between 10:00:00 and 10:29:59 : 2
Batches that were successful on 2013/03/07 between 10:30:00 and 10:59:59 : 2
Batches that were successful on 2013/03/07 between 11:00:00 and 11:29:59 : 1
Batches that were successful on 2013/03/07 between 11:30:00 and 11:59:59 : 0
Batches that were successful on 2013/03/07 between 12:00:00 and 12:29:59 : 0
Batches that were successful on 2013/03/07 between 12:30:00 and 12:59:59 : 1
Batches that were successful on 2013/03/07 between 13:00:00 and 13:29:59 : 1
Batches that were successful on 2013/03/07 between 13:30:00 and 13:59:59 : 1
Batches that were successful on 2013/03/07 between 14:00:00 and 14:29:59 : 3
Batches that were successful on 2013/03/07 between 14:30:00 and 14:59:59 : 2
Batches that were successful on 2013/03/07 between 15:00:00 and 15:29:59 : 2
Batches that were successful on 2013/03/07 between 15:30:00 and 15:59:59 : 3
Batches that were successful on 2013/03/07 between 16:00:00 and 16:29:59 : 3
Batches that were successful on 2013/03/07 between 16:30:00 and 16:59:59 : 2
Batches that were successful on 2013/03/07 between 17:00:00 and 17:29:59 : 2
Batches that were successful on 2013/03/07 between 17:30:00 and 17:59:59 : 2
Batches that were successful on 2013/03/07 between 18:00:00 and 18:29:59 : 3
Batches that were successful on 2013/03/07 between 18:30:00 and 18:59:59 : 0
Batches that were successful on 2013/03/07 between 19:00:00 and 19:29:59 : 3
Batches that were successful on 2013/03/07 between 19:30:00 and 19:59:59 : 5
Batches that were successful on 2013/03/07 between 20:00:00 and 20:29:59 : 1
Batches that were successful on 2013/03/07 between 20:30:00 and 20:59:59 : 0
Batches that were successful on 2013/03/07 between 21:00:00 and 21:29:59 : 3
Batches that were successful on 2013/03/07 between 21:30:00 and 21:59:59 : 3
Batches that were successful on 2013/03/07 between 22:00:00 and 22:29:59 : 0
Batches that were successful on 2013/03/07 between 22:30:00 and 22:59:59 : 2
Batches that were successful on 2013/03/07 between 23:00:00 and 23:29:59 : 1
Batches that were successful on 2013/03/07 between 23:30:00 and 23:59:59 : 1

when the user enters 2013/03/07 in response to the prompt for the date and the file named log contains your sample input data.
# 5  
Old 10-04-2013
Quote:
Originally Posted by Don Cragun
Thanks a lot for your solution. Apologies for the typo.

Can you please explain the code so that I can build on it, for example if I want to match some different patterns or strings?
# 6  
Old 10-04-2013
Code:
#!/bin/ksh
printf "Enter the date (YYYY/MM/DD): "
read dd
awk -F '[ :]' -v dd="$dd" '
BEGIN { fmt = "Batches that were successful on %s between %02d:%02d:00 " \
                "and %02d:%02d:59 : %d\n"
}
$1 == dd && $NF == "successful" { s[$2 + 0, $3 > 29]++ }
END {   for(h = 0; h < 24; h++)
                 for(m = 0; m < 2; m++)
                        printf(fmt, dd, h, m * 30, h, m * 30 + 29, s[h, m])
}' log

1st line: Use the Korn shell to interpret this script.
2nd line: Print a prompt asking the user to enter a date.
3rd line: Read the date the user enters and save it in the shell variable named dd.
4th line: Invoke the awk utility, telling it to use space and colon characters as field delimiters, and define the awk variable dd to have the same value as the shell variable dd.
5th, 6th, and 7th lines: Before any input is read by the awk script, define the awk variable fmt to be the format string we will use to print results at the end.
8th line: If the 1st field on an input line is the same string as the awk variable dd and the last field on the line is the string "successful", increment the element of the array s[] indexed by two subscripts: the 2nd field on the line (the hour, with any leading zero removed), and 0 if the 3rd field on the line (the minute) is 29 or less, or 1 if the minute is 30 or more.
9th, 10th, and 11th lines: After all input lines have been read, loop through every half hour, with h set to the hour of the day and m set to the half hour within the hour, and print the user-entered date (dd), the hour as a two-digit string with leading zero fill, the starting minute as a two-digit string with leading zero fill, the hour again, the ending minute, and the number of times the pattern matched on the 8th line was found within that half-hour period.
12th line: Terminate the awk script and specify that the input to be processed is contained in a file named log.
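To see what the 8th line does with a single entry, here is a minimal sketch using the 14:35:50 sample line from your log:
Code:
echo '2013/03/07 14:35:50 [info] [client 64.242.88.10] Batch 971 was successful' |
awk -F '[ :]' '{ print "hour =", $2 + 0, "| half =", ($3 > 29) }'
# prints: hour = 14 | half = 1
# so this entry increments s[14, 1], the 14:30:00-14:59:59 bucket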

Last edited by Don Cragun; 10-04-2013 at 09:57 AM.. Reason: Fix line number typo
# 7  
Old 10-04-2013


Quote:
Originally Posted by Don Cragun
Many thanks for your explanation.

I was trying out the piece of code below:


Code:
awk -v rd="2013/03/07" -v st="00:00:00" -v et="00:29:59" '$1==rd && $2>=st && $2<=et' log | grep -c "successful"

I wanted to make the script generic, so that the count is taken wherever the pattern "successful" appears on a line. I want to use the same script for two more different logs, in which the success message is not necessarily in the last field (so $NF == "successful" will not work there).

But the problem I faced was that I could not increment the timestamp values st and et by 30 minutes, or find a way to repeat that piece of code for every 30-minute interval.

Can you please suggest a solution for that?
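One possible way around that (a minimal sketch, untested against your other logs) is to keep the whole 30-minute loop inside awk, as in the earlier script, and only relax the match so that "successful" may appear anywhere on the line:
Code:
#!/bin/ksh
printf "Enter the date (YYYY/MM/DD): "
read dd
awk -F '[ :]' -v dd="$dd" '
BEGIN { fmt = "Batches that were successful on %s between %02d:%02d:00 " \
                "and %02d:%02d:59 : %d\n"
}
# index() is non-zero when "successful" occurs anywhere on the line
$1 == dd && index($0, "successful") { s[$2 + 0, $3 > 29]++ }
END {   for(h = 0; h < 24; h++)
                for(m = 0; m < 2; m++)
                        printf(fmt, dd, h, m * 30, h, m * 30 + 29, s[h, m])
}' log

That way the script never needs to advance st and et in the shell; awk itself walks through all 48 half-hour slots.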