Top Forums > Shell Programming and Scripting > Counting multiple entries in a file using awk
Post 302456353 by Franklin52 on Friday, 24 September 2010, 03:00:50 AM
Another approach:
Code:
awk '!e{e=$1+7200}                                      # first record: set end of the first 2-hour window
$1-e>0{print "Range "++i , c " entries"; e+=7200; c=0}  # past the window: report it, start the next
{c++}                                                   # count the current entry
END{print "Range " ++i , c " entries"}                  # report the final window
' file
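For reference, a minimal run of the script above on hypothetical sorted epoch timestamps (column 1, ascending); the window size is 7200 seconds, i.e. two hours. Note that e advances by one window per boundary crossed, so the script assumes consecutive entries are never more than one window apart.

```shell
# Hypothetical input: sorted epoch timestamps, one per line.
printf '%s\n' 899726401 899726402 899726403 899733602 899733700 > file

awk '!e{e=$1+7200}
$1-e>0{print "Range "++i, c " entries"; e+=7200; c=0}
{c++}
END{print "Range " ++i, c " entries"}
' file
# Range 1 3 entries
# Range 2 2 entries
```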
10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Counting lines in multiple files

Hi, I have a couple of .txt files (say 50 files) in a folder. For each file I need to get the number of lines minus 1 (to exclude the header), then sum the counts of all files and output the total. Is there an efficient way to do this using shell... (7 Replies)
Discussion started by: Lucky Ali
7 Replies
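One hedged way to do the count described above in a single awk pass (filenames and data here are made up): FNR resets at the start of each input file, so FNR>1 skips every file's header line.

```shell
# Hypothetical files, each with a one-line header.
printf 'header\na\nb\n'    > f1.txt   # 2 data lines
printf 'header\nx\ny\nz\n' > f2.txt   # 3 data lines

# FNR resets per file, so FNR>1 counts only non-header lines.
awk 'FNR>1{n++} END{print n+0}' f1.txt f2.txt
# 5
```

With 50 real files, `awk 'FNR>1{n++} END{print n+0}' *.txt` would do the same in one process.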

2. Shell Programming and Scripting

multiple files: counting

In a directory I have 5000 files, each containing around 4000 rows with 10 columns, with a unique string 'AT' located in the 4th column. OM 3328 O BT 268 5.800 7.500 4.700 0.000 1.400 OM 3329 O BT 723 8.500 8.900... (7 Replies)
Discussion started by: asanjuan
7 Replies
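A sketch of one way to count the 'AT' rows per file (sample data invented; the real rows have 10 columns, but only field 4 matters here):

```shell
printf 'OM 3328 O AT 268\nOM 3329 O BT 723\nOM 3330 O AT 101\n' > m1.dat
printf 'OM 1111 O AT 5\n' > m2.dat

# Tally rows whose 4th field is exactly "AT", keyed by file name.
awk '$4=="AT"{c[FILENAME]++} END{for(f in c) print f, c[f]}' m1.dat m2.dat | sort
# m1.dat 2
# m2.dat 1
```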

3. Shell Programming and Scripting

Counting duplicate entries in a file using awk

Hi, I have a very big txt file (around 1 million entries) with IPv4 addresses in the standard a.b.c.d format. The file looks like 10.1.1.1 10.1.1.1 10.1.1.1 10.1.2.4 10.1.2.4 12.1.5.6 . . . . and so on.... There are duplicate/multiple entries for some IP... (3 Replies)
Discussion started by: sajal.bhatia
3 Replies
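A common approach to the duplicate-IP question, sketched on a toy version of the data:

```shell
printf '10.1.1.1\n10.1.1.1\n10.1.1.1\n10.1.2.4\n10.1.2.4\n12.1.5.6\n' > ips.txt

# Count occurrences of each address in one pass.
awk '{c[$1]++} END{for(ip in c) print ip, c[ip]}' ips.txt | sort
# 10.1.1.1 3
# 10.1.2.4 2
# 12.1.5.6 1
```

Since the file is a single column, `sort ips.txt | uniq -c` gives the same counts.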

4. Shell Programming and Scripting

Counting occurrences of all words in multiple files

Hey Unix gurus, I would like to count the number of occurrences of all the words (regardless of case) across multiple files, preferably outputting them in descending order of occurrence. This is well beyond my paltry shell scripting ability. Researching, I can find many scripts/commands that... (4 Replies)
Discussion started by: twjolson
4 Replies
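One classic pipeline for case-insensitive word counts in descending order (a sketch; the sample text is invented, and the ordering of tied counts depends on the sort implementation):

```shell
printf 'The cat and the Cat sat\n' > words.txt

# Lowercase, split into one word per line, count, sort by count descending.
tr 'A-Z' 'a-z' < words.txt | tr -cs 'a-z' '\n' | sort | uniq -c | sort -rn
```

For multiple files, feed them all in through `cat file1 file2 ... |` instead of the redirect.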

5. Shell Programming and Scripting

counting particular record format in a file using AWK

I am trying to count records of a particular format in a file and assign the result to a variable. I tried the command br_count=wc -l "inputfile.dat"| awk -F"|" '{if (NF != "14") print }' but I am not able to get it done. Please share some idea of how to get it done. Thanks in advance (7 Replies)
Discussion started by: siteregsam
7 Replies
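The attempt above mixes wc and awk; awk alone can count the matching records, and command substitution puts the result in the variable. A sketch with 3-field toy records (the real data would test NF == 14):

```shell
printf 'a|b|c\na|b|c|d\na|b|c\n' > inputfile.dat

# Count pipe-delimited records with exactly 3 fields; n+0 prints 0 if none match.
br_count=$(awk -F'|' 'NF==3{n++} END{print n+0}' inputfile.dat)
echo "$br_count"
# 2
```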

6. Shell Programming and Scripting

Counting entries in a file

Hi, I have a very large two column log file in the following format: # Epoch Time IP Address 899726401 112.254.1.0 899726401 112.254.1.0 899726402 154.162.38.0 899726402 160.114.12.0 899726402 165.161.7.0 899726403 ... (39 Replies)
Discussion started by: sajal.bhatia
39 Replies

7. Shell Programming and Scripting

Awk match multiple columns in multiple lines in single file

Hi, Input 7488 7389 chr1.fa chr1.fa 3546 9887 chr5.fa chr9.fa 7387 7898 chrX.fa chr3.fa 7488 7389 chr21.fa chr3.fa 7488 7389 chr1.fa chr1.fa 3546 9887 chr9.fa chr5.fa 7898 7387 chrX.fa chr3.fa Desired Output 7488 7389 chr1.fa chr1.fa 2 3546 9887 chr5.fa chr9.fa 2... (2 Replies)
Discussion started by: jacobs.smith
2 Replies
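A two-pass sketch for the pair-counting question above, treating columns 1-2 and 3-4 as unordered pairs so that swapped rows match (data trimmed to four invented lines):

```shell
cat > pairs.txt <<'EOF'
7488 7389 chr1.fa chr1.fa
3546 9887 chr5.fa chr9.fa
7488 7389 chr1.fa chr1.fa
3546 9887 chr9.fa chr5.fa
EOF

# Pass 1 (NR==FNR) counts each canonical key; pass 2 prints the first
# occurrence of each key with its total count appended.
awk '{k=($1<$2?$1 FS $2:$2 FS $1) FS ($3<$4?$3 FS $4:$4 FS $3)}
NR==FNR{c[k]++; next}
!seen[k]++{print $0, c[k]}' pairs.txt pairs.txt
# 7488 7389 chr1.fa chr1.fa 2
# 3546 9887 chr5.fa chr9.fa 2
```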

8. Shell Programming and Scripting

Counting Multiple Fields with awk/nawk

I am trying to figure out a way in nawk to 1) get a count of the number of times a value appears in field 1 and 2) count each time the same value appears in field 2 for each value of field 1. So for example, if I have a text file with the following: grapes, purple apples, green squash, yellow... (2 Replies)
Discussion started by: he204035
2 Replies
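A sketch of both counts at once, assuming the fields are separated by a comma and a space as in the sample:

```shell
cat > produce.txt <<'EOF'
grapes, purple
apples, green
grapes, purple
grapes, green
EOF

# c1: occurrences of each field-1 value; c2: occurrences of each (field1, field2) pair.
awk -F', ' '{c1[$1]++; c2[$1 FS $2]++}
END{for(k in c1) print k, c1[k]; for(k in c2) print k, c2[k]}' produce.txt | sort
```

The order of `for (k in array)` is unspecified in awk, hence the trailing sort.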

9. Shell Programming and Scripting

Shell script with awk command for counting in a file

Hi, I hope you can help me with the awk command in shell scripting. I want to do the following, but it doesn't work. for i in $REF1 $REF2 $REF3; do awk '{if($n>=0 && $n<=50000){count+=1}} END{print count}' ${DIR}${i} >${DIR}${i}_count.txt done REF1 to REF3 are only variables for .txt... (1 Reply)
Discussion started by: y.g.
1 Replies
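The usual fix for the loop above is to pass the shell variable into awk with -v instead of writing $n inside single quotes, where the shell never expands it (a sketch with an invented file and column):

```shell
n=2   # hypothetical column of interest
printf '1 40000\n2 60000\n3 10000\n' > REF1.txt

# -v copies the shell variable into awk; $col then selects that field.
count=$(awk -v col="$n" '$col>=0 && $col<=50000{c++} END{print c+0}' REF1.txt)
echo "$count"
# 2
```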

10. Shell Programming and Scripting

Counting lines in a file using awk

I want to count the lines of a file using awk only, and not just in the END block like awk 'END{print FNR}', because I want to use the count while the file is being processed. Does anyone know of a way? Thanks a lot. (7 Replies)
Discussion started by: guitarist684
7 Replies
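Two points worth noting for the question above (a sketch on invented data): inside the main body, NR/FNR already holds the running line count; and if the total is needed before END, reading the file twice works:

```shell
printf 'a\nb\nc\n' > lines.txt

# Pass 1 (NR==FNR) just counts; pass 2 can then use the total on every line.
awk 'NR==FNR{total++; next}
FNR==1{print "total: " total}
{print FNR "/" total, $0}' lines.txt lines.txt
# total: 3
# 1/3 a
# 2/3 b
# 3/3 c
```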
ZIPROXYLOGTOOL(1)                     ZIPROXY

NAME
       ziproxylogtool - low-level log analyser

SYNOPSIS
       ziproxylogtool [general_options] [filtering_options]

DESCRIPTION
       ziproxylogtool is a low-level log analyser; its purpose is to be
       invoked by scripts. It is expected to be reasonably fast and generic.

       These programs follow the usual GNU command line syntax, with long
       options starting with two dashes (`-'). A summary of options is
       included below.

GENERAL OPTIONS
       -m, --mode mode
              Output mode:
              "g" - Global stats
              "h" - Per-host stats (accesses, in_bytes, out_bytes, compression %, hostname)
              "f" - Filter mode (filter log entries according to filtering options)

       -i, --in-file filename
              Input file (Ziproxy log file). If unspecified, uses stdin.

       -o, --out-file filename
              Output file (stats output). If unspecified, uses stdout.

       -h, --help
              Display summarized help.

FILTERING OPTIONS
       A filter that is omitted is not applied.

       -1, --epoch-min unix_epoch
              Filter entries starting from that epoch.

       -2, --epoch-max unix_epoch
              Filter entries older than that epoch.

       -3, --bytes-in-min bytes
              Filter entries where incoming_bytes >= <bytes>.

       -4, --bytes-in-max bytes
              Filter entries where incoming_bytes < <bytes>.

       -5, --bytes-out-min bytes
              Filter entries where outgoing_bytes >= <bytes>.

       -6, --bytes-out-max bytes
              Filter entries where outgoing_bytes < <bytes>.

       -7, --delay-min milliseconds
              Filter entries where delay >= <milliseconds>.

       -8, --delay-max milliseconds
              Filter entries where delay < <milliseconds>.

SEE ALSO
       ziproxy(1)

AUTHOR
       ziproxylogtool was written by Daniel Mealha Cabrita. This manual page
       was written by Marcos Talau <marcostalau@gmail.com>.

December 02, 2007
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.