Full Discussion: Grouping and counting
Post 302979557 by Akshay Hegde, 16 August 2016
To retain the original order of the keys, you may try this:

Code:
[akshay@localhost tmp]$ cat file
1234|"ZZZ"|"Date"|"1"|"Y"|"ABC"|""|AA
ABCD|"ZZZ"|"Date"|"1"|"Y"|"ABC"|""|AA
EFGH|"ZZZ"|"Date"|"1"|"Y"|"ABC"|""|CC
IJKLM|"ZZZ"|"Date"|"1"|"Y"|"ABC"|""|CC
EFGH|"ZZZ"|"Date"|"1"|"Y"|"ABC"|""|BB
IJKLM|"ZZZ"|"Date"|"1"|"Y"|"ABC"|""|BB
NOPQ|"ZZZ"|"Date"|"1"|"Y"|"ABC"|""|BB

Code:
[akshay@localhost tmp]$ awk -F\| '!($NF in c){d[++q]=$NF}{c[$NF]++}END{for(i=1; i in d; i++)print d[i],c[d[i]]}' file
AA 2
CC 2
BB 3

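For readability, here is the same logic written out as a commented awk script (just a sketch of the one-liner above; the file name order.awk is hypothetical):

Code:
# order.awk - count occurrences of the last |-separated field,
# printing the counts in the order each value is first seen.
BEGIN { FS = "|" }                      # fields are pipe-separated
{
    if (!($NF in count))                # first time we see this value?
        order[++n] = $NF                # remember its position in the output
    count[$NF]++                        # bump its counter
}
END {
    for (i = 1; i <= n; i++)
        print order[i], count[order[i]]
}

Run it as awk -f order.awk file to get the same output as above.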
 
