Shell Programming and Scripting: HELP - uniq values per column
Post 302984680 by RudiC on Saturday 29th of October 2016, 01:10:06 PM
Jumping in as Don Cragun seems to be logged out at the moment:
Code:
awk '
function AddFieldData(field)    {if ( (field, $field) in data) return                           # if field No. and field contents occurred (and registered) before, quit function
                                                                                                # data is an array whose element contents do not matter, only its indices are used

                                 data[ field, FieldValue [field, ++count[field] ] = $field]     # this is quite complex (array elements are defined on first reference):
                                                                                                # register field No. and field contents as index into data array, and
                                                                                                #   at the same time, create FieldValue array element holding field contents and indexed
                                                                                                #   by field No. and an incremented counter for the field No. 
                                 if (count[field] > maxr) maxr = count[field]                   # keep max count across ALL fields (i.e. the No. of lines to be printed
                                                                                                #   e.g. for three unique values in field2 we need to print three lines)
                                 if (field > maxc) maxc = field                                 # keep max of NF across all lines
                                }

        {for(i = 1; i <= NF; i++) AddFieldData(i)                                               # main: call above function for ALL fields in ALL lines filling the necessary arrays
        }

END     {for (i = 1; i <= maxr; i++)                                                            # print maxr lines with
           for (j = 1; j <= maxc; j++)                                                          # maxc fields, each
                printf("%s%s", FieldValue[j, i], (j == maxc) ? ORS : OFS)                       # a sort of random distribution of unique field values ; after last (maxc)
                                                                                                # field print line feed (ORS) else field separator (OFS, space)
        }
'  file
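
For illustration, here is a minimal sketch of a run, assuming a small whitespace separated sample file (file name and contents are hypothetical, not taken from the thread). The script gathers the unique values of every column and prints them column by column, one unique value per output line:
Code:
$ cat file                      # hypothetical sample input
a 1 x
a 2 x
b 1 y
b 3 x
$ awk '...' file                # the script above, run unchanged on that file
a 1 x
b 2 y
  3

Columns 1 and 3 each hold only two unique values, so the third output line has empty fields in those positions (a bare OFS before and after the 3); that is the padding the printf in the END section produces for FieldValue entries that do not exist.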

uniq(1)                                 User Commands                                 uniq(1)

NAME
       uniq - report or filter out repeated lines in a file

SYNOPSIS
       uniq [-c | -d | -u] [-f fields] [-s char] [input_file [output_file]]
       uniq [-c | -d | -u] [-n] [+m] [input_file [output_file]]

DESCRIPTION
       The uniq utility will read an input file comparing adjacent lines, and write one
       copy of each input line on the output. The second and succeeding copies of repeated
       adjacent input lines will not be written. Repeated lines in the input will not be
       detected if they are not adjacent.

OPTIONS
       The following options are supported:

       -c          Precedes each output line with a count of the number of times the line
                   occurred in the input.

       -d          Suppresses the writing of lines that are not repeated in the input.

       -f fields   Ignores the first fields fields on each input line when doing
                   comparisons, where fields is a positive decimal integer. A field is the
                   maximal string matched by the basic regular expression:
                   [[:blank:]]*[^[:blank:]]*
                   If fields specifies more fields than appear on an input line, a null
                   string will be used for comparison.

       -s chars    Ignores the first chars characters when doing comparisons, where chars
                   is a positive decimal integer. If specified in conjunction with the -f
                   option, the first chars characters after the first fields fields will
                   be ignored. If chars specifies more characters than remain on an input
                   line, a null string will be used for comparison.

       -u          Suppresses the writing of lines that are repeated in the input.

       -n          Equivalent to -f fields with fields set to n.

       +m          Equivalent to -s chars with chars set to m.

OPERANDS
       The following operands are supported:

       input_file   A path name of the input file. If input_file is not specified, or if
                    the input_file is -, the standard input will be used.

       output_file  A path name of the output file. If output_file is not specified, the
                    standard output will be used. The results are unspecified if the file
                    named by output_file is the file named by input_file.

EXAMPLES
       Example 1: Using the uniq command

       The following example lists the contents of the uniq.test file and outputs a copy
       of the repeated lines.

         example% cat uniq.test
         This is a test.
         This is a test.
         TEST.
         Computer.
         TEST.
         TEST.
         Software.
         example% uniq -d uniq.test
         This is a test.
         TEST.
         example%

       The next example outputs just those lines that are not repeated in the uniq.test
       file.

         example% uniq -u uniq.test
         TEST.
         Computer.
         Software.
         example%

       The last example outputs a report with each line preceded by a count of the number
       of times each line occurred in the file:

         example% uniq -c uniq.test
            2 This is a test.
            1 TEST.
            1 Computer.
            2 TEST.
            1 Software.
         example%

ENVIRONMENT VARIABLES
       See environ(5) for descriptions of the following environment variables that affect
       the execution of uniq: LANG, LC_ALL, LC_CTYPE, LC_MESSAGES, and NLSPATH.

EXIT STATUS
       The following exit values are returned:

       0     Successful completion.

       >0    An error occurred.

ATTRIBUTES
       See attributes(5) for descriptions of the following attributes:

       +-----------------------------+-----------------------------+
       |       ATTRIBUTE TYPE        |       ATTRIBUTE VALUE       |
       +-----------------------------+-----------------------------+
       |Availability                 |SUNWesu                      |
       |CSI                          |Enabled                      |
       |Interface Stability          |Standard                     |
       +-----------------------------+-----------------------------+

SEE ALSO
       comm(1), pack(1), pcat(1), sort(1), uncompress(1), attributes(5), environ(5),
       standards(5)

SunOS 5.10                              20 Dec 1996                                  uniq(1)
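
As a small supplement, the EXAMPLES section above does not show the field skipping options; here is a hedged sketch of -f on a hypothetical two column file (the file name and contents are made up for illustration):
Code:
example% cat names.test         # id in field 1, name in field 2
101 alice
102 alice
103 bob
example% uniq -f 1 names.test   # ignore the first field when comparing adjacent lines
101 alice
103 bob
example%

Because the comparison skips field 1, the first two lines compare equal and only the first of them is written.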