sort | uniq question
Post 302493408 by drl, Wednesday 2nd of February 2011, 04:07:49 PM
Hi, dba_frog.
Quote:
Originally Posted by dba_frog
I tried this
Code:
awk '!a[$1$2]++' filename

on this
Code:
01/Feb/2011 -- User Count : 27
31/Jan/2011 -- User Count : 21
02/Feb/2011 -- User Count : 24
30/Jan/2011 -- User Count : 4

and it didn't sort by month and day. But I assumed that was because I didn't specify the correct columns.
The main purpose of this thread is to choose the correct line from among lines that share the same value in a given field.

Although sorting may be involved in some solutions, the purpose of most of the awk code here is to remove duplicates, not to sort.
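
For illustration, here is roughly how that one-liner does its work (a sketch; the array name a and the choice of fields 1 and 2 are simply what was posted above):
Code:
# a[$1$2]++ evaluates to 0 (false) the first time a given ($1,$2)
# key appears, so !a[$1$2]++ is true and awk performs its default
# action: print the line. Later copies of the same key find a
# non-zero count and are suppressed.
awk '!a[$1$2]++' filename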

If you are interested in sorting your data, I suggest that you start a new thread.
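
For completeness, though, a minimal sketch of sorting the sample data by date (assuming GNU sort, whose -M option understands abbreviated month names, and the day/Mon/year layout shown above):
Code:
# Split fields on '/', then sort by year (numeric), month name,
# and day (numeric). Field 3 begins with the year, so -n uses
# its leading digits.
sort -t/ -k3,3n -k2,2M -k1,1n filename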

Best wishes ... cheers, drl
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

sort/uniq

I have a file containing the lines Fred, Fred, Fred, Jim, Fred, Jim, Jim (one per line). If sort is executed on the listed file, shouldn't the output be Fred, Fred, Fred, Fred, Jim, Jim, Jim? (3 Replies)
Discussion started by: jimmyflip
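
For what it's worth, the expectation is correct: sort orders lines lexically and keeps every duplicate, as a quick sketch shows:
Code:
printf 'Fred\nFred\nFred\nJim\nFred\nJim\nJim\n' | sort
# prints Fred four times, then Jim three times; sort -u would collapse them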

2. UNIX for Dummies Questions & Answers

Help with Last,uniq, sort and cut

Using the last, uniq, sort and cut commands, determine how many times the different users have logged in. I know how to use the last command and cut command... I came up with last | cut -f1 -d" " | uniq but I don't know if this is right; can someone please help me... thanks (1 Reply)
Discussion started by: jay1228
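
A sketch of the usual way to finish that pipeline: uniq only collapses adjacent lines, so the names must be sorted first, and uniq -c supplies the per-user counts:
Code:
# Count logins per user, most frequent first
# (last's trailing "wtmp begins" line will appear as one stray entry).
last | cut -f1 -d' ' | sort | uniq -c | sort -rn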

3. Shell Programming and Scripting

sort and uniq in perl

Does anyone have a quick and dirty way of performing a sort and uniq in perl? I have an array with data like: this is bkupArr BOLADVICE_VN this is bkupArr MLT6800PROD2A this is bkupArr MLT6800PROD2A this is bkupArr BOLADVICE_VN_7YR this is bkupArr MLT6800PROD2A I want to sort it... (4 Replies)
Discussion started by: reggiej
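
A sketch of both routes (the file name bkupArr.txt is hypothetical; the data could equally stay inside the perl script):
Code:
# From the shell: sort and de-duplicate in one step.
sort -u bkupArr.txt
# Inside perl, the classic idiom is:
#   my %seen;
#   my @sorted_uniq = grep { !$seen{$_}++ } sort @bkupArr;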

4. Shell Programming and Scripting

Sort, Uniq, Duplicates

Input File is : ------------- 25060008,0040,03, 25136437,0030,03, 25069457,0040,02, 80303438,0014,03,1st 80321837,0009,03,1st 80321977,0009,03,1st 80341345,0007,03,1st 84176527,0047,03,1st 84176527,0047,03, 20000735,0018,03,1st 25060008,0040,03, I am using the following in the script... (5 Replies)
Discussion started by: Amruta Pitkar
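
One hedged possibility, assuming field 1 is the key and any one record per key will do (GNU sort; -u keeps a single line per equal key, but does not let you choose which):
Code:
sort -t, -k1,1 -u inputfile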

5. Shell Programming and Scripting

Help with Uniq and sort

The key is the first field; I want only one unique record per first-field value in the file. I want the output as ... or output as ... Appreciate help on this (4 Replies)
Discussion started by: pinnacle
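
Assuming whitespace-separated fields, the idiom from the main post applies directly, with field 1 alone as the key:
Code:
awk '!a[$1]++' file    # keep the first record seen for each first-field value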

6. Shell Programming and Scripting

Sort and uniq after comparision

Hi All, I have a text file with the format shown below. Some of the records are duplicated, with the only exception being the date (field 15). I want to compare all duplicate records using the subscriber number (field 7) and keep only the records with the greater date. ... (1 Reply)
Discussion started by: nua7
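
A sketch, assuming comma-separated fields and a date in field 15 that sorts correctly as plain text (e.g. YYYYMMDD): sort so the latest date comes first within each subscriber, then keep the first line per subscriber:
Code:
sort -t, -k7,7 -k15,15r file.txt | awk -F, '!seen[$7]++'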

7. Shell Programming and Scripting

Sort field and uniq

I have a flatfile A.txt 2012/12/04 14:06:07 |trees|Boards 2, 3|denver|mekong|mekong12 2012/12/04 17:07:22 |trees|Boards 2, 3|denver|mekong|mekong12 2012/12/04 17:13:27 |trees|Boards 2, 3|denver|mekong|mekong12 2012/12/04 14:07:39 |rain|Boards 1|tampa|merced|merced11 How do I sort and get... (3 Replies)
Discussion started by: sabercats
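
One possibility (GNU sort), assuming the aim is one line per combination of fields 2 onward, so that rows differing only in the timestamp collapse to a single record (which timestamp survives is unspecified):
Code:
sort -t'|' -k2 -u A.txt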

8. Shell Programming and Scripting

Sort uniq or awk

Hi again, I have files with the following contents: datetime,ip1,port1,ip2,port2,number. How would I find out how many times the ip1 field shows up in a particular file? Then how would I find out how many times ip1 and port2 show up together? Please mind that the file may contain 100k lines. (8 Replies)
Discussion started by: LDHB2012
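
Sketches for both counts, assuming the comma-separated layout above:
Code:
# How often each ip1 (field 2) appears:
cut -d, -f2 file | sort | uniq -c
# How often each (ip1, port2) pair appears:
awk -F, '{ c[$2 "," $5]++ } END { for (k in c) print c[k], k }' file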

9. Shell Programming and Scripting

Uniq or sort -u or similar only between { }

Hi! I am trying to remove doubled entries in a text file, but only between delimiters, as in the example below; I don't know how to do that with sort or similar. input: { aaa aaa } { aaa aaa } output: { aaa } { (8 Replies)
Discussion started by: fugitivus
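
A sketch in awk: clear the seen-array at every opening brace, so duplicates are suppressed only within each { ... } block (delete on a whole array is a widespread awk extension; split("", a) is the portable spelling):
Code:
awk '$1 == "{" { delete a } !a[$0]++' input.txt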

10. UNIX for Dummies Questions & Answers

Uniq and sort -u

Hello all, need to pick your brains. I have a 10 GB file where each row is a name, and I am expecting about 50 names in total, so there are a lot of repetitions in clusters. I want to do a sort -u file. Will it be considerably faster or slower to use a uniq before piping it to sort... (3 Replies)
Discussion started by: senhia83
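
Worth noting: uniq only collapses adjacent duplicates, so it cannot replace the sort, but with long clustered runs it can shrink what sort has to read. With only ~50 distinct values, awk can also collect the set in one pass without sorting 10 GB at all:
Code:
awk '!a[$0]++' bigfile    # unique names, first-seen order, one pass
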
uniq(1) 						      General Commands Manual							   uniq(1)

Name
       uniq - report repeated lines in a file

Syntax
       uniq [-udc [+n] [-n]] [input [output]]

Description
       The uniq command reads the input file, comparing adjacent lines.  In the normal case, the second and succeeding copies of repeated
       lines are removed; the remainder is written on the output file.  Note that repeated lines must be adjacent in order to be found.  For
       further information, see sort(1).

Options
       The n arguments specify skipping an initial portion of each line in the comparison:

       -n Skips specified number of fields.  A field is defined as a string of non-space, non-tab characters separated by tabs and spaces from its
	  neighbors.

       +n Skips specified number of characters in addition to fields.  Fields are skipped before characters.

       -c Displays number of repetitions, if any, for each line.

       -d Displays only lines that were repeated.

       -u Displays only unique (nonrepeated) lines.

       If the -u flag is used, just the lines that are not repeated in the original file are output.  The -d option specifies  that  one  copy	of
       just the repeated lines is to be written.  The normal mode output is the union of the -u and -d mode outputs.

       The  -c option supersedes -u and -d and generates an output report in default style but with each line preceded by a count of the number of
       times it occurred.
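
Examples
       A few illustrative invocations (sketches; words.txt is a hypothetical file name).  The input is sorted first, because uniq only
       compares adjacent lines:

              sort words.txt | uniq       # one copy of every line
              sort words.txt | uniq -c    # each line preceded by its repetition count
              sort words.txt | uniq -d    # only the lines that were repeated
              sort words.txt | uniq -u    # only the lines that were not repeated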

See Also
       comm(1), sort(1)
