Full Discussion: Remove Doubles Without Sort?
Post 302742901 by rdcwayx in UNIX for Dummies Questions & Answers, Tuesday 11 December 2012, 07:58:55 PM
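For reference, the usual way to drop duplicate lines without sorting is an awk one-liner that keeps only the first occurrence of each line. This is a sketch of the common idiom, not necessarily the exact command from the post; "file" is a placeholder name:

    # seen[$0] is 0 the first time a line appears, so !seen[$0]++ is true
    # exactly once per distinct line; awk's default action prints the line.
    awk '!seen[$0]++' file

Unlike sort -u or sort | uniq, this preserves the original line order and needs no sorting pass, at the cost of holding every distinct line in memory.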
 

10 More Discussions You Might Find Interesting

1. Programming

long doubles

hey there, i've been trying to calculate the first 10000 fibonacci numbers using a long double. weird thing is that from a certain value it returns Inf. i'm declaring the vars as long double var; and printing them to a file using: fprintf(filepointer, "%.0Ld\n", var); am i doing... (1 Reply)
Discussion started by: crashnburn

2. Solaris

How to remove duplicate records with out sort

Can anyone give me a command to delete duplicate records without sort? Suppose the records are like below:
345,bcd,789
123,abc,456
234,abc,456
712,bcd,789
The output should be:
345,bcd,789
123,abc,456
The key for the records is the 2nd and 3rd fields; fields are separated by commas (,). (2 Replies)
Discussion started by: svenkatareddy
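A sketch of one way to do this with awk, keyed on the 2nd and 3rd comma-separated fields; the file name is a placeholder:

    # -F, splits on commas; the key $2,$3 joins fields 2 and 3 with SUBSEP,
    # so only the first record for each (field2, field3) pair is printed.
    awk -F, '!seen[$2,$3]++' records.txt

On the sample input this prints 345,bcd,789 and 123,abc,456, matching the expected output.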

3. Shell Programming and Scripting

How to remove duplicate records with out sort

Can anyone give me a command to delete duplicate records without sort? Suppose the records are like below:
345,bcd,789
123,abc,456
234,abc,456
712,bcd,789
The output should be:
345,bcd,789
123,abc,456
The key for the records is the 2nd and 3rd fields; fields are separated by commas (,). (19 Replies)
Discussion started by: svenkatareddy

4. UNIX Desktop Questions & Answers

need help writing a program to look for doubles

to determine if two doubles are equal, we check to see if their absolute difference is very close to zero... if two numbers are less than .00001 apart, they're equal. keep a count field in each record (as you did in p5). once the list is complete, ask the user to see if an element is on... (2 Replies)
Discussion started by: rickym2626
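A sketch of that epsilon comparison in awk; the threshold .00001 is from the post, the sample values are made up:

    # Two doubles count as equal when their absolute difference is under 1e-5.
    awk -v a=1.00001 -v b=1.000012 'BEGIN {
        d = a - b
        if (d < 0) d = -d                      # absolute difference
        print ((d < 0.00001) ? "equal" : "not equal")
    }'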

5. Shell Programming and Scripting

remove duplicates and sort

Hi, I'm using the below command to sort and remove duplicates in a file, but I need to apply it to the same file instead of redirecting the output to another. Thanks (6 Replies)
Discussion started by: dvah
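sort can write back to its input file safely with -o, because it reads all of its input before opening the output, so no separate redirect or temp file is needed; a sketch, with file.txt as a placeholder:

    # Sort and de-duplicate file.txt in place; -o may name the input file.
    sort -u -o file.txt file.txt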

6. Shell Programming and Scripting

awk syntax mistake doubles desired output

I am trying to add a line to a BASH shell script to print out a large variable-length table on a web page. I am very new to this obviously, but I tried this with awk and it prints out every line twice. What am I doing wrong? echo "1^2^3%4^5^6%7^8^9%" | awk 'BEGIN { RS="%"; FS="^"; } {for (i =... (6 Replies)
Discussion started by: awknewb123
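The full script is truncated above, so the cause of the doubled output is a guess; a common one is printing inside the field loop and then printing $0 again. A sketch that emits each %-separated record exactly once, assuming the goal is one HTML table row per record:

    # Records split on "%", fields on "^"; the trailing "%" leaves a final
    # record holding only a newline, which the NF > 1 guard skips.
    echo "1^2^3%4^5^6%7^8^9%" | awk 'BEGIN { RS = "%"; FS = "^" }
    NF > 1 {
        printf "<tr>"
        for (i = 1; i <= NF; i++) printf "<td>%s</td>", $i
        print "</tr>"
    }'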

7. UNIX for Dummies Questions & Answers

Grep words with X doubles only

Hi! I'm trying to figure out how to find words with X number of doubles, only. I'm searching a dictionary, (one word per line). For instance, if you want to find words containing only one pair of double letters, you could do something like this: egrep '(.)\1' wordlist.txt |egrep -v '(.)\1.*(.)\2'... (3 Replies)
Discussion started by: sudon't
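Extending the pattern from the post, words with exactly two doubled letters match two pairs but not three; wordlist.txt is the file name used above:

    # Keep words with at least two doubled letters, then drop those with three.
    egrep '(.)\1.*(.)\2' wordlist.txt | egrep -v '(.)\1.*(.)\2.*(.)\3'

(Backreferences like \1 are a GNU egrep extension, the same one the post relies on.)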

8. Shell Programming and Scripting

Bash - remove duplicates without sort

I need to use bash to remove duplicates without using sort first. I cannot use: cat file | sort | uniq But when I use only cat file | uniq, some duplicates are not removed. (4 Replies)
Discussion started by: locoroco
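uniq only collapses adjacent repeats, which is why duplicates separated by other lines survive; a quick demonstration, together with the order-preserving awk idiom from the top of this page:

    # uniq leaves the second "a" because "b" sits between the two copies.
    printf 'a\nb\na\n' | uniq                 # prints: a, b, a
    # awk removes all duplicates while keeping the original order.
    printf 'a\nb\na\n' | awk '!seen[$0]++'    # prints: a, b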

9. Shell Programming and Scripting

Sort and Remove duplicates

Here is my task: I need to sort two input files and remove duplicates in the output files. Sort by 13 characters starting at position 97, ascending, then by 1 character at position 96, ascending. If duplicates are found, retain the first value in the file. The input files are variable length, convert... (4 Replies)
Discussion started by: ysvsr1
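A sketch with GNU sort, treating the whole line as field 1 and addressing the keys by character position; whether this matches the intended record layout is an assumption, and -s (stable sort) is what keeps the first occurrence from the input among equal keys:

    # Primary key: 13 chars starting at column 97; secondary: 1 char at 96.
    # The awk filter keeps the first line of each run of equal keys
    # (columns 96-109 combined).
    sort -s -k1.97,1.109 -k1.96,1.96 infile |
    awk '{ key = substr($0, 96, 14) } key != prev { print; prev = key }'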

10. Shell Programming and Scripting

Concatenate and sort to remove duplicates

Following is the input. The 1st and 3rd blocks are the same (a block starts with '*' and ends before a blank line); the 2nd and 4th blocks are also the same: cat <file> * Wed Feb 24 2016 Tariq Saeed <tariq.x.saeed@mail.com> 2.0.7-1.0.7 - add vmcore dump support for ocfs2 * Mon Jun 8 2015 Brian Maly... (4 Replies)
Discussion started by: Paras Pandey
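A sketch using awk's paragraph mode, where an empty RS makes each blank-line-separated block a single record; the file name is a placeholder:

    # RS= reads blank-line-separated blocks as whole records; ORS restores
    # the blank line on output. Only the first copy of each block is printed.
    awk -v RS= -v ORS='\n\n' '!seen[$0]++' changelog.txt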
UNIQ(1)                   BSD General Commands Manual                  UNIQ(1)

NAME
     uniq -- report or filter out repeated lines in a file

SYNOPSIS
     uniq [-cdu] [-f fields] [-s chars] [input_file [output_file]]

DESCRIPTION
     The uniq utility reads the standard input comparing adjacent lines, and
     writes a copy of each unique input line to the standard output. The
     second and succeeding copies of identical adjacent input lines are not
     written. Repeated lines in the input will not be detected if they are
     not adjacent, so it may be necessary to sort the files first.

     The following options are available:

     -c         Precede each output line with the count of the number of
                times the line occurred in the input, followed by a single
                space.

     -d         Don't output lines that are not repeated in the input.

     -f fields  Ignore the first fields in each input line when doing
                comparisons. A field is a string of non-blank characters
                separated from adjacent fields by blanks. Field numbers are
                one based, i.e. the first field is field one.

     -s chars   Ignore the first chars characters in each input line when
                doing comparisons. If specified in conjunction with the -f
                option, the first chars characters after the first fields
                fields will be ignored. Character numbers are one based,
                i.e. the first character is character one.

     -u         Don't output lines that are repeated in the input.

     If additional arguments are specified on the command line, the first
     such argument is used as the name of an input file, the second is used
     as the name of an output file.

     The uniq utility exits 0 on success, and >0 if an error occurs.

COMPATIBILITY
     The historic +number and -number options have been deprecated but are
     still supported in this implementation.

SEE ALSO
     sort(1)

STANDARDS
     The uniq utility is expected to be IEEE Std 1003.2 (``POSIX.2'')
     compatible.

BSD                             January 6, 2007                            BSD
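A quick illustration of -c from the manual above; the sample input is made up:

    # Sorting first makes duplicates adjacent so uniq can count them;
    # -c prefixes each surviving line with its occurrence count.
    printf 'b\na\nb\nb\n' | sort | uniq -c
    #    1 a
    #    3 b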