Remove duplicates from a file
Posted by vidyadhar85 on 09 May 2013, 04:39 AM
Quote:
Originally Posted by saga20
Here are the sample input and output files.
Input (Emp.txt):
Empid Empname Joining_Date Salary
101 ABC 11/01/1991 5000
104 MNO 17/04/1984 12000
102 DEF 15/04/1998 8000
101 ABC 11/01/1991 5000
107 XYZ 24/09/1978 6000
104 MNO 17/04/1984 12000
101 ABC 11/01/1991 5000

Output (Emp.txt):
Empid Empname Joining_Date Salary
101 ABC 11/01/1991 5000
104 MNO 17/04/1984 12000
102 DEF 15/04/1998 8000
107 XYZ 24/09/1978 6000

I want the output in the same file...
sort -u will collapse the duplicate lines, but note that it also sorts the file, so it will not keep your original line order (and the header line gets sorted in with the data). To reproduce the output you show, keep the first occurrence of each line with awk instead; see the sketch below.
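A minimal sketch, assuming Emp.txt is exactly as shown; the temp file is needed because redirecting a file onto itself truncates it (the name Emp.tmp is illustrative):

    # print each line only the first time it is seen, preserving order
    awk '!seen[$0]++' Emp.txt > Emp.tmp && mv Emp.tmp Emp.txt

If moreutils is installed, awk '!seen[$0]++' Emp.txt | sponge Emp.txt writes back in place in one pipeline.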
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Remove duplicates from a file at a specific location

How can I remove duplicate lines from a file? For example:

sample123456Sample
testing123456testing
XXXXX131323XXXXX
YYYYY423432YYYYY
fsdfdsf123456gsdfdsd

All the duplicates in character columns 6-12 must be deleted. I want to keep the first row; if the same value comes in the given range I want to... (1 Reply)
Discussion started by: gopikgunda
1 Reply
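A sketch, assuming fixed-width records where the key is the characters at positions 6 through 12 and the first occurrence should be kept (adjust the offsets if the range is counted differently):

    # substr($0, 6, 7) extracts the 7 characters at positions 6-12
    awk '!seen[substr($0, 6, 7)]++' infile > outfile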

2. Shell Programming and Scripting

Remove duplicates within a block in a file - help required

Hi, I have a file in the following format:

name-a
age -12
address-123
age-12
phone-22222
============
name-ab
age -11
address-123
age-11
phone-222223
============
name-abc
age -12
address-1234
age-12
phone-2222223
============
(2 Replies)
Discussion started by: nipun_garg
2 Replies
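A sketch, assuming duplicate lines should be removed within each ====-delimited block, with the first occurrence kept; whitespace is stripped from the comparison key so that age -12 and age-12 count as the same line:

    awk '/^=+$/ { split("", seen); print; next }   # block separator: reset state
         { key = $0; gsub(/[ \t]/, "", key) }      # ignore spacing differences
         !seen[key]++' infile > outfile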

3. Shell Programming and Scripting

Shell script to remove duplicate lines in a file

Hi, I am writing a shell script that needs to remove duplicate lines within a file by category. Example:

section a
a
c
b
a
section b
a
b
a
c

I need to remove the duplicates within the category without removing the duplicates from the 2 different sections (one of the a's in section... (1 Reply)
Discussion started by: RichElks
1 Reply
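A sketch, assuming section headers begin with the word section and that within each section only the first occurrence of a line survives:

    # key each line by its enclosing section so dedup never crosses sections
    awk '/^section/ { sec = $0; print; next }
         !seen[sec, $0]++' infile > outfile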

4. Shell Programming and Scripting

Remove duplicates from end of file

Input:
A
B
C
A
C

Output:
B
A
C

From the input file it should remove duplicates from the end, keeping the last occurrence of each line without otherwise changing the order. (5 Replies)
Discussion started by: lavnayas
5 Replies
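A sketch: reverse the file, keep the first occurrence of each line (which is the last in the original), then reverse back. tac is GNU coreutils; on BSD, tail -r does the same job:

    tac infile | awk '!seen[$0]++' | tac > outfile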

5. Shell Programming and Scripting

Remove duplicates from a file

Hi, I need to remove duplicates from a file. The file will be like this:

0003 10101 20100120 abcdefghi
0003 10101 20100121 abcdefghi
0003 10101 20100122 abcdefghi
0003 10102 20100120 abcdefghi
0003 10103 20100120 abcdefghi
0003 10103 20100121 abcdefghi

Here if the first column and... (6 Replies)
Discussion started by: gpaulose
6 Replies
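The question is cut off, but assuming the key is the first two columns and the first occurrence wins, a sketch:

    # $1 and $2 are joined with SUBSEP into one composite array key
    awk '!seen[$1, $2]++' infile > outfile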

6. Shell Programming and Scripting

Search based on 1,2,4,5 columns and remove duplicates in the same file.

Hi, I am unable to find the duplicates in a file based on the 1st, 2nd, 4th, and 5th columns, and then remove those duplicates in the same file. Source filename: Filename.csv

"1","ccc","information","5000","temp","concept","new"
"1","ddd","information","6000","temp","concept","new"
... (2 Replies)
Discussion started by: onesuri
2 Replies
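A sketch, assuming simple CSV with no embedded commas (the quoted fields compare fine as literal strings) and an in-place rewrite via a temp file:

    awk -F, '!seen[$1, $2, $4, $5]++' Filename.csv > Filename.tmp &&
        mv Filename.tmp Filename.csv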

7. Shell Programming and Scripting

How to remove duplicates from the .dat file

All, I have a file 1181CUSTOMER-L061411_003500.dat.Z with duplicate records in it:

bash-2.05$ zcat 1181CUSTOMER-L061411_003500.dat.Z | grep "90876251S"
90876251S|ABG, AN ADAYANA COMPANY|3550 DEPAUW BLVD|||US|IN|INDIANAPOLIS||DAL|46268||||||GEN|||||||USD|||ABG, AN ADAYANA... (3 Replies)
Discussion started by: Oracle_User
3 Replies
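A sketch, assuming whole records repeat verbatim and the result should stay compressed (compress writes the .Z format that zcat reads; the output name is illustrative):

    zcat 1181CUSTOMER-L061411_003500.dat.Z | awk '!seen[$0]++' |
        compress > 1181CUSTOMER-deduped.dat.Z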

8. UNIX for Dummies Questions & Answers

Remove duplicates and keep them in a separate file

Hi, I have a tab-separated file and I want to remove all the rows that have duplicates. The duplicates I need to check are in column 13. I have tried to use awk, but I have no idea how to also keep the duplicates in a file:

awk 'FNR==NR{a[$13]++;next}(a[$13]>1)' tomodify.txt tomodify.txt > new.txt
... (4 Replies)
Discussion started by: flacchy
4 Replies
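A two-pass sketch, assuming every row whose column-13 value occurs more than once should go to a separate file (the names new.txt and dups.txt are illustrative):

    awk -F'\t' 'FNR == NR { c[$13]++; next }            # pass 1: count keys
                c[$13] > 1 { print > "dups.txt"; next } # pass 2: route dupes
                { print > "new.txt" }' tomodify.txt tomodify.txt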

9. Shell Programming and Scripting

To remove duplicates from a pipe-delimited file

Hi, can someone please help me remove duplicates from a pipe-delimited file based on the first two columns?

123|asdf|sfsd|qwrer
431|yui|qwer|opws
123|asdf|pol|njio

Here my first record and last record are duplicates. As per my requirement I want all the latest records in one file. I want the... (12 Replies)
Discussion started by: ginrkf
12 Replies
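A sketch that keeps the latest occurrence of each first-two-column key: reversing first means "first seen" is really "last in the original":

    tac infile | awk -F'|' '!seen[$1, $2]++' | tac > outfile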

10. UNIX for Advanced & Expert Users

Remove duplicates in flat file

Hi all, I have an issue while loading a flat file into the DB: it is taking a long time. When I analyzed it, I found duplicate entries in the flat file. There are two types of duplicate entry: 1) the entire row is a duplicate (I can use sort | uniq to remove those); 2) the... (4 Replies)
Discussion started by: samjoshuab
4 Replies
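For case 1, a sketch that avoids the reordering sort | uniq causes (case 2 is cut off, so its key is unknown, but the same pattern works once you key on the relevant fields):

    # whole-line dedup without sorting
    awk '!seen[$0]++' flatfile > flatfile.dedup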