UNIX for Dummies Questions & Answers
Posted by orahi001 on Thursday 24th of January 2008, 01:04:28 PM
removing duplicates and sort -k

Hello experts,

I am trying to remove all lines in a CSV file where the 2nd column is a duplicate. I am trying to use sort with the key option:


sort -u -k 2,2 File.csv > Output.csv


File.csv
File Name|Document Name|Document Title|Organization
Word Doc 1.doc|Word Document|Sample Doc|Org 1
Exl Doc 1.xls|Excel Sheet|Sample Sheet|Org 2
Pdf File 1.pdf|Pdf|Sample pdf|Org3
Exl Sheet 2.xls|Excel Sheet|Test Spreadsheet|Org 2



I want Output.csv to omit the 2nd Excel Sheet line:
Output.csv
File Name|Document Name|Document Title|Organization
Word Doc 1.doc|Word Document|Sample Doc|Org 1
Exl Doc 1.xls|Excel Sheet|Sample Sheet|Org 2
Pdf File 1.pdf|Pdf|Sample pdf|Org3


I believe the -k option uses whitespace by default to determine the start and end fields.

My field separator is a '|', and I want to remove the line with the duplicate Document Name (2nd column).

Can this be done using the -k option of sort, or is there another way to perform this task?
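For what it's worth, sort does accept a custom field separator via -t, so sort -t'|' -k2,2 -u would run here; but it re-orders the file, treats the header line as data, and which of the duplicate lines survives is not guaranteed. A small sketch with awk that keeps the first occurrence of each Document Name and preserves the original line order, using the sample file above:

```shell
# Recreate the sample file from the post
cat > File.csv <<'EOF'
File Name|Document Name|Document Title|Organization
Word Doc 1.doc|Word Document|Sample Doc|Org 1
Exl Doc 1.xls|Excel Sheet|Sample Sheet|Org 2
Pdf File 1.pdf|Pdf|Sample pdf|Org3
Exl Sheet 2.xls|Excel Sheet|Test Spreadsheet|Org 2
EOF

# Keep only the first line seen for each value of field 2
# (Document Name); original order and header are preserved.
awk -F'|' '!seen[$2]++' File.csv > Output.csv

# sort's equivalent (reorders the lines, and which duplicate
# survives is unspecified):
#   sort -t'|' -k2,2 -u File.csv
```

The awk idiom works because seen[$2]++ is 0 (false) the first time a key appears and non-zero afterwards, so only first occurrences pass the filter.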


thanks
 
