Full Discussion: Help in removing duplicates
Post 302612253 by Scrutinizer on Sunday 25th of March 2012 11:20:15 AM
Hi, did you try searching the forum first for "duplicate"?
Code:
awk '!A[$1]++' infile
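In case it helps: A[$1]++ counts how many times the value in column 1 has been seen, and the leading ! is true only on the first sighting, so the one-liner prints the first line for each distinct column-1 value and drops the rest. A quick demonstration with made-up data:

Code:
$ printf 'x 1\ny 2\nx 3\n' | awk '!A[$1]++'
x 1
y 2

To deduplicate on whole lines instead of just the first field, use $0 as the key: awk '!A[$0]++' infile.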

 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Removing duplicates

Hi, I've been trying to remove duplicate lines with similar columns in a fixed-width file and it's not working. I've searched the forum but nothing comes close. I have a sample file: 27147140631203RA CCD * 27147140631203RA PPN * 37147140631207RD AAA 47147140631203RD JNA... (12 Replies)
Discussion started by: giannicello
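For the fixed-width case above, a hedged sketch: assuming the duplicate key is the leading fixed-width field (16 characters in the sample, which is a guess, as is the file name), substr() can carve it out so differences elsewhere on the line don't matter:

Code:
# keep the first line per key; the 16-character width is an assumption from the sample
awk '!seen[substr($0, 1, 16)]++' fixedwidth.txt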

2. UNIX for Dummies Questions & Answers

removing duplicates and sort -k

Hello experts, I am trying to remove all lines in a csv file where the 2nd column is a duplicate. I am trying to use sort with the key parameter: sort -u -k 2,2 File.csv > Output.csv File.csv File Name|Document Name|Document Title|Organization Word Doc 1.doc|Word Document|Sample... (3 Replies)
Discussion started by: orahi001
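The likely snag in the sort attempt above is that sort splits fields on blanks by default, so -k2,2 never sees the pipe-delimited columns. A sketch, assuming | is the only delimiter in the file:

Code:
# -t'|' sets the field separator; -u -k2,2 keeps one line per distinct column-2 value
sort -t'|' -u -k2,2 File.csv > Output.csv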

3. Shell Programming and Scripting

removing duplicates

Hi, I have a file that is a list of people & their credentials which I receive frequently. The issue is that when I cat this list, duplicate entries exist & are NOT CONSECUTIVE (i.e. uniq -1 may not work here). I'm trying to write a script that will remove duplicate entries the script can... (5 Replies)
Discussion started by: stevie_velvet
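Because the duplicates are not consecutive, uniq (which only compares adjacent lines) won't help unless the file is sorted first; awk needs no sorting and preserves the original order. A sketch, with the file name as a placeholder:

Code:
# print each whole line only the first time it appears; original order kept
awk '!seen[$0]++' credentials.list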

4. Shell Programming and Scripting

Removing duplicates

Hi, I have a file in the below format: test test (10) to to (25) see see (45) and I need the output in the format of test 10 to 25 see 45. Can someone help me? (6 Replies)
Discussion started by: imdadulla
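This one is really a reshaping job: keep the first word and strip the parentheses from the count. A minimal sketch, assuming every line has exactly the shape word word (number):

Code:
# print field 1, then field 3 with the parentheses removed
awk '{ gsub(/[()]/, "", $3); print $1, $3 }' infile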

5. UNIX for Advanced & Expert Users

removing duplicates.

Hi All, in UNIX we have a file where we have to remove the duplicates based on one specific column. Can anybody tell me the command? ex: file1 id,name 1,ww 2,qwq 2,asas 3,asa 4,asas 4,asas o/p: 1,ww 2,qwq 3,asa (7 Replies)
Discussion started by: raju4u
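This is the same idiom as the answer above, with the field separator set to a comma. One caveat: the sample output also drops both 4,asas lines, so if the poster actually wants repeated ids removed entirely rather than keeping the first, the two-pass approach in discussion 7 below applies instead. Keeping the first occurrence:

Code:
# keep the first line for each distinct comma-separated first field
awk -F, '!a[$1]++' file1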

6. Shell Programming and Scripting

Removing duplicates

I have a test file with the following 2 columns: Col 1 | Col 2 T1 | 1 <= remove T5 | 1 T4 | 2 T1 | 3 T3 | 3 T4 | 1 <= remove T1 | 2 <= remove T3 ... (7 Replies)
Discussion started by: gctex

7. Emergency UNIX and Linux Support

Removing all the duplicates

I want to remove all the duplicates in a file. I don't want even a single entry. For the input data: 12345|12|34 12345|13|23 3456|12|90 15670|12|13 12345|10|14 3456|12|13 I need the below data in one file: 15670|12|13 and the below data in another file (9 Replies)
Discussion started by: pandeesh
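Here every repeated key has to disappear completely, so keep-the-first is not enough; the usual shape is a two-pass awk that counts keys on the first read and routes lines on the second. A sketch, assuming the first |-delimited field is the key and the output file names are placeholders:

Code:
# pass 1 counts first fields; pass 2 writes singletons and repeats to separate files
awk -F'|' 'NR==FNR { cnt[$1]++; next }
           { print > (cnt[$1] == 1 ? "unique.txt" : "dups.txt") }' input.txt input.txt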

8. UNIX for Dummies Questions & Answers

Removing duplicates from a file

Hi All, I am merging files coming from 2 different systems; while doing that I am getting duplicate entries in the merged file: I,01,000131,764,2,4.00 I,01,000131,765,2,4.00 I,01,000131,772,2,4.00 I,01,000131,773,2,4.00 I,01,000168,762,2,2.00 I,01,000168,763,2,2.00... (5 Replies)
Discussion started by: Sri3001
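If the merged records are exact duplicates, whole-line deduplication is enough. Two sketches (file names are examples); the awk form keeps the original order, sort -u does not:

Code:
sort -u merged.txt > deduped.txt     # sorted output, duplicates collapsed
awk '!seen[$0]++' merged.txt         # original order preserved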

9. Shell Programming and Scripting

Removing duplicates except the last occurrence

Hi All, I have a file like below: @DB_FCTS\src\Data\Scripts\Delete_CU_OM_BIL_PRT_STMT_TYP.sql @DB_FCTS\src\Data\Scripts\Delete_CDP_BILL_LBL_MSG.sql @DB_FCTS\src\Data\Scripts\Delete_OM_BIDDR.sql @DB_FCTS\src\Data\Scripts\Insert_CU_OM_LBL_MSG.sql... (11 Replies)
Discussion started by: mechvijays
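The !seen[$0]++ idiom keeps the first occurrence, so to keep the last one the classic trick is reverse, deduplicate, reverse back. A sketch with a placeholder file name (tac is GNU coreutils; tail -r is the common substitute where tac is missing):

Code:
# keep only the last occurrence of each line
tac scripts.lst | awk '!seen[$0]++' | tac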

10. Shell Programming and Scripting

Removing duplicates from new file

I have two files. I want to remove/delete all the duplicate lines in file2, viz. unix, unix2, unix3 (2 Replies)
Discussion started by: sagar_1986
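Reading the truncated question as "delete from file2 every line that also appears in file1" (an interpretation, since the preview is cut off), two common sketches:

Code:
# awk: remember file1's lines, then print only file2 lines not among them
awk 'NR==FNR { a[$0]; next } !($0 in a)' file1 file2

# grep: -x whole-line match, -F fixed strings, -f read patterns from file1, -v invert
grep -vxFf file1 file2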