remove all duplicate lines from all files in one folder
Post 302321075 by ghostdog74, in Shell Programming and Scripting, Saturday 30 May 2009, 02:55 AM
@dev, the requirement is different. The OP still needs to keep his files.
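For the thread's actual question (deduplicating every file in a folder while keeping each file), a minimal sketch, assuming whole-line duplicates, that input order should be preserved, and an illustrative path:

Code:
# rewrite each regular file with repeated lines removed,
# keeping the first occurrence of every line
for f in /path/to/folder/*; do
    [ -f "$f" ] || continue                                # skip subdirectories
    awk '!seen[$0]++' "$f" > "$f.tmp" && mv "$f.tmp" "$f"  # dedup via a temp file
done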
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

how to remove duplicate lines

I have the following file content (3 fields per line): 23 888 10.0.0.1 dfh 787 10.0.0.2 dssf dgfas 10.0.0.3 dsgas dg 10.0.0.4 df dasa 10.0.0.5 df dag 10.0.0.5 dfd dfdas 10.0.0.5 dfd dfd 10.0.0.6 daf nfd 10.0.0.6 ... As can be seen, the third field is an IP address and the file is sorted on it, but... (3 Replies)
Discussion started by: fredao
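A sketch for this one, assuming the goal is to keep only the first line for each IP address in the third field (the filename is illustrative):

Code:
# keep the first line seen for each value of field 3
awk '!seen[$3]++' input.txt

Since the file is already sorted on that field, comparing against the previous value avoids building a table in memory:

Code:
awk '$3 != prev { print; prev = $3 }' input.txt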

2. UNIX for Dummies Questions & Answers

Remove Duplicate lines from File

I have a log file "logreport" that contains several lines as seen below: 04:20:00 /usr/lib/snmp/snmpdx: Agent snmpd appeared dead but responded to ping 06:38:08 /usr/lib/snmp/snmpdx: Agent snmpd appeared dead but responded to ping 07:11:05 /usr/lib/snmp/snmpdx: Agent snmpd appeared dead but... (18 Replies)
Discussion started by: Nysif Steve
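Here the timestamps differ but the message repeats, so a sketch assuming duplicates are defined by everything after the leading HH:MM:SS timestamp:

Code:
# build the key by stripping the timestamp, keep the first occurrence of each key
awk '{ key = $0; sub(/^[0-9][0-9]:[0-9][0-9]:[0-9][0-9] +/, "", key) } !seen[key]++' logreport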

3. Shell Programming and Scripting

perl/shell need help to remove duplicate lines from files

Dear All, I have multiple files, each having a number of records consisting of more than 10 columns. Some column values are duplicated and I want to remove these duplicate values from the files. Duplicate values may come in different files... all files lying in a single directory. Need help to... (3 Replies)
Discussion started by: arvindng
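A sketch assuming whole lines are the duplicates and that a line first seen in one file should also be dropped from every later file (the path and the .dedup suffix are illustrative):

Code:
# one pass over all files; the "seen" table is shared across them,
# so cross-file duplicates are removed too
awk '!seen[$0]++ { print > (FILENAME ".dedup") }' /path/to/dir/*

With very many files, the output files may need to be close()d as each input ends to stay under the open-file limit.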

4. Shell Programming and Scripting

Remove duplicate lines

Hi, I have a huge file which is about 50GB. There are many lines. The file format looks like 21 rs885550 0 9887804 C C T C C C C C C C 21 rs210498 0 9928860 0 0 C C 0 0 0 0 0 0 21 rs303304 0 9941889 A A A A A A A A A A 22 rs303304 0 9941890 0 A A A A A A A A A The question is that there are a few... (4 Replies)
Discussion started by: zhshqzyc
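At 50 GB, holding every line in an awk array may not fit in memory; a sketch assuming whole-line duplicates, GNU sort, and that losing the original line order is acceptable:

Code:
# disk-based merge sort; -S caps the RAM used, -T picks a scratch dir with enough space
sort -u -S 2G -T /big/tmp huge_file > huge_file.dedup

If duplicates are really defined by the marker name in column 2, sort -u -k2,2 keeps one line per marker instead.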

5. Shell Programming and Scripting

[uniq + awk?] How to remove duplicate blocks of lines in files?

Hello again, I want to remove all duplicate blocks of XML code in a file. This is an example: input: <string-array name="threeItems"> <item>item1</item> <item>item2</item> <item>item3</item> </string-array> <string-array name="twoItems"> <item>item1</item> <item>item2</item>... (19 Replies)
Discussion started by: raidzero
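A sketch assuming GNU awk (for regex record separators), that the file consists only of these blocks, and that duplicate blocks are byte-identical including indentation:

Code:
# each record is one <string-array>...</string-array> block;
# print a block only the first time its exact text is seen
gawk -v RS='</string-array>[[:space:]]*' 'NF && !seen[$0]++ { printf "%s</string-array>\n", $0 }' input.xml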

6. Shell Programming and Scripting

How do I remove the duplicate lines in this file?

Hey guys, need some help to fix this script. I am trying to remove all the duplicate lines in this file. I wrote the following script, but it does not work. What is the problem? The output file should only contain five lines: Later! (5 Replies)
Discussion started by: Ernst
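The usual order-preserving one-liner for this (sort -u would also dedupe, but it reorders the lines):

Code:
# print each line only the first time it is seen
awk '!seen[$0]++' input.txt > output.txt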

7. UNIX for Dummies Questions & Answers

Remove Duplicate Lines

Hi I need this output. Thanks. Input: TAZ YET FOO FOO VAK TAZ BAR Output: YET VAK BAR (10 Replies)
Discussion started by: tara123
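Note that here every line that occurs more than once is dropped entirely, not just its repeats. A two-pass sketch that preserves the original order (the file is read twice):

Code:
# pass 1 counts occurrences, pass 2 prints lines that occurred exactly once
awk 'NR == FNR { count[$0]++; next } count[$0] == 1' input.txt input.txt

sort input.txt | uniq -u produces the same set, but sorted.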

8. Shell Programming and Scripting

Remove duplicate lines from a file

Hi, I have a csv file which contains some millions of lines. The first line (header) repeats at every 50000th line. I want to remove all the duplicate headers from the second occurrence onward (it should not remove the first line). I don't want to use any pattern from the header as I have some... (7 Replies)
Discussion started by: sudhakar T
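No header pattern is needed if line 1 is simply remembered as the header; a sketch (filenames are illustrative):

Code:
# print line 1, then drop any later line identical to it
awk 'NR == 1 { header = $0; print; next } $0 != header' file.csv > file_clean.csv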

9. Windows & DOS: Issues & Discussions

Remove duplicate lines from text files.

So, I have two text files, "fail.txt" and "color.txt". I now want to use a command line (DOS) to remove ANY line that is PRESENT IN BOTH from each text file. Afterwards there shall be no duplicate lines. (1 Reply)
Discussion started by: pasc
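On Windows, findstr can express this; a sketch assuming literal, whole-line matches, run in both directions and writing to new files so the originals survive until checked:

Code:
:: drop from each file every line that also appears in the other
findstr /v /x /l /g:color.txt fail.txt > fail_clean.txt
findstr /v /x /l /g:fail.txt color.txt > color_clean.txt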

10. Shell Programming and Scripting

How to remove duplicate lines?

Hi All, I am storing the result in the variable result_text using the below code. result_text=$(printf "$result_text\t\n$name") result_text then contains the text below, which has duplicate lines. file and time for the interval 03:30 - 03:45 file and time for the interval 03:30 - 03:45 ... (4 Replies)
Discussion started by: nalu
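The variable can be filtered like a file; a sketch assuming a POSIX shell and that order should be preserved (the NF guard, an assumption, also drops the blank lines left by the tab-newline separators):

Code:
# pipe the variable through awk and capture the deduplicated result
result_text=$(printf '%s\n' "$result_text" | awk 'NF && !seen[$0]++')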
RMM(1)								     [nmh-1.5]								    RMM(1)

NAME
       rmm - remove messages

SYNOPSIS
       rmm [+folder] [msgs] [-unlink | -nounlink] [-version] [-help]

DESCRIPTION
       By default, rmm will remove the specified messages by renaming each of the message files with a site-dependent prefix (usually a comma).
       Such files will then need to be removed in some manner after a certain amount of time.  Many sites arrange for cron to remove these
       files once a day, so check with your system administrator.

       Alternately, if you wish for rmm to really remove the files representing these messages, you can use the -unlink switch.  But messages
       removed by this method cannot be later recovered.

       If you prefer a more sophisticated method of `removing' messages, you can define the rmmproc profile component.  For example, you can
       add a profile component such as

           rmmproc: /home/foouser/bin/rmm_msgs

       then instead of simply renaming the message file, rmm will call the named program or script to handle the files that represent the
       messages to be deleted.

       Some users of csh prefer the following:

           alias rmm 'refile +d'

       where folder `+d' is a folder for deleted messages, and

           alias mexp 'rm `mhpath +d all`'

       is used to "expunge" deleted messages.

       The current message is not changed by rmm, so a next will advance to the next message in the folder as expected.

FILES
       $HOME/.mh_profile    The user profile

PROFILE COMPONENTS
       Path:                To determine the user's nmh directory
       Current-Folder:      To find the default current folder
       rmmproc:             Program to delete the message

SEE ALSO
       refile(1), rmf(1)

DEFAULTS
       `+folder' defaults to the current folder
       `msgs' defaults to cur
       `-nounlink'

CONTEXT
       If a folder is given, it will become the current folder.

BUGS
       Since refile uses your rmmproc to delete the message, the rmmproc must NOT call refile without specifying -normmproc, or you will
       create an infinite loop.

MH.6.8                                                          11 June 2012                                                           RMM(1)