Top Forums > Shell Programming and Scripting > Help with remove duplicated content
Post 302549237 by durden_tyler on Monday 22nd of August 2011, 10:35:19 AM
Here's a Perl solution -

Code:
$
$
$ cat f9
hcmv-US25-2-3p  hsa-3160-5
hcmv-US33 hsa-47
hcmv-UL70-3p hsa-4508
hcmv-UL70-3p hsa-4486
hcms-US25 hsa-360-5
hcms-US25 hsa-4
hcms-US25 hsa-458
hcms-US25 hsa-44812
$
$
$ perl -lane 'print $F[0] eq $prev ? " " x length($prev) : $F[0], " ", $F[1]; $prev=$F[0]' f9
hcmv-US25-2-3p hsa-3160-5
hcmv-US33 hsa-47
hcmv-UL70-3p hsa-4508
             hsa-4486
hcms-US25 hsa-360-5
          hsa-4
          hsa-458
          hsa-44812
$
$
$

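The one-liner keeps the previous line's first field in $prev; when the current first field matches it, it prints a run of spaces of the same width so the second column stays aligned, otherwise it prints the field itself. For anyone who prefers awk, a rough equivalent sketch:

Code:
# same idea: pad with spaces as wide as the repeated first field
awk '{ first = ($1 == prev ? sprintf("%" length(prev) "s", "") : $1); print first, $2; prev = $1 }' f9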
tyler_durden

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

remove duplicated xml record in a file under unix

Hi, if I have a file in XML format, I would like to remove duplicated records and save the result to a new file. Is it possible...to write a script to do it? (8 Replies)
Discussion started by: happyv
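For a question like #1, assuming each XML record sits on a single line, the classic seen-array dedupe applies; for multi-line records, gawk's regex record separator can work (a sketch with a hypothetical </record> closing tag, not taken from the thread):

Code:
# assuming one record per line:
awk '!seen[$0]++' input.xml > newfile.xml

# assuming multi-line records ending in a (hypothetical) </record> tag; needs gawk:
gawk 'BEGIN { RS = "</record>\n?"; ORS = "" } NF && !seen[$0]++ { print $0 "</record>\n" }' input.xml > newfile.xml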

2. Shell Programming and Scripting

remove duplicated lines without sort

Hi, just wondering whether or not I can remove duplicated lines without sort. For example, I use the command who, which shows users who are logged on. In some cases, it shows duplicated lines for users who are logged on at more than one terminal. Normally, I would do who | cut -d" " -f1 |... (6 Replies)
Discussion started by: lalelle
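For #2, the standard order-preserving dedupe without sort is the awk seen-array idiom; applied to the who example (a sketch):

Code:
# keep the first occurrence of every line, preserving order
who | awk '!seen[$0]++'

# or dedupe on the user name only (field 1)
who | awk '!seen[$1]++ { print $1 }'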

3. Shell Programming and Scripting

remove duplicated columns

Hi all, I have a file containing multiple columns, sorted by col2 and col3. I want to remove a line as duplicated if its col2 and col3 are the same as in another line. Example fileA: AA BB CC DD CC XX CC DD BB CC ZZ FF DD FF HH HH; the output is AA BB CC DD BB CC ZZ FF... (6 Replies)
Discussion started by: kamel.seg
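For #3, assuming whitespace-separated fields and that a line counts as duplicated when its col2/col3 pair has appeared before, a sketch:

Code:
# keep a line only the first time its (col2, col3) pair is seen
awk '!seen[$2, $3]++' fileA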

4. UNIX for Dummies Questions & Answers

How to remove duplicated based on longest row & largest value in a column

Hi, I have a file with data as shown below. I need to remove duplicate rows in such a way that only columns 2,3,4,5 are checked for duplicates. When deleting duplicates, the largest row, i.e. the one with the most columns holding values, should be retained. Then it must remove duplicates such that by... (11 Replies)
Discussion started by: reva
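For #4, one hedged approach is to sort the rows longest-first and then keep the first row seen for each key; this sketch assumes whitespace-separated fields, that columns 2-5 form the key, and that "largest row" means the row with the most fields:

Code:
# prefix each row with its field count, sort longest-first, dedupe on the
# original columns 2-5 (shifted to 3-6 by the prefix), then drop the prefix
awk '{ print NF, $0 }' file | sort -k1,1nr | awk '!seen[$3, $4, $5, $6]++' | cut -d' ' -f2-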

5. Shell Programming and Scripting

Remove rows with first 4 fields duplicated in awk

Hi, I am trying to use awk to remove all rows where the first 4 fields are duplicates, e.g. in the following data lines 6-9 would be removed, leaving one copy of the duplicated row (row 5): Borgarhraun FH9822 ol24 FH9822_ol24_m20 ol Deformed c Borgarhraun FH9822 ol24 ... (3 Replies)
Discussion started by: tomahawk
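For #5, the seen-array idiom keyed on the first four fields does exactly this (a sketch, assuming whitespace-separated data):

Code:
# keep a row only the first time its first four fields are seen
awk '!seen[$1, $2, $3, $4]++' file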

6. Shell Programming and Scripting

How to remove duplicated lines?

Hi, if I have a file like this: Query=1 a a b c c c d Query=2 b b b c c e . . . (7 Replies)
Discussion started by: the_simpsons
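For #6, assuming duplicates should only be removed within each Query=... block, the seen array can be emptied whenever a new block starts (a sketch):

Code:
# split("", seen) empties the array portably; each Query line starts a fresh block
awk '/^Query=/ { split("", seen); print; next } !seen[$0]++' file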

7. Shell Programming and Scripting

Merge files and remove duplicated rows

Several times daily I'll receive new files in a folder that I want to combine into one big file, without any duplicate rows. The file names in the folder will look like e.g.: MissingData_2014-08-25_09-30-18.txt MissingData_2014-08-25_09-30-14.txt MissingData_2014-08-26_09-30-12.txt The content... (9 Replies)
Discussion started by: Bergans
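For #7, assuming every file matches the MissingData_*.txt pattern and keeping the first occurrence of each row is acceptable, a sketch:

Code:
# concatenate every file and keep the first copy of each row, preserving order
cat MissingData_*.txt | awk '!seen[$0]++' > combined.txt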

8. Shell Programming and Scripting

How to remove duplicated column in a text file?

Dear all, how can I remove a duplicated column in a text file? Input: LG10_PM_map_19_LEnd 1000560 G AA AA AA AA AA GG LG10_PM_map_19_LEnd 1005621 G GG GG GG AA AA GG LG10_PM_map_19_LEnd 1011214 A AA AA AA AA GG GG LG10_PM_map_19_LEnd 1011673 T TT TT TT TT CC CC... (1 Reply)
Discussion started by: huiyee1
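For #8, if "duplicated column" means a whole column whose contents match an earlier column on every row, a sketch (assumes a uniform column count and a file small enough to hold in memory):

Code:
awk '{
    for (i = 1; i <= NF; i++) col[i] = col[i] SUBSEP $i   # fingerprint each column
    line[NR] = $0
}
END {
    for (i = 1; i <= NF; i++) keep[i] = !dup[col[i]]++    # keep the first column with each fingerprint
    for (r = 1; r <= NR; r++) {
        n = split(line[r], f)
        out = ""
        for (i = 1; i <= n; i++) if (keep[i]) out = out (out ? " " : "") f[i]
        print out
    }
}' input.txt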

9. AIX

Remove duplicated bootlist entries

Hello. I have a server with 2 boot disks, but in the bootlist there are 5 paths for one disk and no path for the other. How can I remove paths from one disk and insert paths for the other disk? Thanks in advance. server074:root:/# bootlist -om normal hdisk0 blv=hd5 pathid=0 hdisk0... (7 Replies)
Discussion started by: Gabriander
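For #9, since bootlist rewrites the entire list for a given mode, a sketch (assuming hdisk0 and hdisk1 are the two boot disks) is to set both and verify:

Code:
bootlist -m normal hdisk0 hdisk1    # replace the duplicated entries with one entry per disk
bootlist -om normal                 # verify the new boot list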

10. Shell Programming and Scripting

Remove duplicated records and update last line record counts

Hi gurus, I need to remove duplicate lines in a file and update the TRAILER (last line) record count. The file is comma-delimited; field 2 is the key identifying a duplicated record. I can use the command below to remove duplicates, but I don't know how to replace the last line's 2nd field with the new count. awk -F","... (11 Replies)
Discussion started by: green_k
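For #10, a two-pass awk sketch, assuming the trailer is the last line and its 2nd field carries the record count:

Code:
awk -F',' -v OFS=',' '
    NR == FNR { last = FNR; next }            # pass 1: find the last line number
    FNR == last { $2 = count; print; next }   # pass 2, trailer: write the new count
    !seen[$2]++ { count++; print }            # pass 2, data: keep first occurrence of each key
' file file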