UNIX for Dummies Questions & Answers: Remove Duplicate Two Line Pairs?
Post 302772488 by bakere19, 02-25-2013 10:31 PM
Oh, this does work! I made a mistake entering it. Thanks so much!

---------- Post updated at 10:31 PM ---------- Previous update was at 10:28 PM ----------

It seems that this removes some duplicates but not all. I still have entries with the same GI and same sequence.
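
For the record, a minimal sketch of one way to de-duplicate two-line records, assuming each record is a GI header line followed by its sequence line and that a duplicate means both lines match exactly (the file name is hypothetical):

    awk 'NR % 2 { hdr = $0; next }                  # odd lines: remember the GI header
         !seen[hdr ORS $0]++ { print hdr; print }   # even lines: print the pair once
        ' sequences.txt > dedup.txt

If duplicates should be detected by GI header alone rather than header plus sequence, key the seen array on hdr instead of hdr ORS $0.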
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Remove Duplicate line

Hi, I have a scenario where I have created a flatfile with the below-mentioned information. As you can see, the file is displayed in three columns: 1st column is FileNameString, 2nd column is Report_Name (this has spaces), 3rd column is Flag. The result file needed is removal of duplicate... (1 Reply)
Discussion started by: Student37
1 Reply
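
The post is truncated, but assuming duplicates here are lines that match in their entirety, the classic awk idiom is a likely starting point (flatfile is a hypothetical name):

    awk '!seen[$0]++' flatfile > deduped   # print each distinct line the first time it appears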

2. UNIX for Dummies Questions & Answers

Remove duplicate entry in one line

Can anyone help me print only the unique entries in a line? MI_AP MI_AP MI_CM MI_MF RC_NAP MBS_AP SF_RAN MBS_AP NT_CAR so that the output is the unique entries per line: MI_AP MI_CM MI_MF RC_NAP MBS_AP SF_RAN NT_CAR. I can't find the same situation in the knowledge... (5 Replies)
Discussion started by: kharen11
5 Replies
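
A sketch that keeps the first occurrence of each entry on a line, preserving order:

    awk '{ split("", seen)                   # reset the per-line lookup table
           out = ""
           for (i = 1; i <= NF; i++)
               if (!seen[$i]++) out = out (out ? " " : "") $i
           print out }' file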

3. Shell Programming and Scripting

remove duplicate words in a line

Hi, please help! I have a file having duplicate words in some lines, and I want to remove the duplicate words. The order of the words in the output file doesn't matter. INPUT_FILE pink_kite red_pen ball pink_kite ball yellow_flower white no white no cloud nine_pen pink cloud pink nine_pen... (6 Replies)
Discussion started by: sam_2921
6 Replies
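
Since the poster says word order in the output doesn't matter, a sort-based per-line sketch also works (INPUT_FILE as named in the post; the unquoted $line is deliberate, to split the line into words):

    while read -r line; do
        printf '%s\n' $line | sort -u | xargs   # one word per line, dedupe, rejoin
    done < INPUT_FILE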

4. Shell Programming and Scripting

Remove duplicate line from a file

I have a file a.txt having content like deepak ram sham deepram sita kumar (one name per line). I want to delete the first line containing "deep"... I tried using grep -i 'deep' a.txt. It gives me 2 rows; I want to delete the first one, and I need to know the command to delete the line from... (5 Replies)
Discussion started by: saluja.deepak
5 Replies
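
A sketch that deletes only the first line matching the pattern, case-insensitively to match the poster's grep -i:

    awk '!done && tolower($0) ~ /deep/ { done = 1; next }   # skip the first match only
         { print }' a.txt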

5. Shell Programming and Scripting

Remove duplicate line on condition

Hi, I've been scratching my head over this for some time with no solution. I have a file like this: 1 bla bla 1 2 bla bla 2 4 bla bla 3 5 bla bla 1 6 bla bla 1 I want to remove consecutive occurrences of lines like "bla bla 1", but the first column may be different. Any ideas?? (23 Replies)
Discussion started by: jamie_123
23 Replies
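
One way to drop consecutive repeats while ignoring the first column, a sketch assuming whitespace-separated fields:

    awk '{ key = $0; sub(/^[^ \t]+[ \t]+/, "", key)   # strip the first column
           if (key != prev) print                      # keep only when the rest changed
           prev = key }' file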

6. Shell Programming and Scripting

Remove duplicate entries from the same line

Hello, I have a file which has several duplicate entries on the same line: File ID source 1 GM GF GM 2 GM GF GM GF GM GF GM GF GM GF 3 GM GF GM SF GM GF GM SF 4 FF FF FF FF 5 FF GM FF ... (2 Replies)
Discussion started by: nans
2 Replies
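
A sketch that keeps the ID column intact and removes repeated source entries on each line (the NR == 1 branch assumes the "ID source" header should pass through untouched):

    awk 'NR == 1 { print; next }                 # keep the header line as-is
         { split("", seen)                       # reset per line
           printf "%s", $1                       # the ID column
           for (i = 2; i <= NF; i++)
               if (!seen[$i]++) printf " %s", $i
           print "" }' file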

7. Shell Programming and Scripting

Remove duplicate line starting with a pattern

Hi, I have the below input file: /* ----------------- cmdsDlyStartFWJ -----------------*/ UNIX_JOB CMDS065J RUN ANY CMDNAME sleep 5 AGENT CMDSHP USER proddata RUN MON,TUE,WED,THU,FRI DELAYSUB 02:00 /* "Triggers daily file watcher jobs" */ ENVAR... (5 Replies)
Discussion started by: varun22486
5 Replies
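
The post is truncated, but if the goal is to keep only the first of duplicate lines beginning with a given keyword, a sketch (ENVAR is a guess taken from the sample; swap in the real pattern):

    awk '!/^ENVAR/ || !seen[$0]++' jobfile   # pass other lines through; dedupe ENVAR lines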

8. Shell Programming and Scripting

Remove lines with duplicate pairs where AB is equal to BA

I have a file with four columns like dmn10003t1 PF00001 PF00022 dmn12390t1 dmn10008t1 PF00069 PF00027 dmn9781t1 dmn10008t1 PF00068 PF00027 dmn9781t1 dmn10008t1 PF00069 PF00069 dmn9781t1 dmn12390t1 PF00069 PF00076 dmn10003t1 I want to create a new file by comparing the repeated word pairs... (2 Replies)
Discussion started by: sammy777
2 Replies
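
A sketch treating the pair in columns 2 and 3 as unordered, so that PF00022/PF00001 and PF00001/PF00022 reduce to the same key:

    awk '{ key = ($2 < $3) ? $2 SUBSEP $3 : $3 SUBSEP $2 }   # canonical order for the pair
         !seen[key]++' file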

9. Shell Programming and Scripting

Remove sections based on duplicate first line

Hi, I have a file with many sections in it. Each section is separated by a blank line. The first line of each section determines whether the section is a duplicate or not. If the section is a duplicate, then remove the entire section from the file. Below is an example of input and output.... (5 Replies)
Discussion started by: ahmedwaseem2000
5 Replies
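
awk's paragraph mode (RS set to the empty string) treats blank-line-separated sections as single records, which fits this problem; a sketch keyed on each section's first line:

    awk -v RS= -v ORS='\n\n' '{ split($0, line, "\n")     # line[1] is the section header
                                if (!seen[line[1]]++) print }' file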

10. UNIX for Dummies Questions & Answers

Using awk to remove duplicate line if field is empty

Hi all, I've got a file that has 12 fields. I've merged 2 files and there will be some duplicates in the following: FILE: 1. ABC, 12345, TEST1, BILLING, GV, 20/10/2012, C, 8, 100, AA, TT, 100 2. ABC, 12345, TEST1, BILLING, GV, 20/10/2012, C, 8, 100, AA, TT, (EMPTY) 3. CDC, 54321, TEST3,... (4 Replies)
Discussion started by: tugar
4 Replies
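
A two-pass sketch: the first pass remembers which 11-field keys have a populated 12th field, the second drops the empty-field copies. The separator and file name are assumptions, "(EMPTY)" in the sample is read as a genuinely empty field, and the leading "1.", "2." in the sample are taken to be list enumeration rather than data:

    awk -F', *' '
        { key = $1; for (i = 2; i <= 11; i++) key = key "," $i }  # key = first 11 fields
        NR == FNR { if ($12 != "") full[key] = 1; next }          # pass 1: keys with a value
        !($12 == "" && key in full)                               # pass 2: drop empty dups
    ' merged.csv merged.csv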