Shell Programming and Scripting

Duplicate line removal matching some columns only
Post 302782965 by RudiC on Tuesday 19th of March 2013 04:37:30 PM
Untested, just dreamed up:
Code:
awk -F, -v OFS=, '{$30 = length($30 $31) OFS $30} 1' file | sort -u |
awk -F, -v OFS=, '{for (i = 30; i < NF; i++) $i = $(i + 1); NF--} 1'
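
If the requirement is simply to keep the first line seen for each distinct pair of key columns (columns 30 and 31 are assumed here, mirroring the fields the one-liner touches), plain awk can do it in one pass, without the sort round-trip; a minimal sketch:
Code:
# Keep only the first line for each distinct ($30,$31) combination.
# seen[] counts occurrences of the key, so the pattern !seen[...]++
# is true (and the line printed) only on the key's first occurrence.
awk -F, '!seen[$30 FS $31]++' file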

 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Sort, duplicate removal - Query

Hi All, I have a problem with the sort and duplicate filter command I am using in one of my scripts. I have a '|' delimited file and want to sort and remove duplicates on fields 1, 2 and 15. These fields constitute the primary key of the table I will be loading the data into. But I see that some... (4 Replies)
Discussion started by: novice1324
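For what this teaser describes, sort can key the uniqueness test on just those fields; a sketch, with file.dat as a stand-in name:
Code:
# Sort on fields 1, 2 and 15 only; with -u, sort compares just the
# -k key fields, so lines that differ elsewhere but share the key
# count as duplicates and all but one are dropped.
sort -t'|' -k1,1 -k2,2 -k15,15 -u file.dat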

2. UNIX for Dummies Questions & Answers

exclude columns with a matching line pattern

Hi, I have 5 columns total and want to search lines in columns 3-5, essentially grep -v-ing patterns that match 'BBB_0123', 'BVG_0895' or 'BSD_0987'. Does anyone know how to do this? I tried combining grep -v with grep -e, but it didn't work. Thanks! (5 Replies)
Discussion started by: greptastic
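grep matches anywhere on the line, so restricting the test to columns 3-5 is easier in awk; a sketch assuming whitespace-separated columns and a hypothetical file.txt:
Code:
# Drop any line where one of fields 3-5 contains one of the IDs;
# print everything else (grep -v restricted to three columns).
awk '{ for (i = 3; i <= 5; i++) if ($i ~ /BBB_0123|BVG_0895|BSD_0987/) next; print }' file.txt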

3. Shell Programming and Scripting

Removal of Duplicate Entries from the file

I have a file which consists of 1000 entries. Out of 1000 entries I have 500 duplicate entries. I want to remove the first duplicate entry (i.e., the entire line) in the file. An example of the file is shown below: 8244100010143276|MARISOL CARO||MORALES|HSD768|CARR 430 KM 1.7 ... (1 Reply)
Discussion started by: ravi_rn
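Removing the first copy while keeping the last amounts to keeping only each line's final occurrence; a two-pass awk sketch (the file is read twice):
Code:
# Pass 1 (NR==FNR): count how many times each whole line occurs.
# Pass 2: decrement the count per occurrence; print only when it
# hits zero, i.e. at the line's last occurrence.
awk 'NR == FNR { count[$0]++; next } --count[$0] == 0' file file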

4. Shell Programming and Scripting

using command line arguments as columns for pattern matching using awk

Hi, I wish to use a column, as entered by the user on the command line, for pattern matching. awk file: { if($1 ~ /^8/) { print $0 > "temp2.csv" } } something like this, but I want '$1' to be any column selected by the user on the command line. ... (1 Reply)
Discussion started by: invinclible0009
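awk can take the field number as a variable, so a wrapper script can pass its first argument straight through; a sketch (colfilter.sh and input.csv are hypothetical names):
Code:
#!/bin/sh
# Usage: ./colfilter.sh <column-number>
# -v col="$1" hands the shell argument to awk; $col then means
# "field number col", so the tested column is chosen at run time.
awk -v col="$1" '$col ~ /^8/ { print $0 > "temp2.csv" }' input.csv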

5. Shell Programming and Scripting

Column Search and Line Removal

Hello Gurus, I need to remove lines within a file if they contain specific criteria. Here is what I am trying to resolve:
Users of AppRuntime: (Total of 10 licenses issued; Total of 6 licenses in use)
buih02 dsktp501 AppGui 1 (compute_lic/27006 3122), start Mon 2/22 7:58
dingj1... (3 Replies)
Discussion started by: leepet01
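The actual criteria are truncated in the teaser, but dropping lines that match a known pattern is a one-liner; a sketch with placeholder pattern and file name:
Code:
# Print every line except those containing the unwanted entry;
# 'buih02' and report.txt are stand-ins for the real criteria/file.
grep -v 'buih02' report.txt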

6. Shell Programming and Scripting

Remove duplicate lines (the first matching line by field criteria)

Hello to all, I have this file
2002 1 23  0 0 2435.60 131.70 5.60 20.99 0.89 0.00 285.80 2303.90
2002 1 23 15 0 2436.60 132.90 6.45 21.19 1.03 0.00 285.80 2303.70
2002 1 23 ... (6 Replies)
Discussion started by: joggdial3000
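Judging by the sample, the leading date/time fields look like the dedup key; the same first-occurrence idiom as in the sketch near the top applies, keyed here on the first four fields (an assumption, since the teaser is truncated):
Code:
# Keep the first line for each date/hour key; the comma-separated
# subscripts ($1,$2,$3,$4) are joined with SUBSEP to form the key.
awk '!seen[$1,$2,$3,$4]++' data.txt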

7. UNIX for Advanced & Expert Users

Duplicate removal

I have an input file of 5 GB which contains duplicate records, and I have to remove duplicates by retaining the first instance of each record. The duplicates have to be identified based on 5 fields. Kindly help me in writing a Unix script. Thanks, Asim (11 Replies)
Discussion started by: duplicate
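At 5 GB the in-memory awk idiom may still work (memory grows with the number of unique keys), while sort(1) spills to disk; note that sort -u keeps an arbitrary line per key rather than the first instance, so there is a trade-off. A sketch assuming fields 1-5 are the key and a '|' delimiter (both assumptions):
Code:
# Disk-backed dedup on the first five fields; which duplicate
# survives is unspecified, unlike the order-preserving awk idiom.
sort -t'|' -k1,5 -u input.dat > deduped.dat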

8. Shell Programming and Scripting

awk to copy previous line matching a particular columns

Hello, help:
2356798   7689867  999    000
123678   20385907  9797   666
17978975 87468976  968978 98798
I am trying to produce output that looks for the third-column value 9797 and then inserts a line after it, with the first and second column values exactly as in the previous line and the third replaced... (3 Replies)
Discussion started by: Indra2011
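The replacement value is truncated above, so NEWVAL below is only a placeholder; the mechanics of emitting an extra line after a match are this simple:
Code:
# Print every line; after any line whose third field is 9797,
# emit a second line copying fields 1-2 with a replacement field 3.
awk '{ print } $3 == "9797" { print $1, $2, "NEWVAL" }' file.txt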

9. Shell Programming and Scripting

Honey, I broke awk! (duplicate line removal in 30M line 3.7GB csv file)

I have a script that builds a database: a ~30-million-line, ~3.7 GB .csv file. After multiple optimizations it takes about 62 min to bring in and parse all the files, and it used to take 10 min to remove duplicates until I was asked to add another column. I am using the highly optimized awk code: awk... (34 Replies)
Discussion started by: Michael Stora
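When the awk array no longer fits comfortably in memory, a disk-backed external sort is the usual fallback; a sketch assuming GNU sort and that the first four columns form the key (both assumptions):
Code:
# LC_ALL=C gives cheap byte-wise comparison; -S and --parallel
# (GNU sort) size the in-memory buffer and the worker threads.
LC_ALL=C sort -t, -k1,4 -u -S 2G --parallel=4 huge.csv > huge.dedup.csv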

10. UNIX for Dummies Questions & Answers

Print only the duplicate line only with matching columns

Hi there, I have an I/P which looks like:
1 2 3 4 5
1 2 3 4 6
4 7 8 9 9
5 6 7 8 9
I would like the O/P to be:
1 2 3 4 5
1 2 3 4 6
So, printing only the consecutive lines where $1, $2, $3 and $4 match. Is there any command to do this, or a small awk script? Thanks, (12 Replies)
Discussion started by: Indra2011
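One way is to track the previous line and whether the current run of matching keys has already been printed; a sketch, verified against the sample above:
Code:
# Compare each line's first-four-field key with the previous one;
# on a match print the held previous line once, then every further
# member of the run. 'printed' resets when the keys stop matching.
awk '{
    key = $1 FS $2 FS $3 FS $4
    if (key == prevkey) {
        if (!printed) print prevline
        print
        printed = 1
    } else
        printed = 0
    prevkey = key
    prevline = $0
}' file.txt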
pamdeinterlace(1)                General Commands Manual                pamdeinterlace(1)

NAME
       pamdeinterlace - remove every other row from a PAM/PNM image

SYNOPSIS
       pamdeinterlace [-takeodd] [-takeeven] [infile]

       You can use the minimum unique abbreviation of the options. You can use two
       hyphens instead of one. You can separate an option name from its value with
       white space instead of an equals sign.

DESCRIPTION
       pamdeinterlace removes all the even-numbered or odd-numbered rows from the
       input PNM or PAM image. Specify which with the -takeeven and -takeodd options.

       This can be useful if the image is a video capture from an interlaced video
       source. In that case, each row shows the subject 1/60 second before or after
       the two rows that surround it. If the subject is moving, this can detract
       from the quality of the image.

       Because the resulting image is half the height of the input image, you will
       then want to use pamstretch or pnmscale to restore it to its normal height:

              pamdeinterlace myimage.ppm | pamstretch -yscale=2 >newimage.ppm

OPTIONS
       -takeodd
              Take the odd-numbered rows from the input and put them in the output.
              The rows are numbered starting at zero, so the first row in the output
              is the second row from the input.

              You cannot specify both -takeeven and -takeodd.

       -takeeven
              Take the even-numbered rows from the input and put them in the output.
              The rows are numbered starting at zero, so the first row in the output
              is the first row from the input. This is the default.

              You cannot specify both -takeeven and -takeodd.

SEE ALSO
       pamstretch(1), pnmscale(1)

                                 11 November 2001                 pamdeinterlace(1)