UNIX for Advanced & Expert Users - Remove duplicates in flat file
Post 302879757 by jethrow on Sunday 15th of December 2013 01:16:57 AM
Code:
awk -F'|' '{k=$1"|"$4"|"$6"|"$8} !(k in keys) {print; keys[k]++}' file
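
This keeps only the first line seen for each combination of pipe-separated fields 1, 4, 6 and 8; any later line whose four key fields match an earlier one is dropped. A quick illustration with made-up data (the sample values below are purely hypothetical):

Code:
$ cat file
a|x|1|p|9|q|0|r
a|y|2|p|8|q|7|r
b|x|1|p|9|q|0|r
$ awk -F'|' '{k=$1"|"$4"|"$6"|"$8} !(k in keys) {print; keys[k]++}' file
a|x|1|p|9|q|0|r
b|x|1|p|9|q|0|r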

 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Remove duplicates from a file based on a specific location

How can I remove duplicate lines from a file? For example: sample123456Sample testing123456testing XXXXX131323XXXXX YYYYY423432YYYYY fsdfdsf123456gsdfdsd. All duplicates based on columns 6-12 must be deleted. I want to keep the first row; if the same value appears in the given range I want to... (1 Reply)
Discussion started by: gopikgunda
1 Reply
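A minimal sketch for that one, assuming the duplicate key is character positions 6-12 of each line and the first occurrence should be kept (the file name is just a placeholder):

Code:
# keep the first line seen for each value in character positions 6-12
awk '!seen[substr($0, 6, 7)]++' file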

2. Shell Programming and Scripting

Remove duplicates within a block in a file - help required

Hi, I have a file in the following format: name-a age-12 address-123 age-12 phone-22222 ============ name-ab age-11 address-123 age-11 phone-222223 ============= name-abc age-12 address-1234 age-12 phone-2222223 ============= (2 Replies)
Discussion started by: nipun_garg
2 Replies
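A sketch for that layout, assuming the records are separated by lines of = signs and a duplicate means an identical line repeated inside the same record:

Code:
# reset the seen-lines array at every separator, print each line once per block
awk '/^=+$/ {for (k in seen) delete seen[k]; print; next}
     !seen[$0]++' file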

3. UNIX for Dummies Questions & Answers

How to remove the first line from a flat file?

Hi, I want to remove the first line from a flat file using a Unix command, as simply as possible. Can anybody give me a hand? Thanks in advance. xli (21 Replies)
Discussion started by: xli
21 Replies
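Two common ways to do that, writing to a new file (the file names are just placeholders):

Code:
# drop line 1, keep everything else
sed '1d' file > file.new
# or, equivalently
tail -n +2 file > file.new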

4. Shell Programming and Scripting

Remove duplicates from end of file

i/p: A B C A C; o/p: B A C. From the input file it should remove duplicates from the end without changing the order. (5 Replies)
Discussion started by: lavnayas
5 Replies
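A sketch that keeps the last occurrence of each line while preserving order, reading the file twice: the first pass records where each line last appears, the second pass prints only those lines (the file name is a placeholder):

Code:
awk 'NR==FNR {last[$0]=FNR; next} FNR==last[$0]' file file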

5. Shell Programming and Scripting

Remove duplicates from a file

Hi, I need to remove duplicates from a file. The file will look like this: 0003 10101 20100120 abcdefghi 0003 10101 20100121 abcdefghi 0003 10101 20100122 abcdefghi 0003 10102 20100120 abcdefghi 0003 10103 20100120 abcdefghi 0003 10103 20100121 abcdefghi Here if the first column and... (6 Replies)
Discussion started by: gpaulose
6 Replies
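The question is cut off, but if the duplicate key is the first two space-separated columns (an assumption here) and the first occurrence should win, something along these lines would do it:

Code:
# keep the first line seen for each (column 1, column 2) pair
awk '!seen[$1, $2]++' file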

6. Shell Programming and Scripting

How to remove duplicates from the .dat file

All, I have a file, 1181CUSTOMER-L061411_003500.dat.Z, that has duplicate records in it. bash-2.05$ zcat 1181CUSTOMER-L061411_003500.dat.Z | grep "90876251S" 90876251S|ABG, AN ADAYANA COMPANY|3550 DEPAUW BLVD|||US|IN|INDIANAPOLIS||DAL|46268||||||GEN|||||||USD|||ABG, AN ADAYANA... (3 Replies)
Discussion started by: Oracle_User
3 Replies
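A sketch for that case, assuming a duplicate means an identical full record (the output file name is made up):

Code:
# decompress, keep the first copy of each line, write a plain .dat
zcat 1181CUSTOMER-L061411_003500.dat.Z | awk '!seen[$0]++' > deduped.dat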

7. UNIX for Dummies Questions & Answers

How to remove numeric characters in the flat file

Hi, can anyone help me please? I have a flat file like: qwer123rt ass3242ccf jjk654 kjh838ppp nhdg453ok hdkk34. I want to remove the numeric characters in the flat file, so the output looks like this: qwerrt assccf jjk kjhppp nhdgok hdkk. Help me... (4 Replies)
Discussion started by: rafimd1985
4 Replies
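Stripping digits is a one-liner with sed or tr (the file names are placeholders):

Code:
sed 's/[0-9]//g' file > file.new
# or
tr -d '0-9' < file > file.new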

8. UNIX for Dummies Questions & Answers

Remove duplicates from a file

Can you tell me how to remove duplicate records from a file? (11 Replies)
Discussion started by: saga20
11 Replies
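For the general case there are two usual answers, depending on whether the original line order matters (the file name is a placeholder):

Code:
# keeps the first occurrence of each line, preserving order
awk '!seen[$0]++' file
# sorts the file and drops repeated lines
sort -u file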

9. UNIX for Dummies Questions & Answers

Remove duplicates and keep them in a separate file

Hi, I have a tab-separated file and I want to remove all the rows that have duplicates. The duplicates I need to check are in column 13. I have tried to use awk, but I have no idea how to keep the duplicates in a separate file. awk 'FNR==NR{a[$13]++;next}(a[$13]>1)' tomodify.txt tomodify.txt > new.txt ... (4 Replies)
Discussion started by: flacchy
4 Replies
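One way to split the rows on the column-13 key, as a sketch: tab-separated input is assumed, and the output names dups.txt and unique.txt are made up:

Code:
# first pass counts each column-13 value, second pass routes rows by that count
awk -F'\t' 'FNR==NR {cnt[$13]++; next}
            cnt[$13] > 1 {print > "dups.txt"; next}
            {print > "unique.txt"}' tomodify.txt tomodify.txt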

10. Shell Programming and Scripting

To remove duplicates from a pipe-delimited file

Hi, can someone please help me remove duplicates from a pipe-delimited file based on the first two columns? 123|asdf|sfsd|qwrer 431|yui|qwer|opws 123|asdf|pol|njio Here my first record and last record are duplicates. As per my requirement I want all the latest records in one file. I want the... (12 Replies)
Discussion started by: ginrkf
12 Replies
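Since the latest record per key should win, a two-pass sketch works here, assuming "latest" means the last occurrence in the file (the file name is a placeholder):

Code:
# first pass remembers where each (field 1, field 2) key last appears,
# second pass prints only those lines
awk -F'|' 'NR==FNR {last[$1, $2]=FNR; next} FNR==last[$1, $2]' file file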