Shell Programming and Scripting: Replace duplicate columns with values from first occurrence
Post 302695533 by elixir_sinari on Monday 3rd of September 2012 05:33:17 AM
Code:
awk -F, '{
    # count how many times this key (columns 3-5) has been seen
    c[$3,$4,$5]++
    # first occurrence: remember columns 1 and 2 for this key
    if (c[$3,$4,$5] == 1) {
        one[$3,$4,$5] = $1
        two[$3,$4,$5] = $2
    }
}
# later occurrences: overwrite columns 1 and 2 with the stored values
c[$3,$4,$5] > 1 { $1 = one[$3,$4,$5]; $2 = two[$3,$4,$5] }
1' OFS=, file
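For illustration, here is what the one-liner does on a small, purely hypothetical input (the original poster's data isn't shown in this post) where the key built from columns 3-5 repeats:

Code:
$ cat file
A1,B1,x,y,z
A2,B2,x,y,z
A3,B3,p,q,r
$ # running the awk command above on this file prints:
A1,B1,x,y,z
A1,B1,x,y,z
A3,B3,p,q,r

The second record shares the (x,y,z) key with the first, so its first two fields are replaced by A1 and B1; the record with a unique key passes through untouched.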

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Search and replace to first occurrence of string

Hi all, I have a very large ;-delimited file. In vi I would like to replace: CSACT_DY;AVG_UEACT1;uesPerActiveLinkSetSize_1;#;A CSACT_DY;AVG_UEACT2;uesPerActiveLinkSetSize_2;#;A CSACT_DY;AVG_UEACT3;uesPerActiveLinkSetSize_3;#;A with: CSACT_DY;AVG_UEACT1;Average... (7 Replies)
Discussion started by: gilmord
7 Replies
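A sketch for the kind of task in discussion 1, using GNU sed rather than vi (the replacement string below is a placeholder, since the thread's text is truncated): the address range 0,/regexp/ ends on the first line that matches, and an empty regex in the s command reuses that same pattern, so the substitution fires only once in the whole file.

Code:
# GNU sed only: change just the first occurrence of the pattern in the file
sed '0,/uesPerActiveLinkSetSize_1/ s//AVG_UEACT1_description/' infile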

2. Shell Programming and Scripting

Replace second occurrence only

HPUX /bin/sh (posix) I have a file as such cat dog mouse deer elk rabbit mouse rat pig I would like to replace the second occurrence of mouse in this file with mouse2. The rest of the file has to stay exactly as is. I'm not sure exactly where mouse might be (could be first,second,third... (5 Replies)
Discussion started by: lyoncc
5 Replies
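For discussion 2, a minimal awk sketch (assuming each word sits on its own line, as in the sample): count the matches and rewrite only the second one; the trailing 1 prints every line, changed or not.

Code:
awk '$0 == "mouse" && ++n == 2 { $0 = "mouse2" } 1' file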

3. Shell Programming and Scripting

Find and replace duplicate column values in a row

I have a file which has 12 columns and values like this 1,2,3,4,5 a,b,c,d,e b,c,a,e,f a,b,e,a,h if you see the first column has duplicate values, I need to identify (print it to console) the duplicate value (which is 'a') and also remove duplicate values like below. I could be in two... (5 Replies)
Discussion started by: nuthalapati
5 Replies
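For discussion 3, one possible sketch: scan each row's fields with an associative array, report a value the second time it shows up, and keep only its first copy (exactly what should happen to the repeats is an assumption, since the excerpt is cut off).

Code:
awk -F, '{
    split("", seen)                 # clear the per-row lookup table
    out = ""
    for (i = 1; i <= NF; i++) {
        if ($i in seen) {
            print "line " NR ": duplicate value " $i > "/dev/stderr"
            continue                # drop the repeated value
        }
        seen[$i] = 1
        out = (out == "" ? $i : out "," $i)
    }
    print out
}' file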

4. UNIX for Dummies Questions & Answers

How to replace particular occurrence of character in between a delimiter?

Hi, I have a file with the following format 1|" "text " around " |" fire "guest"|" " 2| "xyz"" | "no guest"|"3" 3| """ test3""| "one" guest"|"4" My requirement is to replace all occurrences of " to ' which occur between the |" and "| delimiters, so my output should look like this 1|"... (3 Replies)
Discussion started by: H_bansal
3 Replies

5. Programming

awk to count occurrence of strings and loop for multiple columns

Hi all, If I would like to process a file input as below: col1 col2 col3 ...col100 1 A C E A ... 3 D E G A 5 T T A A 6 D C A G how can I perform a for loop to count the occurrences of letters in each column (just like uniq -c) in every column? On top of that, I would also like... (8 Replies)
Discussion started by: iling14
8 Replies
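For discussion 5, a sketch that tallies every value per column in a single pass (assuming a whitespace-separated file with a header row, as in the sample); the output order of an awk for-in loop is unspecified, so pipe it through sort if needed.

Code:
awk 'NR > 1 { for (i = 1; i <= NF; i++) cnt[i, $i]++ }
END {
    for (key in cnt) {
        split(key, k, SUBSEP)
        printf "column %s: %s occurs %d time(s)\n", k[1], k[2], cnt[key]
    }
}' file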

6. Shell Programming and Scripting

Replace values in columns

I have the following input format: AA00000712000 -0.17 0.90 -1.04 -0.37 -1.45 -1.13 -0.22 -0.34 -0.55 2.37 0.67 -0.48 -0.48 AA00000712001 0.15 0.03 0.47 0.62 2.04 1.14 0.29 -0.81 0.85 0.53 1.00 -0.10 -0.48 BB00000712000 1.32 -0.47 0.44 0.00 0.98 ... (4 Replies)
Discussion started by: ncwxpanther
4 Replies

7. Shell Programming and Scripting

Multiple columns replace with null values.

I am trying to replace the particular columns (col3,col5,col20,col44,col55,col56,col59,col60,col61,col62,col74,col75,col88,col90,col91,col93,col94,col95) with empty values Input file Col1,col2,col3,col4,col5------,col100 1,2,3,4,5,---------,100 3,4,5,6,7,---------,300 Output : ... (3 Replies)
Discussion started by: onesuri
3 Replies
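For discussion 7, a sketch that blanks the listed columns while leaving the header line alone (whether the header should also be emptied is an assumption):

Code:
awk 'BEGIN {
    FS = OFS = ","
    n = split("3 5 20 44 55 56 59 60 61 62 74 75 88 90 91 93 94 95", col, " ")
}
NR > 1 { for (i = 1; i <= n; i++) $(col[i]) = "" }
1' file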

8. Shell Programming and Scripting

Find duplicate values in specific column and delete all the duplicate values

Dear folks I have a map file of around 54K lines and some of the values in the second column have the same value and I want to find them and delete all of the same values. I looked over duplicate commands but my case is not to keep one of the duplicate values. I want to remove all of the same... (4 Replies)
Discussion started by: sajmar
4 Replies
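Discussion 8 is a classic two-pass awk job: count the column-2 values on the first pass over the file, then on the second pass keep only the rows whose value occurred exactly once (a sketch, assuming whitespace-separated fields).

Code:
awk 'NR == FNR { cnt[$2]++; next } cnt[$2] == 1' file file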

9. Shell Programming and Scripting

Do replace operation and awk to sum multiple columns if another column has duplicate values

Hi Experts, Please bear with me, I need help. I am learning awk and am stuck on one issue. First point: I want to sum up the values of columns 7, 9, 11, 13 and 15 if the rows in column 5 are duplicates. No action to be taken for rows where the value in column 5 is unique. Second point: For... (12 Replies)
Discussion started by: as7951
12 Replies
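For the first half of discussion 9, a possible two-pass sketch: rows with a unique column-5 value pass through untouched, and columns 7, 9, 11, 13 and 15 are summed per duplicated key (how those sums should be laid out is an assumption; the thread's full requirement continues past this excerpt).

Code:
awk '
NR == FNR { cnt[$5]++; next }       # pass 1: count each column-5 key
cnt[$5] == 1                        # pass 2: unique keys printed as-is
cnt[$5] > 1 {
    s7[$5] += $7; s9[$5] += $9; s11[$5] += $11
    s13[$5] += $13; s15[$5] += $15
}
END { for (k in s7) print k, s7[k], s9[k], s11[k], s13[k], s15[k] }
' file file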

10. Shell Programming and Scripting

Match value in two files and replace values in selected columns

The purpose is to check if the values in columns 3 and 4 of file1 match column 1 of file2. If any values match, do: 1) Replace values in file2 for columns 2 and 3 using the information in file1 columns 5 and 6 2) Replace string ($1,1,5) and string ($1,6,5) in file2 with values of columns 7... (8 Replies)
Discussion started by: jiam912
8 Replies