Full Discussion: duplicate a column with awk
Post 302590807 by LMHmedchem on Tuesday 17th of January 2012, 03:59 PM (Shell Programming and Scripting)
Quote:
Originally Posted by Corona688
The output separator is controlled by the special variable OFS.

Perhaps you can cheat a little, putting two values in one column:

Code:
awk 'BEGIN { FS="\t"; OFS="\t" } { $2=$2 "\t" $2 } 1'

Thanks, that worked fine.
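For reference, here is roughly what that one-liner does on a small tab-delimited file (a minimal sketch; the file name input.txt and the sample data are made up, not from the thread):

Code:
$ cat input.txt                  # hypothetical tab-separated sample
id      val     note
A1      42      foo
$ awk 'BEGIN { FS="\t"; OFS="\t" } { $2=$2 "\t" $2 } 1' input.txt
id      val     val     note
A1      42      42      foo

Because the duplicate is embedded in $2 with a literal tab, awk still sees three fields internally (the "cheat"), but the printed line has four tab-separated columns.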

Another quick question: is there an easy way to prepend something like "MC_" to every field on the first line, like this?

Code:
fieldName1    fieldName2    fieldName3

MC_fieldName1    MC_fieldName2    MC_fieldName3

LMHmedchem
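One way to do that, as a minimal sketch assuming the same tab-delimited layout as above (this is not the answer given in the thread; input.txt is a hypothetical file name):

Code:
# prefix every field of the first line with MC_, pass all other lines through
awk 'BEGIN { FS=OFS="\t" } NR==1 { for (i = 1; i <= NF; i++) $i = "MC_" $i } 1' input.txt

Assigning to the fields on NR==1 makes awk rebuild that record with OFS, so the header becomes MC_fieldName1, MC_fieldName2, MC_fieldName3 while the data lines print unchanged.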

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

awk: duplicate a column into a new one

Hi ! I have a "|" delimited file: field 1|field2|field3|field4 AAA|BBB|CCC|DDD EEE|FFF|GGG|HHH Using awk, I need to duplicate the 2nd column and print it into a 5th new column, like that: output: field 1|field2|field3|field4|field 2 AAA|BBB|CCC|DDD|BBB EEE|FFF|GGG|HHH|FFF Thanks... (1 Reply)
Discussion started by: lucasvs
1 Replies
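For that kind of request, a minimal sketch (assuming a "|" delimited file with four fields, as in the preview above; not taken from that thread):

Code:
# append a copy of field 2 as a new last field
awk 'BEGIN { FS=OFS="|" } { print $0, $2 }' file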

2. Shell Programming and Scripting

Help with duplicate column 1 data

Input file Q6GZV8 AY548484>AAT09676.1>YP_031595.1>2947737>CLSP2512393 P0C9E9 AY261366 P0C9K3 AY261361>IPR004848>PF01639 P0C9I4 AY261363>IPR004848 Desired output file Q6GZV8 AY548484 Q6GZV8 AAT09676.1 Q6GZV8 YP_031595.1 Q6GZV8 2947737 Q6GZV8 CLSP2512393 P0C9E9 AY261366... (3 Replies)
Discussion started by: perl_beginner
3 Replies
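A minimal sketch of one way to produce that kind of output, assuming whitespace between the ID and the ">"-separated list (not taken from that thread):

Code:
# print the first field once for each ">"-separated item in the second field
awk '{ n = split($2, a, ">"); for (i = 1; i <= n; i++) print $1, a[i] }' file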

3. Shell Programming and Scripting

Duplicate third column to every line

Dear All, I have file input like this: INP901 5173 4114 INP902 5227 INP903 5284 INP904 5346 INP905 5400 INP906 5456 INP907 5511 INP908 5572 INP909 5622 INP910 5678 INP911 5739 INP912 5796 INP913 5845 INP914 5910 INP915 5965 (2 Replies)
Discussion started by: attila
2 Replies

4. UNIX for Dummies Questions & Answers

awk to sum column field from duplicate row/lines

Hello, I am new to the Linux environment. I am working on a Linux script which should send an automatic email based on a specific condition from a log file. Below is the sample log file: Name m/c usage abc xxx 10 abc xxx 20 abc xxx 5 xyz ... (6 Replies)
Discussion started by: asjaiswal
6 Replies
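For the summing part of a request like that, a minimal sketch (assuming whitespace-separated Name, m/c and usage columns with a header line; the email condition is a separate step, and logfile is a hypothetical name):

Code:
# total the usage column (field 3) per name (field 1), skipping the header line
awk 'NR > 1 { sum[$1] += $3 } END { for (name in sum) print name, sum[name] }' logfile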

5. UNIX for Dummies Questions & Answers

awk solution to duplicate lines based on column

Hi experts, I have a tab-delimited file with one column containing values separated by a comma. I wish to duplicate the entire line for every value in that comma-delimited field. For example: $cat file 4444 4444 4444 4444 9990 2222,7777 6666 2222 ... (3 Replies)
Discussion started by: torchij
3 Replies

6. Shell Programming and Scripting

awk to sum a column based on duplicate strings in another column and show split totals

Hi, I have a similar input format - A_1 2 B_0 4 A_1 1 B_2 5 A_4 1 - and am looking to print in this output format with headers. Can you suggest an awk solution? (awk because I am already doing some pattern matching from a parent file to print column 1 of my input.) Thanks! letter number_of_letters... (5 Replies)
Discussion started by: prashob123
5 Replies

7. Shell Programming and Scripting

Find duplicate values in specific column and delete all the duplicate values

Dear folks I have a map file of around 54K lines and some of the values in the second column have the same value and I want to find them and delete all of the same values. I looked over duplicate commands but my case is not to keep one of the duplicate values. I want to remove all of the same... (4 Replies)
Discussion started by: sajmar
4 Replies
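One way to drop every row whose second-column value occurs more than once, as a two-pass sketch (assuming whitespace-separated columns; mapfile is a hypothetical name, and this is not from that thread):

Code:
# first pass counts each value of field 2, second pass keeps only the unique ones
awk 'NR == FNR { count[$2]++; next } count[$2] == 1' mapfile mapfile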

8. UNIX for Beginners Questions & Answers

Duplicate and change a column

Hi, I have a file with list of items: a b c I would like to run a 1-liner (awk/perl) to duplicate and change the value of the existing column, i.e.: a a b b c c I can duplicate with awk: awk '{print $1 " " $1 }' but couldn't figure out how to do the character change, your help is... (5 Replies)
Discussion started by: yan1
5 Replies

9. Shell Programming and Scripting

Awk: duplicate column and print remaining as is

Hello there, I'd like to make a copy of the 2nd column and have it printed in place of column 1. The remaining columns are needed as is. Test data: ProbeSet GeneSymbol X22565285 X22566285 ILMN_1050008 MYOCD 6.577 7.395 ILMN_1050014 GPRC6A 6.595 6.668 ILMN_1050017 ... (2 Replies)
Discussion started by: genome
2 Replies
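A minimal sketch for that one, assuming tab-separated columns as the sample data suggests (not taken from that thread):

Code:
# overwrite field 1 with a copy of field 2, leave the remaining columns as is
awk 'BEGIN { FS=OFS="\t" } { $1 = $2 } 1' file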

10. Shell Programming and Scripting

Do replace operation and awk to sum multiple columns if another column has duplicate values

Hi Experts, please bear with me, I need help. I am learning awk and am stuck on one issue. First point: I want to sum up the values in columns 7, 9, 11, 13 and 15 if rows in column 5 are duplicates. No action is to be taken for rows where the value in column 5 is unique. Second point: For... (12 Replies)
Discussion started by: as7951
12 Replies
COLRM(1)                  BSD General Commands Manual                  COLRM(1)

NAME
     colrm -- remove columns from a file

SYNOPSIS
     colrm [start [stop]]

DESCRIPTION
     The colrm utility removes selected columns from the lines of a file. A
     column is defined as a single character in a line. Input is read from the
     standard input. Output is written to the standard output.

     If only the start column is specified, columns numbered less than the
     start column will be written. If both start and stop columns are
     specified, columns numbered less than the start column or greater than
     the stop column will be written. Column numbering starts with one, not
     zero.

     Tab characters increment the column count to the next multiple of eight.
     Backspace characters decrement the column count by one.

ENVIRONMENT
     The LANG, LC_ALL and LC_CTYPE environment variables affect the execution
     of colrm as described in environ(7).

EXIT STATUS
     The colrm utility exits 0 on success, and >0 if an error occurs.

SEE ALSO
     awk(1), column(1), cut(1), paste(1)

HISTORY
     The colrm command appeared in 3.0BSD.

BSD                              August 4, 2004                              BSD