How to transpose data elements in awk
Post 302194505 by baruchgu on Tuesday 13th of May 2008 04:44:28 AM
awk -F, '
{ for (i = 1; i <= NF; i++) text_arr[NR SUBSEP i] = $i   # store field i of row NR ($NF in the original was a bug)
  if (NF > max_i) max_i = NF }                           # track the widest row
END {
  for (i = 1; i <= max_i; i++) {                         # each input column becomes an output row
    for (j = 1; j <= NR; j++)
      printf "%s%s", text_arr[j SUBSEP i], (j < NR ? "," : "")
    print ""                                             # end the output row
  }
}' file.txt

I have not tested it, but it should be very close to the final solution.
Good luck!
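
For example, given a small hypothetical file.txt (the sample data is mine, not from the thread):

a,b,c
1,2,3
4,5,6

the command above should print each original column as a row:

a,1,4
b,2,5
c,3,6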
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

How to transpose a table of data using awk

Hi. I have this data below:- v1 28 14 1.72414 1.72414 1.72414 1.72414 1.72414 v2 77 7 7.47126 6.89655 6.89655 6.89655 6.89655 v3 156 3 21.2644 21.2644 20.6897 21.2644 20.6897 v4 39 3 1.72414 1.72414 1.72414 1.72414 1.72414 v5 155 1 21.2644 23.5632 24.1379 23.5632 24.1379 v6 62 2 2.87356... (2 Replies)
Discussion started by: ahjiefreak

2. Shell Programming and Scripting

Transpose columns to Rows : Big data

Hi, I read a few posts on the subject and tried out a few solutions, but they did not solve my problem. https://www.unix.com/302121568-post11.html https://www.unix.com/shell-programming-scripting/137953-large-file-columns-into-rows-etc-4.html Please help. My problem is very similar to the second link... (15 Replies)
Discussion started by: genehunter

3. Shell Programming and Scripting

Transpose Daily Data from Column to Row.

Hi, I'm looking to transpose Linux data from a daily report that logs every 10 minutes, as below. After the first comma I need the daily totals for Col2 and Col3 transposed as below. The new transposed format will then be exported to Microsoft Excel for reporting. Any help would be... (9 Replies)
Discussion started by: ravzter

4. Shell Programming and Scripting

Transpose Data from Columns to rows

Hello. I'm very new to shell scripting and would like to know if anyone could help me. I have data that's being pulled into a txt file and currently have to transpose it manually, which takes a long time. Here is what the data looks like: Server1 -- Date -- Other -- value... (7 Replies)
Discussion started by: Mikes88

5. Shell Programming and Scripting

Transpose Column of Data to Rows

I can no longer find my commands, but I used to be able to transpose data with common fields from a single column to rows using a command line. My data is laid out as follows: NAME=BOB ADDRESS=COLORADO PET=CAT NAME=SUSAN ADDRESS=TEXAS PET=BIRD NAME=TOM ADDRESS=UTAH PET=DOG I would... (7 Replies)
Discussion started by: docdave78
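
For the NAME=/ADDRESS=/PET= layout in this one, a minimal awk sketch (untested; it assumes every record begins with a NAME= line, and data.txt is a placeholder file name):

awk -F= '
$1 == "NAME" && NR > 1 { print "" }   # a NAME= line starts a new output row
{ printf "%s ", $2 }                  # append this line's value to the current row
END { print "" }' data.txt

This should emit one row per record, e.g. "BOB COLORADO CAT".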

6. Shell Programming and Scripting

Transpose data as rows using awk

Hi, I have the below requirement and need help. One file contains the metadata and the other file has the data; match the columns from file1 with file2, extract the corresponding column values, and write them to another file. File1: CUSTTYPECD COSTCENTER FNAME LNAME SERVICELVL ... (1 Reply)
Discussion started by: ravlapo

7. Shell Programming and Scripting

Help with transpose data content

Hi, Below is my input file: c116_g1_i1 -,-,-,+ c118_g2_i1 +,+ c118_g3_i1 + c120_g1_i1 +,+,+,+ . . Desired Output File c116_g1_i1 - c116_g1_i1 - c116_g1_i1 - c116_g1_i1 + c118_g2_i1 + c118_g2_i1 + (3 Replies)
Discussion started by: perl_beginner
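
For this one, splitting the second column on commas is enough; a quick sketch (untested; input.txt is a placeholder name):

awk '{ n = split($2, flags, ",")      # break the +/- list into an array
       for (i = 1; i <= n; i++)       # one output line per flag
           print $1, flags[i] }' input.txt

Each ID is repeated once per flag, which matches the desired output shown.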

8. UNIX for Advanced & Expert Users

Transpose Messy Data

I have a messy, pipe-delimited ("|") input dataset. I would like to create a file of ID plus each component of field 4 which is delimited by ";" into a long, skinny shape for easier processing. A couple of complications are that field 4 may contain both commas and linefeed characters from the... (9 Replies)
Discussion started by: 91674io

9. UNIX for Beginners Questions & Answers

Transpose the data

Hi All, I have a case where I need to transpose data from rows to columns. Input data: Afghanistan|10000|1 Albania|25000|4 Algeria|25000|7 Andorra|10000|4 Angola|25000|47 Antigua and Barbuda|25000|23 Argentina|5000|3 Armenia|100000|12 Aruba|20000|2 Australia|50000|2 I need to transpose... (3 Replies)
Discussion started by: radius
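
The same collect-then-print technique as the answer at the top of this page applies here with | as the separator; since the desired layout in the excerpt is cut off, this rough, untested sketch assumes a full row-for-column transpose (countries.txt is a placeholder name):

awk -F'|' '
{ for (i = 1; i <= NF; i++) cell[i, NR] = $i   # cell[column, row]
  if (NF > maxf) maxf = NF }
END { for (i = 1; i <= maxf; i++) {            # each column becomes a row
        for (j = 1; j <= NR; j++) printf "%s%s", cell[i, j], (j < NR ? "|" : "")
        print "" } }' countries.txt

The first output row would hold all the country names, the second all the first numbers, and so on.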

10. UNIX for Beginners Questions & Answers

Transpose large data in UNIX

Hi, I have the following sample of data; my full data dimension is 900,000 * 1119: rs987435 C G 1 1 1 0 2 rs345783 C G 0 0 1 0 0 rs955894 G T 1 1 2 2 1 rs6088791 ... (7 Replies)
Discussion started by: marwah