Shell Programming and Scripting: Format data by consolidating replicated obs (Post 302936852 by RudiC, Saturday 28 February 2015, 01:08 PM)
There's a well-tested solution for converting single-column data to tables/matrices that has been published several times in these fora; with some up-front conditioning of your data we can use it here:
Code:
awk     'NR==1          {READ1=$(NF-2)          # header line: remember the three read names
                         READ2=$(NF-1)
                         READ3=$NF
                         next
                        }
                        {                       # reshape each wide row into three long rows:
                                                # measurement, read_instance, value
                         print $1, READ1"_"$2, $(NF-2)
                         print $1, READ2"_"$2, $(NF-1)
                         print $1, READ3"_"$2, $NF
                        }
        ' FS="\t" OFS="\t" file2 |
awk  '       {LN[$2]; HD[$1]; MX[$2,$1]+=$3; CNT[$2,$1]++}     # collect labels, sums, counts
         END    {printf "%10s", ""              # header row of measurement names
                 for (i in HD) printf "%10s", i
                 print ""
                 for (j in LN) {printf "%10s", j        # one row per read_instance label,
                                                        # printing the mean of the replicates
                                for (i in HD) printf "%10s", MX[j,i]/CNT[j,i]
                                print ""
                               }
                }
        ' FS="\t"
                  N1        N2        N3
     R3_I1       3.5         4         5
     R3_I2         3         4         5
     R3_I3         4         0       3.5
     R1_I1       1.5         1         3
     R1_I2         1       1.5         3
     R1_I3         1         2         0
     R2_I1       2.5         3         4
     R2_I2         2         3         4
     R2_I3         0         5         0

Of course some good sport might want to combine these two into one single script... and get dots in place of the zeroes in the result.
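As a sketch of that combination: one awk pass can accumulate sums and counts directly and print dots in place of the zeroes. The original file2 was not posted, so the sample input below is made up purely to mimic the layout the pipeline above assumes (tab-separated; the last three header fields are the read names, data rows carry the measurement in field 1, the instance in field 2, and the three read values in the last three fields):

```shell
# Hypothetical sample input; stands in for the unposted file2.
printf 'ID\tOBS\tR1\tR2\tR3\n'  > file2
printf 'N1\tI1\t1\t2\t3\n'     >> file2
printf 'N1\tI1\t2\t3\t4\n'     >> file2
printf 'N2\tI1\t0\t3\t5\n'     >> file2

awk '
NR==1   { READ1=$(NF-2); READ2=$(NF-1); READ3=$NF; next }
        {               # accumulate sums/counts keyed by (read_instance, measurement)
          sum[READ1"_"$2,$1] += $(NF-2); cnt[READ1"_"$2,$1]++
          sum[READ2"_"$2,$1] += $(NF-1); cnt[READ2"_"$2,$1]++
          sum[READ3"_"$2,$1] += $NF;     cnt[READ3"_"$2,$1]++
          LN[READ1"_"$2]; LN[READ2"_"$2]; LN[READ3"_"$2]
          HD[$1]
        }
END     { printf "%10s", ""                     # header row of measurement names
          for (i in HD) printf "%10s", i
          print ""
          for (j in LN) {
                printf "%10s", j
                for (i in HD) {
                        # mean of the replicates; "." for missing cells...
                        v = cnt[j,i] ? sum[j,i]/cnt[j,i] : "."
                        if (v == 0) v = "."     # ...and in place of the zeroes
                        printf "%10s", v
                }
                print ""
          }
        }
' FS="\t" file2 > result.txt
cat result.txt
```

With the sample above, N1/I1 appears twice, so its cells come out averaged (e.g. R1_I1 under N1 is (1+2)/2 = 1.5), while the zero read for N2/I1 under R1 prints as a dot. Note that awk's for-in traversal order is unspecified, so row and column order may differ between awk implementations.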