Transpose data from columns to lines for each event


 
# 8  
Old 01-14-2009
Hi again angheloko,

I notice the code splits input2.txt into 4 subfiles, because this input file contains 4 EVENT blocks; a complete input file, however, could contain thousands of EVENT blocks. How can I avoid generating those input2.txt.$X subfiles with a large input file?

Thanks for your help again.
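For what it's worth, a single awk pass could build the pipe-delimited records directly, with no intermediate input2.txt.$X subfiles. This is only a sketch, assuming labels and values are separated by two or more spaces as in input.txt (a shortened sample is inlined here for illustration):

```shell
# Sketch: produce input2.txt-style records in one pass, no temporary subfiles.
# Assumes a run of two or more spaces separates each label from its value.
cat > input.txt <<'EOF'
EVENT
INTERNET CONNECTION
Date                       11/01/2009
Initial hour               07:30
EOF

awk '
/^EVENT$/ { getline name; print "EVENT|" name; next }  # pair EVENT with its name line
{ sub(/  +/, "|"); print }                             # squeeze the label/value gap into one "|"
' input.txt > input2_new.txt

cat input2_new.txt
```

Because everything happens inside one awk process, the size of the input only affects running time, not the number of files created.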

And here is what I get now.

input.txt
Code:
 
EVENT
INTERNET CONNECTION
Date                       11/01/2009
Initial hour               07:30
Number of users            27
Average of use             32 min
Final hour                 19:00
EVENT
LOCAL CALL
Date                       11/01/2009
Initial hour               07:42
Number of users            15
Average of use             7 min
Final hour                 16:11
EVENT
INTERNATIONAL CALL
Date                       11/01/2009
Initial hour               09:14
Number of users            21
Average of use             5 min
Final hour                 16:17
EVENT
PRINTER USE
Date                       12/01/2009
Initial hour               07:30
Number of users            23
Average of pages printed   17
Final hour                 19:00

input2.txt
Code:
 
 
EVENT|INTERNET CONNECTION
Date|11/01/2009
Initial hour|07:30
Number of users|27
Average of use|32 min
Final hour|19:00
 
EVENT|LOCAL CALL
Date|11/01/2009
Initial hour|07:42
Number of users|15
Average of use|7 min
Final hour|16:11
 
EVENT|INTERNATIONAL CALL
Date|11/01/2009
Initial hour|09:14
Number of users|21
Average of use|5 min
Final hour|16:17
 
EVENT|PRINTER USE
Date|12/01/2009
Initial hour|07:30
Number of users|23
Average of pages printed|17
Final hour|19:00

headers.txt and rowheaders.txt come out at 0 KB (completely blank)

colheaders.txt
Code:
input.txt:EVENT
input.txt:INTERNET CONNECTION
input.txt:Date                       11/01/2009
input.txt:Initial hour               07:30
input.txt:Number of users            27
input.txt:Average of use             32 min
input.txt:Final hour                 19:00
input.txt:
input.txt:EVENT
input.txt:LOCAL CALL
input.txt:Date                       11/01/2009
input.txt:Initial hour               07:42
input.txt:Number of users            15
input.txt:Average of use             7 min
input.txt:Final hour                 16:11
input.txt:
input.txt:EVENT
input.txt:INTERNATIONAL CALL
input.txt:Date                       11/01/2009
input.txt:Initial hour               09:14
input.txt:Number of users            21
input.txt:Average of use             5 min
input.txt:Final hour                 16:17
input.txt:
input.txt:EVENT
input.txt:PRINTER USE
input.txt:Date                       12/01/2009
input.txt:Initial hour               07:30
input.txt:Number of users            23
input.txt:Average of pages printed   17
input.txt:Final hour                 19:00
input2.txt:
input2.txt:EVENT|INTERNET CONNECTION
input2.txt:Date|11/01/2009
input2.txt:Initial hour|07:30
input2.txt:Number of users|27
input2.txt:Average of use|32 min
input2.txt:Final hour|19:00
input2.txt:
input2.txt:
input2.txt:EVENT|LOCAL CALL
input2.txt:Date|11/01/2009
input2.txt:Initial hour|07:42
input2.txt:Number of users|15
input2.txt:Average of use|7 min
input2.txt:Final hour|16:11
input2.txt:
input2.txt:
input2.txt:EVENT|INTERNATIONAL CALL
input2.txt:Date|11/01/2009
input2.txt:Initial hour|09:14
input2.txt:Number of users|21
input2.txt:Average of use|5 min
input2.txt:Final hour|16:17
input2.txt:
input2.txt:
input2.txt:EVENT|PRINTER USE
input2.txt:Date|12/01/2009
input2.txt:Initial hour|07:30
input2.txt:Number of users|23
input2.txt:Average of pages printed|17
input2.txt:Final hour|19:00
input2.txt.10:EVENT|LOCAL CALL
input2.txt.10:Date|11/01/2009
input2.txt.10:Initial hour|07:42
input2.txt.10:Number of users|15
input2.txt.10:Average of use|7 min
input2.txt.10:Final hour|16:11
input2.txt.18:EVENT|INTERNATIONAL CALL
input2.txt.18:Date|11/01/2009
input2.txt.18:Initial hour|09:14
input2.txt.18:Number of users|21
input2.txt.18:Average of use|5 min
input2.txt.18:Final hour|16:17
input2.txt.2:EVENT|INTERNET CONNECTION
input2.txt.2:Date|11/01/2009
input2.txt.2:Initial hour|07:30
input2.txt.2:Number of users|27
input2.txt.2:Average of use|32 min
input2.txt.2:Final hour|19:00
input2.txt.26:EVENT|PRINTER USE
input2.txt.26:Date|12/01/2009
input2.txt.26:Initial hour|07:30
input2.txt.26:Number of users|23
input2.txt.26:Average of pages printed|17
input2.txt.26:Final hour|19:00

output.txt
Code:
EVENT     input.txt:EVENT
 input.txt:INTERNET CONNECTION
 input.txt:Date                       11/01/2009
 input.txt:Initial hour               07:30
 input.txt:Number of users            27
 input.txt:Average of use             32 min
 input.txt:Final hour                 19:00
 input.txt:
 input.txt:EVENT
 input.txt:LOCAL CALL
 input.txt:Date                       11/01/2009
 input.txt:Initial hour               07:42
 input.txt:Number of users            15
 input.txt:Average of use             7 min
 input.txt:Final hour                 16:11
 input.txt:
 input.txt:EVENT
 input.txt:INTERNATIONAL CALL
 input.txt:Date                       11/01/2009
 input.txt:Initial hour               09:14
 input.txt:Number of users            21
 input.txt:Average of use             5 min
 input.txt:Final hour                 16:17
 input.txt:
 input.txt:EVENT
 input.txt:PRINTER USE
 input.txt:Date                       12/01/2009
 input.txt:Initial hour               07:30
 input.txt:Number of users            23
 input.txt:Average of pages printed   17
 input.txt:Final hour                 19:00 input2.txt:
 input2.txt:EVENT|INTERNET CONNECTION
 input2.txt:Date|11/01/2009
 input2.txt:Initial hour|07:30
 input2.txt:Number of users|27
 input2.txt:Average of use|32 min
 input2.txt:Final hour|19:00
 input2.txt:
 input2.txt:
 input2.txt:EVENT|LOCAL CALL
 input2.txt:Date|11/01/2009
 input2.txt:Initial hour|07:42
 input2.txt:Number of users|15
 input2.txt:Average of use|7 min
 input2.txt:Final hour|16:11
 input2.txt:
 input2.txt:
 input2.txt:EVENT|INTERNATIONAL CALL
 input2.txt:Date|11/01/2009
 input2.txt:Initial hour|09:14
 input2.txt:Number of users|21
 input2.txt:Average of use|5 min
 input2.txt:Final hour|16:17
 input2.txt:
 input2.txt:
 input2.txt:EVENT|PRINTER USE
 input2.txt:Date|12/01/2009
 input2.txt:Initial hour|07:30
 input2.txt:Number of users|23
 input2.txt:Average of pages printed|17
 input2.txt:Final hour|19:00 input2.txt.10:EVENT|LOCAL CALL
 input2.txt.10:Date|11/01/2009
 input2.txt.10:Initial hour|07:42
 input2.txt.10:Number of users|15
 input2.txt.10:Average of use|7 min
 input2.txt.10:Final hour|16:11
 input2.txt.18:EVENT|INTERNATIONAL CALL
 input2.txt.18:Date|11/01/2009
 input2.txt.18:Initial hour|09:14
 input2.txt.18:Number of users|21
 input2.txt.18:Average of use|5 min
 input2.txt.18:Final hour|16:17
 input2.txt.2:EVENT|INTERNET CONNECTION
 input2.txt.2:Date|11/01/2009
 input2.txt.2:Initial hour|07:30
 input2.txt.2:Number of users|27
 input2.txt.2:Average of use|32 min
 input2.txt.2:Final hour|19:00
 input2.txt.26:EVENT|PRINTER USE
 input2.txt.26:Date|12/01/2009
 input2.txt.26:Initial hour|07:30
 input2.txt.26:Number of users|23
 input2.txt.26:Average of pages printed|17
 input2.txt.26:Final hour|19:00

output2.txt
Code:
EVENT     input.txt:EVENT
input.txt:Initial hour               07:30
input.txt:Average of use             32 min
input.txt:Final hour                 19:00
input.txt:Date                       11/01/2009
input.txt:INTERNET CONNECTION
input.txt:Number of users            27

# 9  
Old 01-14-2009
Hi chkmal,

Like I said earlier, the solution was rushed. I do realize that this will not be the perfect solution. Anyway, the fault is in the creation of the headers which is why the succeeding steps failed. Let me try to get back to you later with a better solution.

In the meantime, how about cherry's solution?

You do have perl on your machine, right?
# 10  
Old 01-15-2009
Hello angheloko,

Well, it's OK. I'll wait, no problem; meanwhile I'm the most interested party and would like to contribute ideas.

I was thinking of an algorithm, but I can't translate it into a shell script or awk;
I'm very new to awk and Unix programming.

Something like:

1-) Put the word that is below "EVENT" (for example "LOCAL CALL",
"PRINTER USE", etc.) into column 2 of the same line.

2-) Get the unique values from column 1 and transpose them into column
headers, beginning the header positions at column 2 of the transposed
arrangement.

I've been taking my first steps at getting the unique values, with some
tips from web examples of course.

Code:
 
#Extract first column
#Note: with the default FS, $1 is only the first word of a multi-word label
awk '{print $1}' input.txt > input1.txt
 
#Extract unique values in column 1, counting frequency
sort input1.txt | uniq -c | sort -n > output.txt

3-) Make a loop over every block that begins with "EVENT" and
transpose the values in column 2, putting them below the
respective headers.

*The information for every block, previously laid out vertically, would end up laid out horizontally.
*The relation between the column-1 and column-2 values on the same line
would become a relation between line 1 (the headers line) and line X,
within the same column.

I hope the idea makes some sense.
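A rough awk sketch of steps 1-3 above (my own attempt, not tested against the full data; it assumes every label/value pair is separated by two or more spaces, and inlines a shortened input.txt for illustration):

```shell
cat > input.txt <<'EOF'
EVENT
INTERNET CONNECTION
Date                       11/01/2009
Initial hour               07:30
EVENT
LOCAL CALL
Date                       11/01/2009
Initial hour               07:42
EOF

awk '
/^EVENT$/ {                   # step 1: the line after EVENT names the block
    getline name
    n++; row[n, "EVENT"] = name
    next
}
NF {                          # step 3: remember each label/value pair per block
    label = $0; value = $0
    sub(/  +.*/, "", label)   # label = text before the wide gap
    sub(/.*  +/, "", value)   # value = text after the wide gap
    if (!(label in seen)) { seen[label] = 1; cols[++c] = label }  # step 2: unique labels, in order
    row[n, label] = value
}
END {                         # header line first, then one line per EVENT block
    printf "EVENT"
    for (i = 1; i <= c; i++) printf "|%s", cols[i]
    print ""
    for (j = 1; j <= n; j++) {
        printf "%s", row[j, "EVENT"]
        for (i = 1; i <= c; i++) printf "|%s", row[j, cols[i]]
        print ""
    }
}' input.txt > table.txt

cat table.txt
```

Blocks missing a label would simply leave an empty cell in that column, since `row[j, cols[i]]` expands to the empty string for unset entries.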


About your question on cherry's solution: I've tried it and it fails for me, and I'm not sure why. I'm using UWIN,
a Unix emulator for Windows. I've been trying some basic examples (like "Hello world") and the environment seems to be working and able to receive commands.

Code:
 
/c
$ print "Hello World.\n";
Hello World.
$

With summer_cherry's script I get this error:

Code:
$ cherry.pl
cherry.pl[1]: sub: not found [No such file or directory]
cherry.pl: syntax error at line 2: `(' unexpected
$ sub ?
-ksh: sub: not found [No such file or directory]
$ sub help
-ksh: sub: not found [No such file or directory]
$ man sub
man page for sub not found
$

Well, will see what happens, I´ll continue trying over here, many thanks for your kind assistance so far.

Best regards.
# 11  
Old 01-15-2009
Hi cg,

What you suggested is very close to my first algorithm, so we can go with that.

I just don't know why headers.txt wasn't formed as expected.

Anyway, please see the code below, test it, and post the output (we may be getting different outputs):

This will get all the required headers
Code:
sed 's/   */|/g;/^ *$/d' input.txt | awk -F"|" ' { print $1 } ' | sort | uniq -ud

This should return the row headers (first column):
Code:
sed 's/   */|/g;/^ *$/d' input.txt | awk -F"|" ' { print $1 } ' | sort | uniq -ud | grep '[A-Z][A-Z][A-Z]*'

And finally, this should return the column headers:
Code:
sed 's/   */|/g;/^ *$/d' input.txt | awk -F"|" ' { print $1 } ' | sort | uniq -ud | grep -v '[A-Z][A-Z][A-Z]*'

These are my results:

input.txt:
Code:
EVENT
INTERNET CONNECTION
Date                       11/01/2009
Initial hour               07:30
Number of users            27
Average of use             32 min
Final hour                 19:00
EVENT
LOCAL CALL
Date                       11/01/2009
Initial hour               07:42
Number of users            15
Average of use             7 min
Final hour                 16:11
EVENT
INTERNATIONAL CALL
Date                       11/01/2009
Initial hour               09:14
Number of users            21
Average of use             5 min
Final hour                 16:17
EVENT
PRINTER USE
Date                       12/01/2009
Initial hour               07:30
Number of users            23
Average of pages printed   17
Final hour                 19:00

1st code o/p (get required headers):
Code:
Average of pages printed
Average of use
Date
EVENT
Final hour
INTERNATIONAL CALL
INTERNET CONNECTION
Initial hour
LOCAL CALL
Number of users
PRINTER USE

2nd code o/p (get row headers):
Code:
EVENT
INTERNATIONAL CALL
INTERNET CONNECTION
LOCAL CALL
PRINTER USE

3rd code o/p (get column headers):
Code:
Average of pages printed
Average of use
Date
Final hour
Initial hour
Number of users

Go try it and post your results. Then we can go from there.
# 12  
Old 01-15-2009
Hi angheloko,

I couldn't reply earlier.

Well, with your new codes I receive the same input.txt at the end, whether I run them one by one or put the 3 codes together in a shell script.
The first 2 codes don't seem to output anything when I run them. The third one shows the same input.txt as output.

Below is what I get, step by step.

Code:
 
$ pwd
/c/Temp Folder
 
$ ls
anghel.sh   input.txt
 
$ sed 's/   */|/g;/^ *$/d' input.txt | awk -F"|" ' { print $1 } ' | sort | uniq -ud
 
sed: "input.txt", line 30: warning: newline appended
 
$ sed 's/   */|/g;/^ *$/d' input.txt | awk -F"|" ' { print $1 } ' | sort | uniq -ud | grep [A-Z][A-Z][A-Z]*
 
sed: "input.txt", line 30: warning: newline appended
 
$ sed 's/   */|/g;/^ *$/d' input.txt | awk -F"|" ' { print $1 } ' | sort | uniq -ud | grep -v [A-Z][A-Z][A-Z]*
 
sed: "input.txt", line 30: warning: newline appended
 
EVENT
INTERNET CONNECTION
Date                       11/01/2009
Initial hour               07:30
Number of users            27
Average of use             32 min
Final hour                 19:00
EVENT
LOCAL CALL
Date                       11/01/2009
Initial hour               07:42
Number of users            15
Average of use             7 min
Final hour                 16:11
EVENT
INTERNATIONAL CALL
Date                       11/01/2009
Initial hour               09:14
Number of users            21
Average of use             5 min
Final hour                 16:17
EVENT
PRINTER USE
Date                       12/01/2009
Initial hour               07:30
Number of users            23
Average of pages printed   17
Final hour                 19:00
$

I think it's not behaving like it does on your machine; what could it be?

I'll keep watching what happens.
# 13  
Old 01-15-2009
Hi cg,

Try the following instead:

Code:
awk ' BEGIN { FS="  " } { print $1 } ' foo | sort | sed '$!N; /^\(.*\)\n\1$/!P; D'

awk ' BEGIN { FS="  " } { print $1 } ' foo | sort | sed '$!N; /^\(.*\)\n\1$/!P; D' | grep '^[A-Z][A-Z]'

awk ' BEGIN { FS="  " } { print $1 } ' foo | sort | sed '$!N; /^\(.*\)\n\1$/!P; D' | grep -v '^[A-Z][A-Z]'

The reason we're doing this is to establish the required column and row headers and turn the data into an awk-friendly table.
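Incidentally, that sed expression is the classic one-liner that deletes duplicate consecutive lines, i.e. it emulates `uniq` on sorted input. A tiny check:

```shell
# Collapses runs of identical adjacent lines to a single copy (like uniq).
printf 'a\na\nb\nc\nc\n' | sed '$!N; /^\(.*\)\n\1$/!P; D'
# prints: a, b, c (one per line)
```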
# 14  
Old 01-15-2009
Hi again angheloko,

I've tried your last 3 lines and it looks like they're working on my PC now.

Here is my screen log from sending the 3 codes one by one:

Code:
 
$ awk ' BEGIN { FS="  " } { print $1 } ' input.txt | sort | sed '$!N; /^\(.*\)\n\1$/!P; D'
 
Average of pages printed
Average of use
Date
EVENT
Final hour
INTERNATIONAL CALL
INTERNET CONNECTION
Initial hour
LOCAL CALL
Number of users
PRINTER USE
 
$ awk ' BEGIN { FS="  " } { print $1 } ' input.txt | sort | sed '$!N; /^\(.*\)\n\1$/!P; D' | grep ^[A-Z][A-Z]
 
EVENT
INTERNATIONAL CALL
INTERNET CONNECTION
LOCAL CALL
PRINTER USE
 
$ awk ' BEGIN { FS="  " } { print $1 } ' input.txt | sort | sed '$!N; /^\(.*\)\n\1$/!P; D' | grep -v ^[A-Z][A-Z]
 
Average of pages printed
Average of use
Date
Final hour
Initial hour
Number of users
$

As you can see, it works so far.

Many thanks for your help.