awk: convert column to row in a specific way
Post 302618419 by complex.invoke, Wednesday 4 April 2012, 02:18 AM
Code:
awk -F'|' '{for(i=3; i<=NF; ++i) print $1FS$2FS$i}'
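
For later readers: the one-liner keeps the first two pipe-delimited fields as a prefix and emits one output line per remaining field. A quick demonstration on made-up data (the thread's actual input isn't quoted in this post):

Code:
$ cat sample.txt
id1|name1|a|b|c
id2|name2|x|y

$ awk -F'|' '{for(i=3; i<=NF; ++i) print $1FS$2FS$i}' sample.txt
id1|name1|a
id1|name1|b
id1|name1|c
id2|name2|x
id2|name2|y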

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

convert column into row with some modifier

A file contains: 1 1:-0.289433 2:0.833778 3:0.314471 4:-0.289433 5:-0.81876 6:-0.456693 7:-0.17511 8:-0.644555 9:-0.00666341 10:-1.13603. I would like that column turned into a row, printing only the numbers (shown in red in the original post) after each colon; the output should be like: -0.289433... (1 Reply)
Discussion started by: cdfd123
1 Reply
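
A minimal sketch for the task above, assuming the input is one whitespace-separated line of label:value pairs after a leading index (the file name file is made up; the thread is truncated here):

Code:
$ awk '{for(i=2; i<=NF; ++i) {split($i, a, ":"); printf "%s%s", a[2], (i<NF ? " " : "\n")}}' file
-0.289433 0.833778 0.314471 -0.289433 -0.81876 -0.456693 -0.17511 -0.644555 -0.00666341 -1.13603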

2. Shell Programming and Scripting

column to row convert - script - help

Hi, I have a file named col.txt: 1.000 2.000 3.000 4.000 5.000 6.000 7.000 8.000. I should get this: 1.000 5.000 2.000 6.000 3.000 7.000 (10 Replies)
Discussion started by: G0Y
10 Replies
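
One hedged reading of the request above: col.txt holds a single column of values and the poster wants it folded in half, side by side. Under that assumption:

Code:
$ awk '{v[NR]=$1} END{h=NR/2; for(i=1; i<=h; ++i) print v[i], v[i+h]}' col.txt
1.000 5.000
2.000 6.000
3.000 7.000
4.000 8.000

This assumes an even line count; an odd count would need a rule for the leftover value.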

3. Shell Programming and Scripting

Insert a text from a specific row into a specific column using SED or AWK

Hi, I am having trouble converting a text file; I have been working on it all day and still can't get it right. Here is how the text file looks: _______________________________________________________ DEVICE STATUS INFORMATION FOR LOCATION 1: OPER STATES: Disabled E:Enabled ... (5 Replies)
Discussion started by: Issemael
5 Replies

4. Shell Programming and Scripting

convert a column to row output?

Getting tired of cut-and-paste, so I thought I would post a question: how do I change this column output to a single row? From this: # vgdisplay -v /dev/vgeva05 | grep dsk | awk '{print $3}' /dev/dsk/c6t0d5 /dev/dsk/c11t0d5 /dev/dsk/c15t0d5 /dev/dsk/c18t0d5 /dev/dsk/c7t0d5... (8 Replies)
Discussion started by: mr_manny
8 Replies
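
For the thread above, collecting the third field and printing it with spaces instead of newlines flattens the column into one row; a sketch built on the poster's own pipeline:

Code:
# vgdisplay -v /dev/vgeva05 | grep dsk | awk '{printf "%s ", $3} END{print ""}'
/dev/dsk/c6t0d5 /dev/dsk/c11t0d5 /dev/dsk/c15t0d5 /dev/dsk/c18t0d5 /dev/dsk/c7t0d5 ...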

5. Shell Programming and Scripting

Convert row to column

Hi, I have a file like this 50 1 2 1374438 50 1 2 1682957 50 5 2 1453574 50 10 2 1985890 100 1 2 737307 100 5 2 1660204 100 10 2 2148483 and I want to convert this by... (1 Reply)
Discussion started by: gvj
1 Reply

6. Shell Programming and Scripting

convert row to column with respect to the first column

Input file A.txt: C2062 -117.6 -118.5 -117.5 C5145 0 0 0 C5696 0 0 0 Output file B.txt: C2062 X -117.6 C2062 Y -118.5 C2062 Z -117.5... (4 Replies)
Discussion started by: asavaliya
4 Replies
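
The sample above maps columns 2-4 onto the labels X, Y, Z; a minimal awk sketch under that assumption:

Code:
$ awk 'BEGIN{split("X Y Z", lbl)} {for(i=2; i<=NF; ++i) print $1, lbl[i-1], $i}' A.txt > B.txt
$ head -4 B.txt
C2062 X -117.6
C2062 Y -118.5
C2062 Z -117.5
C5145 X 0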

7. Shell Programming and Scripting

Print unique names in each row of a specific column using awk

Is it possible to remove redundant names in the 4th column? input cqWE 100 200 singapore;singapore AZO 300 400 brazil;america;germany;ireland;germany .... .... output cqWE 100 200 singapore AZO 300 400 brazil;america;germany;ireland (4 Replies)
Discussion started by: quincyjones
4 Replies
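
A sketch of one way to deduplicate the semicolon-separated 4th column while preserving first-seen order (the file name input is assumed):

Code:
$ awk '{n=split($4, a, ";"); out=""; split("", seen)
       for(i=1; i<=n; ++i) if(!(a[i] in seen)) {seen[a[i]]=1; out=out (out ? ";" : "") a[i]}
       $4=out; print}' input
cqWE 100 200 singapore
AZO 300 400 brazil;america;germany;ireland

split("", seen) empties the array between lines without relying on the non-POSIX delete seen.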

8. UNIX for Dummies Questions & Answers

awk to print first row with fourth column and last row with fifth column in each file

Given a file with this content, running awk 'NR==1 {print $4} && NR==2 {print $5}' file fails with a syntax error. What can be done? (4 Replies)
Discussion started by: cdfd123
4 Replies
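
The quoted attempt fails because && cannot join two pattern-action blocks; awk simply takes them in sequence. A corrected sketch for one file, printing the fourth column of the first row and the fifth column of the last row:

Code:
$ awk 'NR==1 {print $4} {last=$5} END {print last}' file

For several files at once, gawk's FNR==1 and ENDFILE would be the natural extension; plain awk's END fires only after the final file.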

9. Shell Programming and Scripting

[Solved] Convert Row To column

Hi folks, I am using the db2 command db2 list tablespace show detail -> Tablespace ID = 10, Name = TSCDDHLMSUM, Type = Database managed space, Contents = All permanent data.... (5 Replies)
Discussion started by: ckwan
5 Replies

10. Shell Programming and Scripting

Using awk to change a specific column in a specific row

I am trying to change the number in bold to 2400: 01,000300032,193631306,190619,0640,1,80,,2/ 02,193631306,000300032,1,190618,0640,CAD,2/. I'm not sure whether sed or awk is the answer. I was going to use sed and do a character count up to that point, but the column directly before 0640 might... (8 Replies)
Discussion started by: juggernautjoee
8 Replies
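
The bolding did not survive the archive, so exactly which number is meant is unclear; assuming it is the 0640 time field (the fifth comma-separated field of the first record), awk with FS/OFS avoids the fragile character counting the poster mentions:

Code:
$ awk 'BEGIN{FS=OFS=","} NR==1{$5="2400"} 1' file
01,000300032,193631306,190619,2400,1,80,,2/
02,193631306,000300032,1,190618,0640,CAD,2/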