Hi guys,
Could anyone advise me how to convert rows into columns in a file?
My file would be similar to this:
A11 A12 A13 A14 A15 ... A1n
A21 A22 A23
A31
A41
A51
...
Am1 Am2 Am3 Am4 Am5 ... Amn
The number of rows is not the same as the number of columns.
Thanks in advance (2 Replies)
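A minimal awk sketch for the question above (assuming the input file is named file, a placeholder): it stores every cell, then prints column i of the input as row i of the output, skipping cells that ragged rows never had.

```shell
# Transpose rows to columns; handles rows of different lengths.
awk '
{
    for (i = 1; i <= NF; i++) cell[NR, i] = $i   # remember cell (row, col)
    if (NF > maxnf) maxnf = NF                    # track the widest row
}
END {
    for (i = 1; i <= maxnf; i++) {               # each input column ...
        row = ""
        for (j = 1; j <= NR; j++)                # ... becomes an output row
            if ((j, i) in cell)
                row = (row == "") ? cell[j, i] : row " " cell[j, i]
        print row
    }
}' file
```

Since every cell is held in memory, this suits files of moderate size.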
Hi,
Apologies if this has been covered.
I have a requirement where I have to convert a single column into multiple columns.
My data will be like this:
2
3
4
5
6
Output required:
2 3 4 5 6 (1 Reply)
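One way to get that output, sketched with the standard paste utility (file is a placeholder for the input name): the -s flag serializes all lines into one, joined by the given delimiter.

```shell
# Join a single column into one space-separated row.
paste -s -d ' ' file
```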
Hey all, I have a list in this format:
variable length with spaces
more variable information
some more variable information
and I would like to transform that 'column' into a single row:
variable length with spaces more variable information some more variable information
Any... (8 Replies)
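Because each line here contains internal spaces, an awk sketch that prints lines separated by a single space (rather than touching the lines' contents) may be clearer; file is again a placeholder for the input name.

```shell
# Concatenate whole lines into one row, preserving each line's internal spaces.
awk '{ printf "%s%s", sep, $0; sep = " " } END { print "" }' file
```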
Hello,
I have a huge tab-delimited file with around 40,000 columns and 900 rows, and I want to convert its columns to rows.
The INPUT file looks like this.
The first line is the header of the file.
ID marker1 marker2 marker3 marker4
b1 A G A C ... (5 Replies)
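The same transpose idea applies with tab as both input and output separator; INPUT and OUTPUT below are placeholder names. Note this keeps the whole table (roughly 40,000 x 900 cells here) in memory, so awk needs a machine with enough RAM for it.

```shell
# Transpose a tab-delimited table: column i of INPUT becomes row i of OUTPUT.
awk 'BEGIN { FS = OFS = "\t" }
{
    for (i = 1; i <= NF; i++) cell[i, NR] = $i   # store cell (col, row)
    if (NF > maxnf) maxnf = NF
}
END {
    for (i = 1; i <= maxnf; i++) {
        row = cell[i, 1]
        for (j = 2; j <= NR; j++) row = row OFS cell[i, j]
        print row
    }
}' INPUT > OUTPUT
```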
Hi Gurus,
How can I convert rows into columns using Linux shell scripting?
Input is like this (sample.txt):
ABC
DEF
GHI
JKL
MNO
PQR
STU
VWX
YZA
BCD
Output should be (sampleoutput.csv):
ABC,DEF,GHI,JKL,MNO
PQR,STU,VWX,YZA,BCD (2 Replies)
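A compact sketch for grouping every five input lines into one comma-separated row, using paste with five stdin operands (file names taken from the post above):

```shell
# Each '-' consumes one line per output row, so five of them fold the
# column into rows of five, joined by commas.
paste -d ',' - - - - - < sample.txt > sampleoutput.csv
```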
I am looking to print the data in columns, starting a new row after every 3 words. Here is what I tried:
cat example.out | awk 'END { for (i = 0; ++i < m;) print _;print _ }{ _ = _ x ? _ OFS $1 : $1}' m=1| grep -i INNER
... (2 Replies)
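For "a new line after every 3 words", a much shorter sketch is xargs, which re-groups whitespace-separated words n at a time (example.out taken from the post above):

```shell
# Echo the input back three words per line.
xargs -n 3 < example.out
```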
Hi folks,
I have sample data like what is shown below:
1,ID=1000
1,Org=CedarparkHospital
1,cn=john
1,sn=doe
1,uid=User001
2,uid=User002
2,ID=2000
2,cn=steve
2,sn=jobs
2,Org=Providence
I would like to convert it into the format below:
1,1000,CedarparkHospital,john,doe,User001... (11 Replies)
Discussion started by: vskr72
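An awk sketch for this last question (file is a placeholder; the column order ID, Org, cn, sn, uid is an assumption inferred from the sample output): it splits each line on both ',' and '=', collects the key=value pairs per leading ID, and emits one CSV row per ID in order of first appearance.

```shell
awk -F '[,=]' '
{
    id = $1
    val[id, $2] = $3                              # e.g. val[1, "cn"] = "john"
    if (!(id in seen)) { seen[id] = 1; order[++n] = id }
}
END {
    for (k = 1; k <= n; k++) {
        id = order[k]
        print id "," val[id, "ID"] "," val[id, "Org"] "," val[id, "cn"] "," val[id, "sn"] "," val[id, "uid"]
    }
}' file
```

Keys missing for an ID simply come out as empty CSV fields.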
LEARN ABOUT DEBIAN
Vend::Ship::QueryUPS(3pm) User Contributed Perl Documentation Vend::Ship::QueryUPS(3pm)NAME
Vend::Ship::QueryUPS -- calculate UPS costs via www
SYNOPSIS
(catalog.cfg)
Shipping QueryUPS default_geo 45056
(shipping.asc)
ground: UPS Ground Commercial
origin 45056
service GNDCOM
min 0
max 0
cost e Nothing to ship!
min 0
max 150
cost s QueryUPS
min 150
max 99999999
cost e Too heavy for UPS.
DESCRIPTION
Calculates UPS costs via the WWW using Business::UPS.
To activate, configure any parameter in catalog.cfg. A good choice is the default origin zip.
Options:
weight
Weight in pounds. Required -- normally passed via CRIT parameter.
service
Any valid Business::UPS mode (required). Example: 1DA,2DA,GNDCOM. Defaults to the mode name.
geo Location of field containing zip code. Default is 'zip'.
country_field
Location of field containing country code. Default is 'country'.
default_geo
The ZIP code to use if none supplied -- for defaulting shipping to some value in absence of ZIP. No default -- will return 0 and error
if no zip.
default_country
The country code to use if none supplied -- for defaulting shipping to some value in absence of country. Default US.
aggregate
If 1, aggregates by a call to weight=150 (or $Variable->{UPS_QUERY_MODULO}). Multiplies that times number necessary, then runs a call
for the remainder. In other words:
[ups-query weight=400 mode=GNDCOM aggregate=1]
is equivalent to:
[calc]
[ups-query weight=150 mode=GNDCOM] +
[ups-query weight=150 mode=GNDCOM] +
[ups-query weight=100 mode=GNDCOM];
[/calc]
If set to a number above 1, will be the modulo to do repeated calls by. So:
[ups-query weight=400 mode=GNDCOM aggregate=100]
is equivalent to:
[calc]
[ups-query weight=100 mode=GNDCOM] +
[ups-query weight=100 mode=GNDCOM] +
[ups-query weight=100 mode=GNDCOM] +
[ups-query weight=100 mode=GNDCOM];
[/calc]
To aggregate by 1, use .999999.
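The aggregate arithmetic above can be sketched in shell. query_ups here is a hypothetical stub standing in for the real per-call UPS lookup (it pretends cost equals weight so the arithmetic is visible); the real tag call is the [ups-query ...] shown above.

```shell
# Hypothetical stand-in for one UPS rate lookup; NOT the real API.
query_ups() { echo "$1"; }

weight=400
modulo=150
full=$(( weight / modulo ))   # number of full-weight calls (here: 2)
rest=$(( weight % modulo ))   # remainder for the final call (here: 100)

total=0
i=1
while [ "$i" -le "$full" ]; do
    total=$(( total + $(query_ups "$modulo") ))
    i=$(( i + 1 ))
done
if [ "$rest" -gt 0 ]; then
    total=$(( total + $(query_ups "$rest") ))
fi
echo "$total"
```

With the stub, 400 lb aggregated by 150 yields two calls at 150 plus one at 100, mirroring the [calc] expansion above.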
cache_table
Set to the name of a table (default ups_cache) which can cache the calls so repeated calls for the same values will not require
repeated calls to UPS.
Table needs to be set up with:
Database ups_cache ship/ups_cache.txt __SQLDSN__
Database ups_cache AUTO_SEQUENCE ups_cache_seq
Database ups_cache DEFAULT_TYPE varchar(12)
Database ups_cache INDEX weight origin zip shipmode country
And have the fields:
code weight origin zip country shipmode cost updated
Typical cached data will be like:
code weight origin zip country shipmode cost updated
14 11 45056 99501 US 2DA 35.14 1052704130
15 11 45056 99501 US 1DA 57.78 1052704130
16 11 45056 99501 US 2DA 35.14 1052704132
17 11 45056 99501 US 1DA 57.78 1052704133
Cache expires in one day.
perl v5.14.2 2010-03-25 Vend::Ship::QueryUPS(3pm)