Could someone tell me how to perform calculations using numbers greater than 2150000000 in Korn Shell? When I tried, it gave me the wrong answer.
e.g. I have a ksh file with the contents below:
---------------------------------
#!/bin/ksh
SUM=`expr 2150000000 + 2`
PRODUCT=`expr... (3 Replies)
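The likely cause is that `expr` on many systems works with 32-bit integers and silently wraps past 2147483647. A minimal sketch of a workaround, assuming ksh93 (or bash), where the shell's own `$(( ))` arithmetic is 64-bit:

```shell
#!/bin/ksh
# expr often uses 32-bit integers and wraps past 2147483647.
# Shell arithmetic in ksh93 is 64-bit, so $(( )) handles these values.
SUM=$((2150000000 + 2))        # 2150000002, not a wrapped negative number
PRODUCT=$((2150000000 * 2))    # 4300000000
echo "$SUM $PRODUCT"
```

For values beyond 64 bits, an arbitrary-precision tool such as `bc` would be needed instead.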
I have a large CSV files (e.g. 2 million records) and am hoping to do one of two things. I have been trying to use awk and sed but am a newbie and can't figure out how to get it to work. Any help you could offer would be greatly appreciated - I'm stuck trying to remove the colon and wildcards in... (6 Replies)
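The thread is truncated, so the exact records are unknown; as an illustration with a hypothetical sample line, a single `sed` bracket expression can delete colons and the literal wildcard characters `*` and `?` from every record:

```shell
# Hypothetical sample record; the real CSV layout is not shown above.
# Inside [...], * ? and : are all literal characters, so one
# substitution deletes every occurrence of any of them.
printf 'id:1,na*me,val?ue\n' | sed 's/[*?:]//g'
# -> id1,name,value
```

On a 2-million-record file this streams line by line, so memory use stays constant.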
Hi
My input file looks like
field1 field2 field3 field4 field5
field1 field2 field3 field4 field5
field1 field2 field3 field4 field5
::::::::::::
::::::::::::
There may be one space or multiple spaces between fields, and no field contains spaces within it.
If field 1 to 4 are equal for... (3 Replies)
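The request is cut off, so the intended action on matching lines is an assumption; if the goal is to keep only the first line for each distinct combination of fields 1-4, awk's default whitespace splitting (which collapses runs of spaces) makes this a one-liner:

```shell
# A minimal sketch, assuming one output line per unique fields-1-to-4 key.
# awk splits on any run of blanks, so single vs. multiple spaces both work.
printf 'a b c d x\na  b c d y\ne f g h z\n' |
awk '!seen[$1, $2, $3, $4]++'
```

The array index `$1, $2, $3, $4` builds a compound key, and `!seen[...]++` is true only the first time that key appears.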
I am trying to add a third column, which I call "Math", to this file; it should perform some math calculation based on the value found in column #2.
Here is the input file:
Here is the desired output:
Output
GERk0203078$ Levir Math
Cotete_1... (5 Replies)
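The thread does not show which calculation is wanted, so the formula below is a placeholder (doubling column 2); the pattern of appending a computed field with awk is the same for any expression:

```shell
# Placeholder math: append column 2 doubled as a new third column.
# Swap "$2 * 2" for whatever calculation the real task requires.
printf 'GERk0203078 7\nCotete_1 4\n' |
awk '{print $1, $2, $2 * 2}'
```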
Dear All,
I need your help managing a research data file.
For example, I have
data1.txt :
type of atoms z vz
Si 34 54
O 20 56
H 14 13
Si 40 17
O ... (11 Replies)
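The desired processing is not fully shown; one plausible reading, sketched here as an assumption, is aggregating the vz column per atom type with an awk associative array (here, the average vz for each type):

```shell
# Assumed task: average vz (column 3) per atom type (column 1).
# sum[] accumulates per-type totals; n[] counts rows per type.
printf 'Si 34 54\nO 20 56\nH 14 13\nSi 40 17\n' |
awk '{sum[$1] += $3; n[$1]++} END {for (t in sum) print t, sum[t] / n[t]}'
```

Note that `for (t in sum)` visits keys in unspecified order; pipe through `sort` if ordering matters.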
Hi all!
I have a data set in this tab separated format : Label, Value1, Value2
An instance is "data.txt" :
0 1 1
-1 2 3
0 2 2
I would like to parse this data set and generate two files, one that has only data with the label 0 and the other with label -1, so my outputs should be, for... (1 Reply)
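For the two-file split described above, awk can route each line by the value of column 1 using its own output redirection; a sketch with the sample data shown (the output filenames are my choice):

```shell
# Split tab-separated data.txt into two files by the label in column 1.
# Filenames label0.txt / label-1.txt are illustrative, not from the thread.
printf '0\t1\t1\n-1\t2\t3\n0\t2\t2\n' > data.txt
awk -F'\t' '$1 == 0 {print > "label0.txt"} $1 == -1 {print > "label-1.txt"}' data.txt
```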
Need your help in solving this puzzle. Any kind of help will be appreciated and link for any documents to read and learn and to deal with such scenarios would be helpful
Concatenate column1 and column2 of file 1. Then check for the concatenated value in Column1 of File2. If found extract the... (14 Replies)
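The standard awk idiom for this two-file lookup is `NR == FNR`, which is true only while the first file is being read; a sketch under the assumption that both files are space-separated and the truncated action is "print the matching file2 line":

```shell
# Pass 1 (NR == FNR): store file1's col1+col2 concatenations as keys.
# Pass 2: print file2 lines whose column 1 matches a stored key.
printf 'AB 123 x\nCD 456 y\n' > file1
printf 'AB123 keep\nZZ999 drop\n' > file2
awk 'NR == FNR {want[$1 $2]; next} ($1 in want)' file1 file2
# -> AB123 keep
```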
Please help me get the required output for both scenario 1 and scenario 2; I need separate code for each scenario.
Scenario 1
I need to make the changes below only when column1 is CR and column3 has duplicate rows/values. This input file can contain 100 of these duplicated rows of... (1 Reply)
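The required change is cut off above, so this is only an illustration of the stated condition: act on a row only when column 1 is CR and its column-3 value repeats. Here the assumed action is keeping the first CR row per column-3 value while passing all other rows through unchanged:

```shell
# Keep the first CR row per column-3 value; non-CR rows always pass.
# The || short-circuits, so seen[] only counts CR rows.
printf 'CR a 100\nCR b 100\nDB c 100\nCR d 200\n' |
awk '$1 != "CR" || !seen[$3]++'
```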
Hi Folks,
I'm trying to gain further experience with shell programming and have set myself a small goal of writing a little filesystem monitoring script. So far my output is as follows:
PACMYDB03
Filesystem Size Used Avail Use% Status
/usr/local/mysql/data ... (5 Replies)
Discussion started by: Axleuk
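A small df-based sketch in the same spirit as the monitoring script above: derive a Status column by comparing each filesystem's Use% against a threshold (the threshold value and OK/ALERT labels are my own choices):

```shell
# Print mount point, Use%, and an OK/ALERT status per filesystem.
# df -P forces one line per filesystem (POSIX output format).
THRESHOLD=90
df -P | awk -v t="$THRESHOLD" 'NR > 1 {
    use = $5
    sub(/%/, "", use)                      # strip the % sign
    status = (use + 0 >= t) ? "ALERT" : "OK"
    print $6, $5, status
}'
```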
LEARN ABOUT CENTOS
column
COLUMN(1) User Commands COLUMN(1)
NAME
column - columnate lists
SYNOPSIS
column [options] file...
DESCRIPTION
The column utility formats its input into multiple columns. Rows are filled before columns. Input is taken from file or, by default, from
standard input. Empty lines are ignored.
OPTIONS
-c, --columns width
Output is formatted to a width specified as number of characters.
-t, --table
Determine the number of columns the input contains and create a table. Columns are delimited with whitespace, by default, or with
the characters supplied using the separator. Table output is useful for pretty-printing.
-s, --separator separators
Specify possible table delimiters (default is whitespace).
-o, --output-separator separators
Specify table output delimiter (default is two whitespaces).
-x, --fillrows
Fill columns before filling rows.
-h, --help
Print help and exit.
ENVIRONMENT
The environment variable COLUMNS is used to determine the size of the screen if no other information is available.
EXAMPLES
sed 's/#.*//' /etc/fstab | column -t
BUGS
The util-linux version 2.23 changed -s option to be non-greedy, for example:
$ printf "a:b:c\n1::3\n" | column -t -s ':'
old output:
a  b  c
1  3
new output (since util-linux 2.23):
a  b  c
1     3
SEE ALSO
colrm(1), ls(1), paste(1), sort(1)
HISTORY
The column command appeared in 4.3BSD-Reno.
AVAILABILITY
The column command is part of the util-linux package and is available from ftp://ftp.kernel.org/pub/linux/utils/util-linux/.
util-linux October 2010 COLUMN(1)