awk - calculation of probability density
Post 302439012 by jim mcnamara on Wednesday 21st of July 2010 11:13:41 AM
It would work as is, if you want to group by values like 2.31313 and 2.31314, which may not be very useful - it depends on the analysis you need to do. Otherwise you want to round to fewer decimal places, e.g., 2.31313 -> 2.31:
Code:
awk '{h[sprintf("%.2f",$1) " " sprintf("%.2f",$2)]++}END{for (i in h){print i,h[i]/NR}}' infile

sprintf("%.2f", number) rounds a real to 2 decimal places.
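Written out as a standalone script, the same logic reads more easily (this is only a restatement of the one-liner above, nothing new):

Code:
# density.awk - empirical probability of (col1, col2) pairs, binned to 2 decimals
# Usage: awk -f density.awk infile
{
    # Round both columns to 2 decimal places; the rounded pair is the bin key
    h[sprintf("%.2f %.2f", $1, $2)]++
}
END {
    # Each bin's count over the total record count is its relative frequency
    for (i in h)
        print i, h[i] / NR
}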
 

10 More Discussions You Might Find Interesting

1. Programming

Calculate scores and probability -- Syntax issue

Hi, I am totally new to C programming in a Sun Solaris environment. I am an active member of the UNIX forum and a good shell programmer. I am trying to perform some calculations in C. I have the pseudo code written down but don't know the syntax. I am reading a couple of books on C... (4 Replies)
Discussion started by: madhunk

2. Shell Programming and Scripting

awk calculation

Hello all, I have a script which creates the output below: root@a7germ:/tmp/pax > cat 20061117.txt 523.047 521.273 521.034 517.367 516.553 517.793 513.114 513.940 I would like to use awk to calculate (a) the total sum of the numbers and (b) the average of the numbers. Please... (4 Replies)
Discussion started by: kekanap
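For reference, a sketch along the lines the poster asks for (not taken from the thread itself; the file name is the one shown in the post):

Code:
awk '{ sum += $1 } END { if (NR > 0) printf "sum = %.3f, average = %.3f\n", sum, sum / NR }' 20061117.txt

The NR > 0 guard avoids a divide-by-zero on an empty file.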

3. UNIX for Advanced & Expert Users

Reattempts Calculation using awk

Dear All, how are you? I have files which look like this: 20080406_12:43:55.779 ISC Sprint- 39 21624032999 218925866728 20080406_12:44:07.811 ISC Sprint- 20 21620241815 218927736810 20080406_12:44:00.485 ISC Sprint- 50 21621910404 218913568053... (0 Replies)
Discussion started by: zanetti321
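The thread received no replies, but one plausible reading of "reattempts" is counting how often the same calling/called number pair recurs. A hedged sketch, assuming the calling number is field 5 and the called number is field 6 of the sample lines:

Code:
# Count records per (calling, called) pair; pairs seen more than once are reattempts
awk '{ seen[$5 " " $6]++ }
     END { for (p in seen) if (seen[p] > 1) print p, seen[p] - 1, "reattempt(s)" }' infile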

4. Shell Programming and Scripting

awk calculation problem

I have a list of coordinate data, sampled below. 54555209 784672723 I want it as: 545552.09 7846727.23 Below is my script: BEGIN {FS= " "; OFS= ","} {print $1*.01,$2*.01} This is my outcome: 5.5e7 7.8e8 How do I tell awk that I want to keep all the digits instead of outputting... (1 Reply)
Discussion started by: ndnkyd
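The 5.5e7 comes from awk's default output conversion for non-integral numbers (OFMT, which defaults to %.6g). Forcing a fixed-point conversion with printf keeps all the digits; a minimal sketch reusing the poster's field setup:

Code:
BEGIN { FS = " "; OFS = "," }
{ printf "%.2f%s%.2f\n", $1 * 0.01, OFS, $2 * 0.01 }

Setting OFMT = "%.2f" in the BEGIN block and keeping the original print statement would also work.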

5. Solaris

newfs -i: where to look for the changed inode density

Hi All, while creating a UFS file system with newfs -i, where can I see the change? That is, if the inode density has been increased, where is it reflected? I tried fstyp -v <slice> but am not sure where to look for the information. Will appreciate if I can get... (0 Replies)
Discussion started by: kumarmani

6. Shell Programming and Scripting

Calculation in Multiple files using awk

Hi All, I have 10 files named samp1.csv, samp2.csv, ... samp10.csv. Each file has the same fields: Count, field1, field2, field3. There is also a source.csv file which has three fields: field1, field2, field3. Now, I want to find the total count by taking the field1,... (8 Replies)
Discussion started by: johnwilliams.sp
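A sketch of the grouping step, assuming comma-separated files with a header row in each, and that the goal is the total Count per (field1, field2, field3) combination; only the file and column names come from the post, the rest is assumed:

Code:
awk -F, '
FNR == 1 { next }                 # skip each file's header row (assumed present)
{ total[$2 FS $3 FS $4] += $1 }   # key on field1,field2,field3 and sum Count
END { for (k in total) print total[k] FS k }
' samp*.csv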

7. Programming

arithmetic calculation using awk

Hi there again, I need to do a simple division on my data, which has a number of rows. I would like simple output like this one: col1 col2 col3 val1 val2 val1/val2 valn valm valn/valm Any suggestion is very much appreciated. Thanks much. (2 Replies)
Discussion started by: ida1215
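This is a one-liner; a sketch with a zero-divisor guard (the column layout is taken from the desired output in the post):

Code:
awk '{ print $1, $2, ($2 != 0 ? $1 / $2 : "N/A") }' infile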

8. Programming

awk script for finding probability of distribution of numbers

Dear All, I have a data file containing entries from 0 to 40,000 like this... 0 5 1 65 2 159 3 356 ... ... 40000 19 I want to find the probability distribution of the numbers. The second-column values are angles from 0 to 360 and the first column is the number of files. I am expecting... (2 Replies)
Discussion started by: bala06
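The column description in the post is ambiguous, but assuming each line pairs a value (column 1) with its observed count (column 2), the empirical probability of a value is its count divided by the grand total:

Code:
awk '{ count[$1] = $2; total += $2 }
     END { if (total > 0) for (v in count) print v, count[v] / total }' datafile

Here datafile and the column roles are assumptions, not details from the thread.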

9. Shell Programming and Scripting

awk split and awk calculation in the same command

I am trying to run the awk below. My question is: when I split the input, then run another awk to perform a calculation using that split as the input, there are no issues. When I try to combine them, the output is not correct. Is the split not working, or did I do it wrong? Thank you :). input ... (8 Replies)
Discussion started by: cmccabe
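The excerpt does not show the original commands, but the usual fix is to call split() inside the single awk program instead of piping one awk into another, so the calculation operates on the pieces directly. A generic sketch with a hypothetical delimiter and field positions:

Code:
{
    n = split($3, part, ":")          # break field 3 on ":" (assumed layout)
    if (n >= 2 && part[2] != 0)
        print $1, part[1] / part[2]   # calculate on the pieces in the same pass
}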

10. Shell Programming and Scripting

awk calculation with zero as N/A

In the below awk, I am trying to calculate a percent for a given id. It is very close; the problem is when the number being used in the calculation is zero. I am not sure how to code this condition into the awk, as it happens frequently. The portion in italics was an attempt, but that led to an error. Thank... (13 Replies)
Discussion started by: cmccabe
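The usual remedy is a conditional expression that emits N/A when the denominator is zero rather than dividing by it. A sketch, since the poster's awk is not shown; the field positions here are assumed:

Code:
awk '{ printf "%s\t%s\n", $1, ($3 == 0 ? "N/A" : sprintf("%.1f%%", 100 * $2 / $3)) }' infile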