Full Discussion: Correlator in awk
Post 302415743 by jarowit in Shell Programming and Scripting, Friday 23 April 2010, 08:45 AM
It will be the following sum, which I would like to have as the output:
C(2) = (1/NR) [ E(1)E(3) + E(2)E(4) + E(3)E(5) + ... + E(NR-2)E(NR) ]

so it is a sum of selectively chosen products of the values in the second column

Thank you for your response
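
For reference, here is a minimal awk sketch of that lag-2 correlator, assuming the E values sit in the second column of the data file (the script and file names are illustrative, not from the thread):

  # correlator.awk -- C(2) = (1/NR) * [ E(1)E(3) + E(2)E(4) + ... + E(NR-2)E(NR) ]
  { E[NR] = $2 }                      # remember every value from the second column
  END {
      lag = 2
      for (i = 1; i <= NR - lag; i++)
          sum += E[i] * E[i + lag]    # product of entries separated by the lag
      if (NR > 0)
          print sum / NR              # normalise by the total number of records
  }

Run it as: awk -f correlator.awk datafile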
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Awk problem: how to express the single quote (') using the awk print function

Actually I have a list of files ending with *.txt, and I want to apply the same command to all of them. So I am trying to find the fastest way to put those commands in a script and let them run automatically. For example, I have the files below: file1.txt file2.txt file3.txt... (4 Replies)
Discussion started by: patrick87
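(Not from the thread, just a sketch of one common way to print a literal single quote from awk, using the octal escape \047 inside the program text, or by closing and reopening the shell's quotes:)

  awk 'BEGIN { print "\047" "hello" "\047" }'      # prints 'hello'
  awk 'BEGIN { print "'\''hello'\''" }'            # same result, escaped at the shell level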

2. Shell Programming and Scripting

scripting/awk help: awk sum output is not coming out in a regular format. Please advise.

Hi Experts, I am adding a column of numbers with awk, however I am not getting the correct output: # awk '{sum+=$1} END {print sum}' datafile 2.15291e+06 How can I get the output like: 2152910 Thank you.. # awk '{sum+=$1} END {print sum}' datafile 2.15079e+06 (3 Replies)
Discussion started by: rveri
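(A common fix, sketched here rather than taken from the thread: print the total with printf so large sums are not rendered in exponential notation:)

  awk '{ sum += $1 } END { printf "%.0f\n", sum }' datafile     # e.g. 2152910 instead of 2.15291e+06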

3. Shell Programming and Scripting

awk: a variable assigned with -v doesn't work in an awk filter

I want to filter on the 2nd column = 2 using awk $ cat t 1 2 2 4 $ VAR=2 # the variable works in print $ cat t | awk -v ID=$VAR ' { print ID}' 2 2 # but the variable doesn't work in the awk filter $ cat t | awk -v ID=$VAR '$2~/ID/ { print $0}' (2 Replies)
Discussion started by: honglus
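(A sketch of the usual explanation: inside /ID/ the text ID is a literal regular expression, so the variable is never expanded; compare against the variable directly instead:)

  VAR=2
  awk -v ID="$VAR" '$2 == ID' t      # exact match on the second column
  awk -v ID="$VAR" '$2 ~ ID' t       # or treat the variable's value as a regex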

4. Shell Programming and Scripting

Problem with awk: "awk: program limit exceeded: sprintf buffer size=1020"

Hi, I have many problems with a script. I have a script that formats a text file, but it always prints the same error when I try to execute it. The code is this: { if (NF==17){ print $0 }else{ fields=NF; all=$0; while... (2 Replies)
Discussion started by: fate

5. Shell Programming and Scripting

Comparison and editing of files using awk (and also a possible bug in the awk for loop?)

I have two files which I would like to compare and then manipulate in a way. File1: pictures.txt 1.1 1.3 dance.txt 1.2 1.4 treehouse.txt 1.3 1.5 File2: pictures.txt 1.5 ref2313 1.4 ref2345 1.3 ref5432 1.2 ref4244 dance.txt 1.6 ref2342 1.5 ref2352 1.4 ref0695 1.3 ref5738 1.2... (1 Reply)
Discussion started by: linuxkid

6. UNIX for Dummies Questions & Answers

any one used SEC - Simple Event Correlator

Hi, I am trying to configure SEC - Simple Event Correlator to analyze logs. Just wondering if any of you have used this before? If yes, can you help by showing a few of the rules you created? Thanks. (1 Reply)
Discussion started by: samnyc

7. Shell Programming and Scripting

awk command to compare a file with a set of files in a directory

Hi, I have a situation where I need to compare one file, say file1.txt, with a set of files in a directory. The directory contains more than 100 files. To be more precise, the requirement is to compare the first field of file1.txt with the first field in all the files in the directory. The files in the... (10 Replies)
Discussion started by: anandek
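(A hedged sketch, since the full requirement is cut off above: load the first fields of file1.txt into an array, then report matching first fields in every other file; the directory name is illustrative:)

  for f in somedir/*; do
      awk 'NR == FNR { seen[$1]; next }            # first pass: keys from file1.txt
           $1 in seen { print FILENAME ": " $0 }   # second pass: lines whose first field matches
      ' file1.txt "$f"
  done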

8. Shell Programming and Scripting

Help with an awk one-liner: need an if condition inside awk to check an array variable

Hello experts, I'm stuck with this script for three days now. Here's what I need: I need to split a large delimited (,) file into 2 files based on the value present in the last field. Sample: Something.csv bca,adc,asdf,123,12C bca,adc,asdf,123,13C def,adc,asdf,123,12A I need this split... (6 Replies)
Discussion started by: shell_boy23
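(A sketch under an assumed rule, since the exact criterion is cut off above: split on whether the comma-separated last field ends in "C"; the output file names are made up:)

  awk -F',' '{ print > (($NF ~ /C$/) ? "last_field_C.csv" : "last_field_other.csv") }' Something.csv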

9. Shell Programming and Scripting

Passing an awk variable as an argument to a script called from inside awk

Consider the script below: sh /opt/hqe/hqapi1-client-5.0.0/bin/hqapi.sh alert list --host=localhost --port=7443 --user=hqadmin --password=hqadmin --secure=true >/tmp/alerts.xml awk -F'' '{for(i=1;i<=NF;i++){ if($i=="Alert id") { if(id!="") if(dt!=""){ cmd="sh someScript.sh... (2 Replies)
Discussion started by: vivek d r
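(A minimal sketch of one way to hand an awk variable to an external script, using system() and quoting the value for the shell; someScript.sh and /tmp/alerts.xml come from the excerpt, the field used is illustrative:)

  awk '{ id = $1
         cmd = "sh someScript.sh \047" id "\047"    # build the command line, value single-quoted
         system(cmd) }' /tmp/alerts.xml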

10. Shell Programming and Scripting

awk output yields error: "awk: can't open job_name" (Autosys)

Good evening, I'm a newbie at Unix, especially with awk. From a scheduler program called Autosys I want to extract some data, reading an input file that contains job names, then formatting the output into columns, as in example 1. This is the input file: $ more MapaRep.txt ds_extra_nikira_usuarios... (18 Replies)
Discussion started by: alexcol
TOTAL(1)						      General Commands Manual							  TOTAL(1)

NAME
       total - sum up columns

SYNOPSIS
       total [ -m ][ -sE | -p | -u | -l ][ -i{f|d}[N] ][ -o{f|d} ][ -tC ][ -N [ -r ]] [ file .. ]

DESCRIPTION
       Total sums up columns of real numbers from one or more files and prints out the result on its standard output.

       By default, total computes the straight sum of each input column, but multiplication can be specified instead with the -p option. Likewise, the -u option means find the upper limit (maximum), and -l means find the lower limit (minimum). Sums of powers can be computed by giving an exponent with the -s option. (Note that there is no space between the -s and the exponent.) This exponent can be any real number, positive or negative. The absolute value of the input is always taken before the power is computed in order to avoid complex results. Thus, -s1 will produce a sum of absolute values. The default power (zero) is interpreted as a straight sum without taking absolute values.

       The -m option can be used to compute the mean rather than the total. For sums, the arithmetic mean is computed. For products, the geometric mean is computed. (A logarithmic sum of absolute values is used to avoid overflow, and zero values are silently ignored.)

       If the input data is binary, the -id or -if option may be given for 64-bit double or 32-bit float values, respectively. Either option may be followed immediately by an optional count, which defaults to 1, indicating the number of double or float binary values to read per record on the input file. (There can be no space between the option and this count.) Similarly, the -od and -of options specify binary double or float output, respectively. These options do not need a count, as this will be determined by the number of input channels.

       A count can be given as the number of lines to read before computing a result. Normally, total reads each file to its end before producing its result, but this behavior may be overridden by inserting blank lines in the input. For each blank input line, total produces a result as if the end-of-file had been reached. If two blank lines immediately follow each other, total closes the file and proceeds to the next one (after reporting the result).

       The -N option (where N is a decimal integer) tells total to produce a result and reset the calculation after every N input lines. In addition, the -r option can be specified to override reinitialization and thus give a running total every N lines (or every blank line). If the end of file is reached, the current total is printed and the calculation is reset before the next file (with or without the -r option).

       The -tC option can be used to specify the input and output tab character. The default tab character is TAB.

       If no files are given, the standard input is read.

EXAMPLE
       To compute the RMS value of colon-separated columns in a file:

              total -t: -m -s2 input

       To produce a running product of values from a file:

              total -p -1 -r input

BUGS
       If the input files have varying numbers of columns, mean values will certainly be off. Total will ignore missing column entries if the tab separator is a non-white character, but cannot tell where a missing column should have been if the tab character is white.

AUTHOR
       Greg Ward

SEE ALSO
       cnt(1), neaten(1), rcalc(1), rlam(1), tabfunc(1)

RADIANCE                                                      2/3/95                                                           TOTAL(1)