Full Discussion: Subtracting with awk?
Shell Programming and Scripting, Post 302376859 by Scrutinizer on Wednesday 2nd of December 2009, 01:09:05 PM
What is the idea behind FS="" and RS="us"? This way you will only get single-digit fields, no?

Last edited by Scrutinizer; 12-02-2009 at 07:15 PM..
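
For anyone reading along, a minimal illustration of what FS="" does in gawk (the input string here is invented; an empty FS makes every character its own field, which is the behaviour being questioned above):

# An empty FS in gawk splits the record into one field per character,
# so every "field" is a single digit here.
echo '1234' | awk 'BEGIN { FS = "" } { print NF, $1, $2, $3, $4 }'
# output: 4 1 2 3 4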
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Subtracting date / timestamps

I have looked through the forums and found many date / time manipulation tools, but cannot seem to find something that fits my needs for the following. I have a log file with date time stamps like this: Jun 21 17:21:52 Jun 21 17:24:56 Jun 21 17:27:59 Jun 21 17:31:03 Jun 21 17:34:07 Jun... (0 Replies)
Discussion started by: roadcyclist
0 Replies
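
A minimal sketch for the timestamp-difference question above, assuming GNU date is available and that the stamps live in a file called log.txt (the file name and the same-year assumption are mine, not the poster's):

# Print the gap in seconds between consecutive "Mon DD HH:MM:SS" stamps.
prev=""
while read -r line; do
    cur=$(date -d "$line" +%s)              # stamp -> epoch seconds (GNU date)
    [ -n "$prev" ] && echo $(( cur - prev ))
    prev=$cur
done < log.txt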

2. Shell Programming and Scripting

subtracting 1.5 from column using awk and saving the changes

# foreach sub ( sub001 ) sub=sub001 cd /mnt/stor/smith/recog/$sub/event_files/timecorrected/ awk '{$1-1.5}' $sub_CR end What im trying to do is: 1. open a 1D file that consists of lists of integers in rows a columns 2. subtract 1.5 from each integer in the first column 3. save the file... (1 Reply)
Discussion started by: ac130pilot
1 Replies
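
For the column question above, a hedged sketch of the awk step; note that in csh the original $sub_CR is parsed as a variable named sub_CR, so braces are needed, and the temp-file rename is an assumption about how the poster wants to save the result:

# Subtract 1.5 from column 1 and write the change back to the file.
awk '{ $1 = $1 - 1.5; print }' "${sub}_CR" > tmpfile && mv tmpfile "${sub}_CR"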

3. Shell Programming and Scripting

Subtracting time with awk - BASH/Debian GNU Linux

I'm sure this is simple and I've been looking at examples for days on end but can't seem to come to grips with awk. What I have: mplayer -v dvd:// -identify -vo null -ao null -nolirc -nojoystick -frames 0 2>/dev/null >> /tmp/MplayerOut ChapterStart=($(grep CHAPTERS: /tmp/MplayerOut |sed... (3 Replies)
Discussion started by: rickenbacherus
3 Replies
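
A small sketch for the time-subtraction part of the mplayer question above, assuming the chapter starts are plain HH:MM:SS strings (the two sample values are invented):

# Convert two HH:MM:SS values to seconds and subtract them.
echo "00:05:30 00:02:15" | awk '{
    split($1, a, ":"); split($2, b, ":")
    print (a[1]*3600 + a[2]*60 + a[3]) - (b[1]*3600 + b[2]*60 + b[3])
}'
# prints 195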

4. Shell Programming and Scripting

Subtracting columns against each other

Hi All, I have a file of 100 lines of each having 1000 columns. I need to find the difference of each column against each other. That means, Col1-Col1; Col1-Col2; Col1-Col3;......Col1-Col1000; Col2-Col1; Col2-Col2; Col2-Col3;.... and so on ....up to Col1000-Col1000. Lets say the file is... (6 Replies)
Discussion started by: Fredrick
6 Replies
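
For the all-pairs column differences above, a brute-force awk sketch (it really does emit NF*NF values per input line, so with 1000 columns the output is very large):

# For every row, print Col_i - Col_j for each ordered pair (i, j).
awk '{
    for (i = 1; i <= NF; i++)
        for (j = 1; j <= NF; j++)
            printf "%s%s", $i - $j, (j < NF ? OFS : ORS)
}' file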

5. Shell Programming and Scripting

Subtracting each row from the first row in a single column file using awk

Hi Friends, I have a single column data like below. 1 2 3 4 5 I need the output like below. 0 1 2 3 4 where each row (including first row) subtracting from first row and the result should print below like the way shown in output file. Thanks Sid (11 Replies)
Discussion started by: ks_reddy
11 Replies
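
The single-column question above has a fairly standard one-liner answer; a sketch, assuming the data is in a file called file:

# Remember the first value, then subtract it from every row (the first row gives 0).
awk 'NR == 1 { first = $1 } { print $1 - first }' file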

6. Shell Programming and Scripting

Subtracting two dates in PERL

Hi guys, First of all, I would like to say this is my first post in the unix.com forums. I am a beginner in PERL and have only started writing my first scripts. With that out of the way, I have a question regarding the calculation of time dates in PERL. I have two scalar variables with the... (1 Reply)
Discussion started by: DiRNiS
1 Replies
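
The thread above asks specifically about Perl; purely as a point of comparison, the same date subtraction can be sketched in shell with GNU date (the two date strings here are invented):

# Difference between two dates in whole days, via epoch seconds (GNU date).
d1=$(date -d "2009-12-02 13:09:05" +%s)
d2=$(date -d "2009-11-30 07:15:00" +%s)
echo $(( (d1 - d2) / 86400 )) days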

7. Shell Programming and Scripting

Searching columns and subtracting values in awk

Hi everyone, I had a similar question a couple days ago but my problem has gotten significantly (to me anyway) more complex. I have two files: File 1: 0808 166 166 62 9 0 1000fights 1 1 2 1 0 100places2visit 2 2 2 2 0 10veronica91 167 167 3 1 0 11thgorgeous 346 346 3806 1461 122... (2 Replies)
Discussion started by: collards
2 Replies
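
The excerpt above is cut off, so the exact columns involved are unclear; as a loose sketch only, this is the usual two-file awk pattern for "look the key up in one file and subtract a value from the other" (which columns to use is a guess):

# Load file2's column 2 keyed by column 1, then subtract it from file1's column 2.
awk 'NR == FNR { val[$1] = $2; next }
     $1 in val { print $1, $2 - val[$1] }' file2 file1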

8. Shell Programming and Scripting

Subtracting values from variable

Legends, Please help me in , how do i subtract the variable values listed like below. the first value of orig should be subtracted from first value of prev and so on. san> echo $orig 346 316 340 239 410 107 291 139 128 230 167 147 159 159 172 116 110 260 177 0 177 169 168 186 165 366 195... (15 Replies)
Discussion started by: sdosanjh
15 Replies
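
For the pairwise variable subtraction above, a sketch that hands both space-separated lists to awk (the contents of prev are not shown in the excerpt, so it is assumed to hold the same number of values as orig):

# Subtract the lists element by element: orig[1]-prev[1], orig[2]-prev[2], ...
awk -v a="$orig" -v b="$prev" 'BEGIN {
    n = split(a, x, " "); split(b, y, " ")
    for (i = 1; i <= n; i++) print x[i] - y[i]
}'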

9. Answers to Frequently Asked Questions

Subtracting two files

Hi, I want to subtract 2 files and save the remaining text in another file. Lets say, Hello Happy // Hi * Hungry File2 Happy Hi Output Hello (5 Replies)
Discussion started by: beginner_99
5 Replies
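
For the "subtract file2 from file1" question above, a common awk sketch (the output file name is made up):

# Keep only the lines of file1 that never appear in file2.
awk 'NR == FNR { seen[$0]; next } !($0 in seen)' file2 file1 > remaining.txt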

10. Shell Programming and Scripting

awk - Adding and Subtracting Numbers from 2 Columns

Hi Folks, I have a file with 2 columns TAB delimited and I want to add '1' to the first column and subtract '-1' from the second column. What I have tried so far is; awk -F"\t" '{ $1-=1;$2+=1}1' OFS='\t' file File 0623 0623 0624 0624 0643 0643 1059 1037 1037 1037 1038 1038... (2 Replies)
Discussion started by: pshields1984
2 Replies
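
The attempt quoted above has the signs the wrong way round for the stated goal (add 1 to column 1, subtract 1 from column 2); a corrected sketch, with the caveat that plain arithmetic drops the leading zeros shown in the sample data:

# Add 1 to column 1 and subtract 1 from column 2, keeping TAB as the separator.
awk -F'\t' -v OFS='\t' '{ $1 += 1; $2 -= 1 } 1' file
# To keep 4-digit zero padding instead: printf "%04d\t%04d\n", $1 + 1, $2 - 1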