Shell Programming and Scripting: Enter third column & Perform Operation
Post 302530282 by Corona688, Monday 13 June 2011, 11:47 AM
Quote:
Originally Posted by Ernst
This is what I keep on getting when I try it.

Code:
awk: syntax error near line 1
awk: bailing out near line 1

Same issue as with the previous response.
Did you copy-paste what I gave you literally? The adaptation you tried last time needed a lot of work. Please show what you actually did.

It'd also help to know what your system is.

I have no idea why it works with nawk but not awk. I tried it on the 'worst' version of awk I have available myself; as far as I know I did nothing that needed nawk.
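For reference, on systems where /usr/bin/awk is the historical pre-nawk awk (Solaris is the usual culprit), built-ins such as gsub() or user-defined functions produce exactly that "syntax error near line 1 ... bailing out" message. A quick way to check, sketched with a made-up one-liner:

Code:
# /usr/bin/awk on Solaris is the old awk and rejects newer built-ins;
# the same one-liner run through nawk or the POSIX awk works fine.
echo 'a,b,c' | awk '{ gsub(/,/, " "); print }'                # old awk: syntax error
echo 'a,b,c' | nawk '{ gsub(/,/, " "); print }'               # prints: a b c
echo 'a,b,c' | /usr/xpg4/bin/awk '{ gsub(/,/, " "); print }'  # prints: a b c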
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

How to perform arithmetic operation on date

Hi all, I would appreciate it if anyone knows how to perform addition on a date. For a normal date, I can easily add any number, but when it comes to a month end, say 28 Jun, adding 3 will not return 1 Jul. Thanks in advance for your help. (4 Replies)
Discussion started by: agathaeleanor
4 Replies
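For date arithmetic like the rollover described above, GNU date handles it directly; a minimal sketch (assuming GNU date is available; BSD date would need date -v+3d instead):

Code:
# adding 3 days to a month-end date rolls over into the next month
date -d "2011-06-28 + 3 days" +%Y-%m-%d    # prints 2011-07-01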

2. Shell Programming and Scripting

Perform some operation on a specific column starting from a specific line

I have a txt file with rows and columns, and I want to perform some operation on a specific column starting from a specific line, e.g.: 50.000000 1 1 1 1000.00000 1000.00000 50000.000 19 19 3.69797533E-07 871.66394 ... (3 Replies)
Discussion started by: shashi792
3 Replies
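A small awk sketch of that idea, assuming the operation should start at line 5 and apply to column 3 (the line number, the column, the operation, and the name file.txt are all placeholders):

Code:
# double field 3 on every line from line 5 onward, printing all lines
awk 'NR >= 5 { $3 = $3 * 2 } { print }' file.txt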

3. Shell Programming and Scripting

Column operation: cosine and sine operation

I have a txt file with several columns and I want to perform an operation on two columns and output it to a new txt file. file.txt 900.00000 1 1 1 500.00000 500.00000 100000.000 4 4 1.45257346E-07 899.10834 ... (4 Replies)
Discussion started by: shashi792
4 Replies
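awk has built-in sin() and cos() (arguments in radians), so a sketch of the requested column operation could look like this (which columns to use and the file names are assumptions):

Code:
# append cos of column 1 and sin of column 2 to each line of file.txt
awk '{ printf "%s %.8f %.8f\n", $0, cos($1), sin($2) }' file.txt > newfile.txt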

4. Shell Programming and Scripting

[Solved] Perform an operation to all directories

Sorry about this thread - I solved my own problem! Thanks for taking a look. edit by bakunin: no problem, but it would have been a nice touch to actually tell us what the solution was. That would have been slightly more educational than just knowing that you found it. I changed your title to... (0 Replies)
Discussion started by: Blue Solo
0 Replies
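Since the solution was never posted, here is only a generic sketch of the usual pattern for applying an operation to every subdirectory (the echo is a placeholder for the real command):

Code:
# run a command inside each immediate subdirectory; the subshell keeps the cd local
for d in */; do
    ( cd "$d" && echo "processing $d" )
done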

5. Shell Programming and Scripting

Help me to perform count & group by operation in shell scripting?

Hi All, I want to display the distinct values in the file and, for each distinct value, how many occurrences there are. Test data: test1.dat 20121105 20121105 20121105 20121105 20121106 20121106 20121106 20121105 I need to display the output like Output (2 Replies)
Discussion started by: bbc17484
2 Replies
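For this kind of count-and-group-by, either sort | uniq -c or an awk array does the job; a sketch using the test1.dat name from the post:

Code:
# occurrence count per distinct value
sort test1.dat | uniq -c
# same idea in awk, one value per input line
awk '{ count[$0]++ } END { for (v in count) print v, count[v] }' test1.dat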

6. Homework & Coursework Questions

Using dbms_pipe with C++ to perform database operation

I am getting two results, a string and an int, in C++ code that I want to store into a database. The request which generates the result is very frequent, so performing a DB operation each time to store the result is costly for me. How can this be achieved using dbms_sql? I don't have any experience and how... (1 Reply)
Discussion started by: karimkhan
1 Replies

7. Shell Programming and Scripting

How To Perform Mathematical Operation Within If in awk?

Hi All, I am using an awk script as below: awk -F'|' 'BEGIN{OFS="|";} { if ($1==$3 && $3==$7 && $7==$13 && $2==$6 && $6==$11 && $15-$14+1==$11) print $0"|""TRUE"; else print $0"|""FALSE"; }' tempfile.txt In the above script, all conditions are being checked except the one which is... (4 Replies)
Discussion started by: angshuman
4 Replies
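As the snippet above shows, arithmetic can be written directly inside an awk condition; a stripped-down sketch of the same pattern (the field numbers and the name data.txt are illustrative only):

Code:
# flag lines where field 3 - field 2 + 1 equals field 4
awk -F'|' -v OFS='|' '{ print $0, ($3 - $2 + 1 == $4 ? "TRUE" : "FALSE") }' data.txt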

8. Shell Programming and Scripting

Filter on one column and then perform conditional calculations on another column with a Linux script

Hi, I have a file (stats.txt) with columns like in the example below. Destination IP address, timestamp, TCP packet sequence number and packet length. destIP time seqNo packetLength 1.2.3.4 0.01 123 500 1.2.3.5 0.03 44 1500 1.3.2.5 0.08 44 1500 1.2.3.4 0.44... (12 Replies)
Discussion started by: Zooma
12 Replies
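A hedged sketch of the filter-then-calculate shape, assuming the goal is to total packetLength for one destination address (the address and stats.txt come from the example above; summing is an assumption about the calculation):

Code:
# sum column 4 (packetLength) for rows whose column 1 (destIP) is 1.2.3.4
awk '$1 == "1.2.3.4" { total += $4 } END { print "total bytes:", total + 0 }' stats.txt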

9. Shell Programming and Scripting

awk script to find data in three files and perform replace operation

I have three files. Any other approach with regard to file concatenation or splitting, etc. is appreciated. If column55 (billngtype) of file1 contains YMNC or YPBC, then pick the value of column13 (documentnumber). Now find this documentnumber in column1 (Billdoc) of file2 and grep the corresponding... (4 Replies)
Discussion started by: as7951
4 Replies
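Without the full column layout this is only a sketch, but the usual approach to this kind of cross-file lookup is to read the smaller file into an array first (the field numbers here are illustrative, not the real ones from the post):

Code:
# pass 1 (file2): remember every Billdoc value; pass 2 (file1): print rows whose
# document number (field 2 in this sketch) appears in that list
awk 'NR == FNR { seen[$1] = 1; next }
     $2 in seen { print $0, "MATCHED" }' file2 file1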

10. UNIX for Beginners Questions & Answers

Copy last few lines of a file, perform math operation and iterate further

Hi, I am trying to generate a data of following order: 4 0 1 642 643 4 642 643 1283 1284 4 1283 1284 1924 1925 4 1924 1925 2565 2566 4 2565 2566 3206 3207 4 3206 3207 3847 3848 4 3847 3848 4488 4489 4 4488 4489 5129 5130 ---------------------- 4 1 2 643 644 4 643 644 1284... (6 Replies)
Discussion started by: SaPa
6 Replies
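The exact increment pattern is cut off above, but the "copy the last few lines, apply a math operation, append, repeat" idea from the title can be sketched as follows (the block size of 8, the +1 offset, and the name mesh.txt are assumptions):

Code:
# three iterations: take the last 8 lines, add 1 to columns 2-5, append the result
for i in 1 2 3; do
    tail -n 8 mesh.txt | awk '{ print $1, $2 + 1, $3 + 1, $4 + 1, $5 + 1 }' > block.tmp
    cat block.tmp >> mesh.txt
done
rm -f block.tmp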
WCWIDTH(3)                BSD Library Functions Manual                WCWIDTH(3)

NAME
     wcwidth -- number of column positions of a wide-character code

LIBRARY
     Standard C Library (libc, -lc)

SYNOPSIS
     #include <wchar.h>

     int
     wcwidth(wchar_t wc);

DESCRIPTION
     The wcwidth() function determines the number of column positions required to display the wide character wc.

RETURN VALUES
     The wcwidth() function returns 0 if the wc argument is a null wide character (L'\0'), -1 if wc is not printable, and otherwise the number of column positions the character occupies.

EXAMPLES
     This code fragment reads text from standard input and breaks lines that are more than 20 column positions wide, similar to the fold(1) utility:

           wint_t ch;
           int column, w;

           column = 0;
           while ((ch = getwchar()) != WEOF) {
                   w = wcwidth(ch);
                   if (w > 0 && column + w >= 20) {
                           putwchar(L'\n');
                           column = 0;
                   }
                   putwchar(ch);
                   if (ch == L'\n')
                           column = 0;
                   else if (w > 0)
                           column += w;
           }

SEE ALSO
     iswprint(3), wcswidth(3)

STANDARDS
     The wcwidth() function conforms to IEEE Std 1003.1-2001 (``POSIX.1'').

BSD                              August 17, 2004                              BSD