$0 manipulation in awk
Shell Programming and Scripting, post 302312196 by joeyg, 04-30-2009 03:06 PM

OK, so if $0 represents the entire record, can I change $2 and will that change be reflected back in $0?

I think the following shows that yes, it does work. But is there anything I should be thinking about before doing this? What I am actually doing is part of five pages of scripting and awk commands for some data manipulation where I must combine and de-dupe records based on criteria.
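For the de-dupe step, the usual awk idiom is an associative array keyed on whichever fields define a duplicate. A rough sketch, assuming pipe-delimited input, field 1 as the key (the real criteria may differ), and a placeholder file name:

Code:
> awk -F'|' '!seen[$1]++' datafile

The expression !seen[$1]++ is true only the first time a given key value appears, so the default print action writes each key's first record exactly once.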

Another way to state this: consider $2 as a flag, and in certain situations I must change this field from B to J. So, when I set $2 to J, it appears I can then write out $0 as the complete changed record.


Code:
> echo "A|B|C|D" | awk '{FS="|";$0=$0;print $0;print $2}'
A|B|C|D
B
> echo "A|B|C|D" | awk '{FS="|";$0=$0; print $0;print $2}'
A|B|C|D
B
> echo "A|B|C|D" | awk '{FS="|";$0=$0; $2="J";print $0;print $2}'
A J C D
J
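
So yes, assigning to $2 rebuilds $0. Two things to keep in mind from the examples above: an FS set inside the rule only applies to the current record after $0 is reassigned (the $0=$0 trick), and the rebuilt record is joined with OFS, which defaults to a single space; that is why the last example prints A J C D instead of A|J|C|D. A minimal variation, assuming the goal is to keep the pipe delimiter, sets FS and OFS in a BEGIN block:

Code:
> echo "A|B|C|D" | awk 'BEGIN{FS=OFS="|"} {$2="J"; print $0; print $2}'
A|J|C|D
J

Because FS is already set before the first record is read, no $0=$0 reassignment is needed, and the modified record keeps its original separators when printed.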

 
