Hi, I want to read lines from a file, and I'm using two methods:
1 use
2 use
However, in both of them I notice that the tabs between fields are automatically converted to spaces. Because I want to use awk over the lines, I'd like the tabs to be kept. Is there any way to do this? Thanks.
because I want to use awk over the lines, I hope the 'tab' can be kept
You are looping over the file lines to use awk on a line at a time? That'd be pretty inefficient. You may replace everything in the loop with just a single awk.
No, I need to do several kinds of processing for each line, not just run awk over each line.
Quote:
Originally Posted by elixir_sinari
When evaluating, double-quote the variable line:
You are looping over the file lines to use awk on a line at a time? That'd be pretty inefficient. You may replace everything in the loop with just a single awk.
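A minimal sketch of the double-quoting fix (the loop shape is an assumption, since the original two methods were not shown):

```shell
# Hypothetical sample line with a tab between fields.
printf 'Host: 10.0.0.1\tPorts: 80/open\n' > /tmp/sample.txt

while IFS= read -r line; do
    echo $line      # unquoted: word splitting collapses the tab into a single space
    echo "$line"    # double-quoted: the tab is preserved for awk and friends
done < /tmp/sample.txt
```

Unquoted `$line` undergoes field splitting on `$IFS` (space, tab, newline) and the resulting words are rejoined with single spaces; double-quoting suppresses the splitting entirely.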
Awk can be used to do lots of things to individual lines, to every line, or to any line matching a pattern.
If you'd tell us what "processings" you need to perform, we may be able to help you come up with a MUCH more efficient way to perform them in a single invocation of awk even if you're processing multiple files.
Quote:
Originally Posted by Don Cragun
Awk can be used to do lots of things to individual lines, to every line, or to any line matching a pattern.
If you'd tell us what "processings" you need to perform, we may be able to help you come up with a MUCH more efficient way to perform them in a single invocation of awk even if you're processing multiple files.
I have many lines like the following in a file (there are also other kinds of lines), and in these lines the fields starting with "Host:", "Ports:", etc. are separated by tabs. The number of fields may differ from line to line: one line may contain only fields starting with "Host:" and "Ignored States:", while another line has all of the fields above.
Now I want to process these lines so that different fields are written into different files, e.g. fields starting with "Host:" are written into the file host.log, and so on.
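A sketch of the single-awk approach suggested above, applied to the layout described (the sample line and the output file names are assumptions):

```shell
# Hypothetical sample line: tab-separated "Host:" and "Ports:" fields.
printf 'Host: 10.0.0.1 (up)\tPorts: 80/open/tcp\n' > scan.txt

awk -F'\t' '{
    for (i = 1; i <= NF; i++) {
        if ($i ~ /^Host:/)  print $i > "host.log"    # route Host: fields
        if ($i ~ /^Ports:/) print $i > "ports.log"   # route Ports: fields
    }
}' scan.txt
```

Because awk does the splitting itself, the shell never word-splits the line and the tabs never get a chance to disappear.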
My file looks like
3 33 210.01.10.0 2.1 1211 560 26 45 1298 98763451112 15412323499 INPUT OK
3 233 40.01.10.0 2.1 1451 780 54 99 1876 78787878784 15423210199 CANCEL OK
Aim is to replace the spaces in each line with tabs.
Used: sed -e 's/ */\t/g'
But I get output like this... (3 Replies)
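The likely culprit in the snippet above: ` *` matches *zero* or more spaces, so it also matches the empty string between every pair of characters and a tab gets inserted everywhere. Requiring at least one space fixes it (GNU sed understands `\t`; on other seds, type a literal tab instead):

```shell
printf '3 33 210.01.10.0 2.1\n' | sed 's/ \{1,\}/\t/g'

# A portable alternative that squeezes runs of spaces into a single tab:
printf '3 33 210.01.10.0 2.1\n' | tr -s ' ' '\t'
```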
I have a variable sumOfJEOutputFile which holds the name of the output file of an SQL command. The output looks like below:
-----------
58
I am using following code to manipulate the output:
(sed 1,2d $sumOfJEOutputFile > $newTemp1 | sed '$d' $newTemp1)... (4 Replies)
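The pipeline above mixes a redirection with a pipe, so the second sed reads $newTemp1 while the first is still writing it. Both deletions can be done in one sed invocation instead; a sketch with hypothetical sample data mimicking the SQL output:

```shell
# Hypothetical sample: two header lines, the value, and a trailer line.
printf 'SUM\n-----------\n58\n1 row selected.\n' > /tmp/je.out

# Delete lines 1-2 and the last line in a single pass; only "58" remains.
sed '1,2d; $d' /tmp/je.out
```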
Hi,
I have a space-delimited text file, but I only want to change the first space to a tab and keep the rest of the spaces intact. How do I go about doing that? Thanks! (3 Replies)
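Without the `g` flag, sed's `s` command replaces only the first match on each line, which is exactly what is wanted here (GNU sed shown; other seds need a literal tab instead of `\t`):

```shell
# Only the first space on each line becomes a tab; the rest stay spaces.
printf 'field1 field2 field3 field4\n' | sed 's/ /\t/'
```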
Hello,
Is there a direct command to check whether the delimiter in your file is a tab or a space? And how can one be converted to the other?
Thanks,
G (4 Replies)
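There is no single built-in "what is the delimiter" command, but grep can answer the question and tr does the conversion. A sketch (the sample file is an assumption):

```shell
printf 'a\tb\tc\n' > data.txt

# Detect: does the file contain any tab characters?
if grep -q "$(printf '\t')" data.txt; then
    echo "tab-delimited"
else
    echo "space-delimited (or something else)"
fi

# Convert tabs to spaces, or back again.
tr '\t' ' ' < data.txt    > data.spaces
tr ' ' '\t' < data.spaces > data.tabs
```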
Hi,
So my file looks like this:
title number
JR 2
JR 2
JR 4
JR 5
NM 5
NM 8
NM 2
NM 8
I used this line that I wrote to convert it to rows so it will look like this:
awk -F"\t" '!/^$/{a[$1]=a[$1]" "$3} END {for (i in a) {print i, a[i]}}' occ_output.tab > test.txt
JR 2 2 4 5
NM 5 8... (4 Replies)
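For the two-column sample shown, the grouping would use `$2` (the poster's real file evidently has more columns, hence their `$3`). A runnable sketch:

```shell
# Recreate the sample file, tab-separated, with its header line.
printf 'title\tnumber\nJR\t2\nJR\t2\nJR\t4\nJR\t5\nNM\t5\nNM\t8\nNM\t2\nNM\t8\n' > occ.tab

# Skip the header, append each number to its title's list, print one row per title.
awk -F'\t' 'NR > 1 {a[$1] = a[$1] " " $2} END {for (i in a) print i a[i]}' occ.tab
```

Note that `for (i in a)` visits keys in an unspecified order; pipe through sort if the row order matters.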
Want to print a line when there is a leading or trailing space or tab in field 2, 3 or 5.
The code below prints all lines in the file even when they have no leading or trailing space or tab.
nawk -F"|" '{for(i=1;i<=NF;i++) {if ($i ~ "^*" || $i ~ "*$")}}1' file
file
Ouput required:
... (5 Replies)
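In the attempt above, `^*` and `*$` are not valid anchored patterns (the `*` has nothing to repeat), and the trailing `1` prints every line unconditionally. A corrected sketch that tests only fields 2, 3 and 5 (sample data is hypothetical):

```shell
# Line 1 has a leading space in field 2, line 3 has spaces around field 3.
printf 'a| b|c|d|e\nf|g|h|i|j\nk|l| m |n|o\n' > f.txt

# Print a line only when field 2, 3 or 5 starts or ends with a space or tab.
awk -F'|' '$2 ~ /^[ \t]|[ \t]$/ || $3 ~ /^[ \t]|[ \t]$/ || $5 ~ /^[ \t]|[ \t]$/' f.txt
```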
I have a command that displays the total size of each directory in KB. Below are the command and its output:
ls -ltr | grep ^d | awk '{print $9}' | xargs du -sk
output:
What I want is a proper tab/space between the value and the directory name. How can I get that?
thanks in advance (10 Replies)
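du -sk already separates the size and the name with a tab; if the columns still look ragged, a printf in awk gives a fixed-width layout. A sketch (the directory names are assumptions):

```shell
mkdir -p demo/alpha demo/a_much_longer_name

# Pad the size to a fixed 10-character column so the names line up.
du -sk demo/*/ | awk -F'\t' '{printf "%-10s %s\n", $1, $2}'
```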
I'm reading from a file that is semicolon-delimited. One of the fields contains two spaces separating the first and last name (4th field in - "JOHN<space><space> DOE"):
e.g. TORONTO;ONTARIO;1 YONGE STREET;JOHN DOE;CANADA
When I read this record and either echo/print to screen or write to... (4 Replies)
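This is the same word-splitting effect as in the tab question above: an unquoted expansion collapses runs of whitespace. Double-quoting preserves the two spaces:

```shell
line='TORONTO;ONTARIO;1 YONGE STREET;JOHN  DOE;CANADA'
echo $line     # unquoted: JOHN  DOE comes out as JOHN DOE
echo "$line"   # quoted: the double space survives
```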