Search Results

Search: Posts Made By: awk
9,510
Posted By awk
Hi Corona There are free utilities we found...
Hi Corona

There are free utilities we found called dump_txt and dump_csv, which will, at least, simplify building csv files for input to excel.

These are Oracle functions, and a google...
4,108
Posted By awk
I like to use "rm -f" - the -f forces the...
I like to use "rm -f" - the -f forces the remove; it is best used in batch jobs.

Whether the file is there or not, the command will remove it if present and return a "0" status code.

However, if for some reason, you...
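For example, a quick batch-job sketch (the scratch file name is made up):

rm -f /tmp/scratch.$$        # removed if it exists; no complaint if it does not
echo "rm exit status: $?"    # prints 0 either way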
7,318
Posted By awk
in your profile, you could set an alias for scp...
in your profile, you could set an alias for scp to be the scp -p command. This would work as long as you didn't use an absolute path for scp.
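Something along these lines, assuming a ksh/bash-style profile:

alias scp='scp -p'     # every plain "scp" now preserves times and modes

Typing the absolute path, e.g. /usr/bin/scp, would still bypass the alias.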
10,333
Posted By awk
Well, without doing the entire thing for you ...
Well, without doing the entire thing for you

In awk

Use printf on the odd-numbered lines. This lets you write the output without a newline at the end of the line.

Use print on the even...
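A minimal sketch of that approach (the input file name is made up), joining each odd-numbered line to the even-numbered line that follows it:

awk 'NR % 2 == 1 { printf("%s", $0) }   # odd line: no newline at the end
     NR % 2 == 0 { print $0 }           # even line: print it and finish the output line
' pairs.txt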
38,828
Posted By awk
awk '{print $0 "\r"}' filename or awk...
awk '{print $0 "\r"}' filename

or

awk '{printf("%s\r\n", $0)}' filename
1,671
Posted By awk
On the cat command - you have ${1} instead of...
On the cat command - you have ${1} instead of ${i} - that will not do what you seem to want.
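To illustrate the difference with a hypothetical loop (not your actual script): ${1} is the script's first argument, while ${i} is the loop variable.

for i in one.txt two.txt three.txt
do
    cat "${i}"       # the file for this pass of the loop
    # cat "${1}"     # would re-read the first script argument every pass
done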
3,820
Posted By awk
Ideally, I should have had the word END on a line...
Ideally, I should have had the word END on a line by itself after the data. It shows the Korn shell where to stop the input of data to the awk program.

Still the script terminated at that point,...
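A minimal sketch of the layout (placeholder awk program and data):

awk '{ print $1 }' <<END
alpha 1
beta 2
END

The END on a line by itself, with nothing else on it, is what marks the end of the here-document.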
3,820
Posted By awk
awk -F, -v OFS=, '{ for (I=2; I<NF; I++) ...
awk -F, -v OFS=, '{ for (I=2; I<NF; I++)
{print $1, $I}
}' <<END |\
sort -k1,1n -k2,2n -t, -u |\
awk -F, 'NR==1{Save0=$1}
Save0 == $1{Line=Line ","...
19,062
Posted By awk
Here's the problem. On your server side, there...
Here's the problem. On your server side, there is a timeout for no activity.

In PuTTY (0.60), there is a setting under the Connection category, "Seconds between keepalives" - Mine...
50,724
Posted By awk
Well, the trick is to use either the ">" or the...
Well, the trick is to use either the ">" or the "<" as your record separator.


awk -v RS="<" '{print}' <<+ |\
awk -F ">" '$1 ~ /^tns$/{TNS=$2}
$1 ~ /^user$/{USER=$2}
...
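To show the idea end to end, here is a sketch with made-up values (PRODDB, scott) fed through echo instead of the original here-document:

echo "<tns>PRODDB<user>scott" |
awk -v RS="<" '{ print }' |
awk -F ">" '$1 == "tns"  { TNS = $2 }
$1 == "user" { USER = $2 }
END { print "tns=" TNS, "user=" USER }'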
21,112
Posted By awk
grep -E "^.{11}Peter Robertson" file Sales ...
grep -E "^.{11}Peter Robertson" file
Sales Peter RobertsonGolden TigersRich Gardener PART


Make sure those are spaces and not tabs between the records. The grep is based on it being...
16,137
Posted By awk
I had a similar problem - submit up to and...
I had a similar problem - submit up to and including four background jobs, and when one or more finish, build back up to the max.


while read LINE # go through the output...
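A rough sketch of that pattern (the worker command and file names are made up), assuming a shell where jobs works inside a script:

while read LINE                        # go through the output file, one job per line
do
    while [ $(jobs -r | wc -l) -ge 4 ] # already at the limit of four?
    do
        sleep 5                        # wait for one or more to finish
    done
    process_one "$LINE" &              # hypothetical worker, run in the background
done < joblist.txt
wait                                   # let the last jobs drain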
2,455
Posted By awk
find $searchDir -type f -exec grep "people" '{}'...
find $searchDir -type f -exec grep "people" '{}' /dev/null \;

grep will give the file name only if more than one file at a time is searched. This gives it /dev/null as the second file, so you will get the...
8,233
Posted By awk
Sadly, my AIX box does not have dos2ux <sigh> -...
Sadly, my AIX box does not have dos2ux <sigh> - which I have used successfully on other systems.

The problem is that DOS/Windows uses two characters to indicate end-of-line. Unix uses one. ...
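Without dos2ux, one workaround (file names made up) is to strip the extra carriage-return character yourself:

tr -d '\r' < dosfile.txt > unixfile.txt

or, in the same spirit as the awk answers above,

awk '{ sub(/\r$/, ""); print }' dosfile.txt > unixfile.txt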
24,087
Posted By awk
echo 1234345678upthehappytheir | awk '{ START=1...
echo 1234345678upthehappytheir | awk '{
    START = 1
    while (match(substr($0, START), "the") && RSTART > 0)
    {
        print NR, RSTART + START - 1
        START += RSTART
    }
}' | more
1 13
1 21
8,559
Posted By awk
OK - I can only give you a general reply -...
OK - I can only give you a general reply - this will require awk.

Make your RS=">", that will break the input into lines containing the different fields.
Then you would check to see if it is a...
3,864
Posted By awk
The "cat" command is awaiting standard in to feed...
The "cat" command is awaiting standard in to feed to the output file.

Probably, you want to cat "some file" to the appropriate output file.

I have this problem all the time when I type a grep...
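In other words (hypothetical file names):

cat > out.txt             # no input file named: cat sits waiting on standard input
cat some_file > out.txt   # naming the input file is probably what was intended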
8,725
Posted By awk
Try man csplit - The csplit command writes...
Try man csplit -

The csplit command writes the segments to files xx00 . . . xx99, depending on how many times the Argument parameter is specified (99 is the maximum).

I checked man on split -...
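For example (the pattern and file name are made up), splitting at every line that begins with "CHAPTER":

csplit -k bigfile.txt '/^CHAPTER/' '{99}'

The pieces land in xx00, xx01, and so on; -k keeps them even if the pattern runs out before the 99 repeats.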
35,434
Posted By awk
try "rm -f res*[1-9]*" the -f will not give...
try "rm -f res*[1-9]*"

the -f will not give an error if the file does not exist.
4,607
Posted By awk
What have you done so far on this, besides posting...
What have you done so far on this, besides posting a very similar question 5 days ago?

https://www.unix.com/unix-dummies-questions-answers/83032-checking-files-ftp-location.html#post302241054
21,342
Posted By awk
Today's the 5th Yesterday was the 4th. I...
Today's the 5th
Yesterday was the 4th.

I am in CST

What timezone are you in???
30,827
Posted By awk
Using the xml format (which is an acceptable XLS...
Using the xml format (which is an acceptable XLS file), I have, in Oracle, created a spreadsheet with as many as 10 different workbooks.

Not having the Excel file in excel's XML format, I don't...
13,629
Posted By awk
A sufficiently big file, not very well. Unless...
A sufficiently big file, not very well.

Unless you did it in C - and use the rewrite command. This will work only if the new data is the same size as the old.

Perhaps a C class would be the way...
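For what it's worth, the same trick can be sketched from the shell with dd (the offset and replacement text are made up); like the C approach, it only works if the new bytes are exactly the same size as the old:

# overwrite 7 bytes at byte offset 100 without truncating the file
printf 'NEWTEXT' | dd of=bigfile bs=1 seek=100 conv=notrunc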
3,496
Posted By awk
Here is some code I did that limits the number of...
Here is some code I did that limits the number of programs running at once to 4

You should be able to modify as you need



while read LINE # go through the output file from the...
64,810
Posted By awk
And this is why we want to see the code ...
And this is why we want to see the code


#!/bin/bash
echo "Parameter $1"
echo "$?"
$SCHRODINGER/utilities/reagentprep -listfull | grep $1
echo "$?"

if [ $? == 1 ]
then
echo "Error"
exit...
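The usual cure - shown here as a sketch, not necessarily the rest of the original script - is to capture $? right after the command you care about, before any echo can overwrite it:

$SCHRODINGER/utilities/reagentprep -listfull | grep "$1"
STATUS=$?                  # grep's exit status, saved before anything else runs

if [ "$STATUS" -eq 1 ]
then
    echo "Error"
    exit 1
fi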
Showing results 1 to 25 of 131

 