We are still waiting for mohtashims to tell us if:
works.
This find . -name star_st* -exec head -1 {} + | grep "1175 876330" gives output; however, this awk '$0 ~ /1175 876330/ {nextfile}' star_st* does not show any output
That is strange, before you said that:
gave you a syntax error (which it would unless you ran it in a directory where there is no more than one file with a name starting with star_st). We told you before that the -name primary's argument has to be quoted to work properly.
I see that RudiC has already explained that I didn't put that newline in the middle of my awk script just for the fun of it.
If this still gives "too many arguments", then combine it with find
@MadeInGermany: I was testing your suggestion versus Don Cragun's find . -name "star_st*" -exec head -1 {} + | grep "1175 876330"
I check whether the result is found using if [ $? -eq 0 ]; then
I am not able to test and compare the performance of both, as MadeInGermany's command always passes the $? -eq 0 condition even if no results are found. Maybe you [MadeInGermany] can provide a fix there?
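A hedged sketch of one possible fix (not tested against the original data): have awk remember whether it ever saw the pattern and turn that into its exit status, so the $? test behaves the same way it does after grep. The file names and pattern are the ones from this thread.

```shell
# Record a match in "found", then make awk's exit status reflect it:
# exit 0 if the pattern was seen in any file, exit 1 otherwise.
awk '/1175 876330/ {found=1; nextfile} END {exit !found}' star_st*
if [ $? -eq 0 ]; then
    echo "pattern found"
else
    echo "pattern not found"
fi
```

nextfile still skips the rest of each matching file, so the early-exit performance benefit being compared in this thread is preserved.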
Quote:
Originally Posted by RudiC
Put a ; or a <newline> after the /.../. This is about 50% faster on my machine.
OK, I put a ; but it gives me an error
Quote:
Originally Posted by Don Cragun
That is strange, before you said that:
gave you a syntax error (which it would unless you ran it in a directory where there is no more than one file with a name starting with star_st). We told you before that the -name primary's argument has to be quoted to work properly.
I see that RudiC has already explained that I didn't put that newline in the middle of my awk script just for the fun of it.
I am sorry, Don, I meant "star_st*" and not star_st*
Just to be clear, if the command:
yields exit status 0, then all of the files with names starting with star_st were successfully processed by awk and the directory containing them was successfully processed by find; but it doesn't tell you whether any lines were written to standard output while processing those files.
On the other hand, the commands:
and:
(which might or might not be slightly faster) will give you an exit status that indicates whether grep or fgrep successfully printed a matching line read from the pipeline (exit status 0), hit end-of-file without finding a match (exit status 1), or had some internal error (exit status greater than 1). It tells you absolutely nothing about whether find or any of the awk commands it executed completed successfully or failed.
Some versions of some shells provide an array containing the exit status of every element of the most recently executed pipeline, but every shell that provides it does so differently, and many shells don't provide it at all.
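For example, bash exposes such an array as PIPESTATUS (zsh calls its equivalent pipestatus); a minimal bash-only sketch:

```shell
# bash-specific: PIPESTATUS holds the exit status of every element
# of the most recently executed pipeline, not just the last one.
false | true | grep x /dev/null
echo "${PIPESTATUS[@]}"    # prints "1 0 1": false failed, true succeeded, grep found nothing
```

This is exactly the kind of shell-specific feature the paragraph above warns about: the same script run under dash or another plain POSIX sh will not have PIPESTATUS at all.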
We can use subshells to save the exit status of each element of a pipeline in a way that works with any POSIX-conforming shell, if that is what you need. So please explain exactly what information you need to capture from the exit statuses of the pipeline elements.
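As an illustration of that idea, here is one simple portable variant (a sketch using a temporary file rather than a subshell-per-element construction): the upstream command's exit status is written out before the pipe discards it, then read back after the pipeline completes.

```shell
# Portable POSIX-sh sketch: $? after a pipeline only reports the last
# element's status, so save the upstream status in a temporary file.
tmp=$(mktemp)
{ find . -name "star_st*" -exec head -1 {} + ; echo $? >"$tmp" ; } |
grep "1175 876330"
grep_status=$?
find_status=$(cat "$tmp")
rm -f "$tmp"
printf 'find exited %s, grep exited %s\n' "$find_status" "$grep_status"
```

After this, $grep_status tells you whether the pattern was found, and $find_status separately tells you whether find (and the head commands it ran) succeeded.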
I have many files, each containing about two million lines.
Now I want to use sed to delete the 9th line and add a new
line after the 8th line. I use the command as follows:
for ((i=1; i<100; i++))
do
    echo "$i"
    sed -i '9d' "$i.dat"
    sed -i '8a this is a new line' "$i.dat"
done
But it is... (3 Replies)
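A hedged alternative to the loop above (GNU sed, as in the post): deleting line 9 and then adding a line after line 8 is the same as replacing line 9, so one sed invocation per file can do both edits in a single pass over those two-million-line files.

```shell
# Replace line 9 in one pass using sed's "c" (change) command,
# instead of two separate -i passes per file.
for i in $(seq 1 99); do
    echo "$i"
    sed -i '9c\
this is a new line' "$i.dat"
done
```

Halving the number of passes matters here because every sed -i rewrites the whole file.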
Hi,
I have a large number of input files with two columns of numbers.
For example:
83 1453
99 3255
99 8482
99 7372
83 175
I only wish to retain lines where the numbers fulfil two requirements. E.g.:
column 1 = 83
1000 <= column 2 <= 2000
To do this I use the following... (10 Replies)
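A sketch of such a filter, assuming the two requirements are "column 1 equals 83" and "column 2 between 1000 and 2000 inclusive" (input.txt stands in for each of the input files):

```shell
# Print only the lines whose first field is 83 and whose second
# field lies in the range 1000..2000.
awk '$1 == 83 && $2 >= 1000 && $2 <= 2000' input.txt
```

Against the sample data above, only the line "83 1453" survives: "83 175" fails the range test and the "99 ..." lines fail the first-column test.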
I am facing a performance problem on a Solaris 10 SPARC V890 server; it is an old one, I know. We first realized there was a problem with the server when ftp transfers were made. Four other identical servers were doing much better. The network drivers were checked and there... (3 Replies)
Dear All,
I am using the following script to find and replace the date format in a file. Field 18 in the file has the format "01/26/2010 11:55:14 GMT+04:00", which I want to convert into the format "20100126115514". For this purpose I am using the following lines of code:... (5 Replies)
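One way to sketch the conversion for a single value (the real script would apply this to field 18 of every record): split on "/", space and ":" and reorder the pieces into YYYYMMDDhhmmss.

```shell
# Split "01/26/2010 11:55:14 GMT+04:00" on /, space and :, then
# reorder year, month, day, hour, minute, second.
echo "01/26/2010 11:55:14 GMT+04:00" |
awk -F'[/ :]' '{printf "%s%s%s%s%s%s\n", $3, $1, $2, $4, $5, $6}'
# prints 20100126115514
```

The timezone suffix simply falls into fields 7 and 8 and is ignored; if the target time should be normalized to UTC rather than dropped, a different approach is needed.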
Dear World,
I just wrote a script, which puzzled me somewhat. The significant code was:
for file in `ls splits*`; # splits* came from a split command executed earlier
do
tail -$SomeNumber $file | cut -d" " -f6 > $file;
done;
The interesting thing is this: A few of the $files were... (2 Replies)
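The likely culprit in the snippet above: "> $file" makes the shell truncate the file before tail ever reads it, so some files come out empty or short. A sketch of a safer version, writing to a temporary file first ($SomeNumber and the splits* names are taken from the post):

```shell
# Never read from and truncate the same file in one pipeline:
# write to a temp file, then rename over the original on success.
for file in splits*; do
    tail -n "$SomeNumber" "$file" | cut -d" " -f6 > "$file.tmp" &&
    mv "$file.tmp" "$file"
done
```

This also drops the unnecessary `ls` in the for loop, which would break on file names containing whitespace.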
All of a sudden, scp got really slow ... from 2-3 seconds to 30 seconds.
This happened for 5 hours, and then it went back to running fast.
Why?
If I use the -q option, which "Disables the progress meter", could this have any adverse effect?
Thanks (1 Reply)
Hi,
I have an SCO-Unix server running.
There are some processes (unknown to me) which consume a lot of the system resources. This slows down the server dramatically.
Is there a command or program that monitors which processes are using the CPU, disk, etc., and tells me how excessive and how... (3 Replies)
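A hedged sketch of the usual first step: on many Unix systems, ps can report per-process CPU usage, which sort can then rank. The exact option names vary by platform, and SCO's ps in particular may need different flags, so treat this as an illustration rather than a drop-in command.

```shell
# List the ten processes currently using the most CPU
# (-eo selects every process and a custom column format).
ps -eo pcpu,pid,args | sort -rn | head -10
```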
Discussion started by: Hansaplast