Checking for a substring in a loop takes too long to complete.
I need to check whether each of the files returned by the ls command in the script below is a substring of the argument passed to the script, i.e. $1.
The script below works fine but is too slow.
If the ls command takes 12 seconds to print all the files with the while loop, then adding the POSIX substring check with grep -q increases the total time to about 140 seconds.
Is there a way to optimize the script for quicker results?
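For context, a minimal sketch of the kind of script being described (the exact script was not preserved here; the directory contents and the echo message are illustrative):

```shell
#!/bin/bash
# Slow approach: one grep process is forked for every file ls prints.
ls | while read LINE
do
    if echo "$1" | grep -q "$LINE"
    then
        echo "$LINE is part of $1"
    fi
done
```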
Last edited by mohtashims; 10-15-2019 at 03:53 AM..
I was going to say I agree with RudiC, but I realise $LINE should be a subset of $1, not the other way round, so Rudi's solution probably won't get the results you want. So let's look at what is wrong with your code:
Quote:
Originally Posted by mohtashims
First line:
In bash, when you pipe anything into a while loop you create a subshell. While I don't know what kind of performance hit you are getting from that, it does mean that any variables set in the loop are forgotten after the loop finishes. Not important here, but worth remembering. Second, the ls command is pointless here: the shell has already expanded the file list before ls even runs, and ls is just printing the names according to your whim, which is one per line. This first line would be much better off:
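Something like this (a sketch; it assumes the loop iterates over the files in the current directory, as the original ls did):

```shell
# Let the shell expand the file list directly: no ls, no pipe, no subshell.
for LINE in *
do
    echo "$LINE"   # loop body goes here; $LINE holds each file name in turn
done
```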
Next:
This is where your performance hit is. Every time you run this line you fork a new process for the grep. Imagine 50 files. 50 invocations of grep! Assuming bash (which I am), this would be much faster:
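Something along these lines, using the shell's built-in pattern matching in place of grep (a sketch; the echo message is illustrative):

```shell
# Built-in substring test: no external grep process is forked per file.
if [[ "$1" == *"$LINE"* ]]
then
    echo "$LINE is part of $1"
fi
```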
So this should be faster, unless I got it entirely wrong and RudiC got it right!
Andrew
@Andrew, you got it right, and sorry if my explanation in the OP was not clear.
I will test the performance of your code. By the way, I'm using ksh, not bash, on AIX 6.1.
Okay, then my point about piping into a while statement creating a subshell was wrong - it is true for bash but not for ksh. I'm pretty sure my modifications will work for ksh.
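Putting both changes together, a sketch of the whole loop (the echo message is an assumption based on the thread; for NAME in * and [[ ... ]] behave the same way in ksh93 and bash):

```shell
#!/bin/ksh
# Report every file in the current directory whose name is a substring of $1.
for LINE in *
do
    if [[ "$1" == *"$LINE"* ]]
    then
        echo "$LINE is part of $1"
    fi
done
```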