How to manipulate the conditions between every retry in wget?


 
# 1  
Old 03-29-2012
How to manipulate the conditions between every retry in wget?

Hi,

When I hit the URL with the wget command, it retries according to the number of retries we specify on the wget command line.

My expectation:
1) If the 1st try fails, then before the 2nd retry I have to check for an "XXXXXXX" entry in the log file.
2) If the "XXXXXXX" entry is not present, the 2nd retry should be allowed.
3) If the "XXXXXXX" entry is present, no further retry should be allowed.

Can we check this condition between every retry? If yes, please advise me how to achieve this.
Thanks in advance.
# 2  
Old 03-29-2012
wget isn't a programming language and can't do different things depending on which retry it is on.

You could disable retries completely and do the retry functionality yourself.

Code:
TRIES=0

while ! wget --tries=1 http://url/
do
        # After the second failed attempt, give up.
        [ "$TRIES" -ge 1 ] && break
        # After the first failure: if XXXXXXXX is already in the logfile,
        # stop; otherwise allow one more attempt.
        grep -q "XXXXXXXX" /path/to/logfile && break
        TRIES=$((TRIES + 1))
done
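One thing this leaves open is where /path/to/logfile comes from. If the log you want to check is wget's own output, wget can write it for you with -o (overwrite each run) or -a (append); below is a minimal sketch, assuming the "XXXXXXXX" marker actually shows up in that log (the path and marker are just placeholders from your post):

Code:
LOGFILE=/path/to/logfile   # placeholder path
TRIES=0

while ! wget --tries=1 -o "$LOGFILE" http://url/
do
        # Stop as soon as the marker shows up in wget's own log,
        # or after the second failed attempt.
        grep -q "XXXXXXXX" "$LOGFILE" && break
        [ "$TRIES" -ge 1 ] && break
        TRIES=$((TRIES + 1))
done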

# 3  
Old 04-03-2012
Corona688,

Can you please give me some clear syntax for how to check the condition between every retry? For example:
wget --tries=3 https://url....

When it retries the first time it should check the condition, and before the second retry it should check the same condition again; if the condition is satisfied it should not allow that retry, otherwise it should allow it.

Please advise me how to achieve this.

Thanks in advance.
# 4  
Old 04-03-2012
I repeat -- you can't do what you want that way. wget isn't a programming language. It can't understand "if x do y, if z quit".

The code I gave you does what you want. It doesn't use wget's own retry feature, but runs wget repeatedly with single tries.

To make it clearer, perhaps this:

Code:
TRIES=1

while ! wget --tries=1 http://url/
do
        case "$TRIES" in
        1)      # First failure.
                # If XXXXXXXX is already in the logfile, stop; otherwise try again.
                grep -q "XXXXXXXX" /path/to/logfile && break
                ;;
        *)      # Second and further failures.  Break the loop.
                break
                ;;
        esac

        TRIES=$((TRIES + 1))
done
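And if you later decide you do want the three attempts from your "wget --tries=3" example, the same pattern generalizes. Here is a rough sketch with a configurable limit (MAX_TRIES is just an illustrative shell variable, not a wget option, and the logfile path and marker are placeholders):

Code:
MAX_TRIES=3     # total attempts allowed; illustrative value
TRIES=1

while ! wget --tries=1 http://url/
do
        # Stop as soon as the marker appears in the logfile...
        grep -q "XXXXXXXX" /path/to/logfile && break
        # ...or once all attempts have been used up.
        [ "$TRIES" -ge "$MAX_TRIES" ] && break
        TRIES=$((TRIES + 1))
done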

# 5  
Old 04-05-2012
Corona688,

Thanks a lot, it is very helpful.