Shell Programming and Scripting: Using wget and check for non-zero files
Post 302141232 by weatherboys on Thursday, 18 October 2007, 07:24 AM
Thanks for the quick reply!

So the script will now look like:

Code:
# Check that the remote file exists.
wget --spider -v http://www.test.nl/test.html
# Download it, overwriting any existing local copy.
wget http://www.test.nl/test.html -O /home/myname/domains/myname.nl/public_html/data/test2.html --quiet

But this still saves files even when they are zero bytes, and it overwrites existing files either way. That's not what I want: I want to check that the remote file exists (using your spider option), and I want an existing file to be overwritten only when the file being copied is not empty.

Can you help me out?!
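
One way to get that behaviour (a minimal sketch, assuming bash and GNU wget; the URL and target path are taken from the post above, and the .part temporary filename is made up for illustration) is to download to a temporary file and only replace the existing copy when the download turns out to be non-empty:

Code:
#!/bin/bash
url="http://www.test.nl/test.html"
target="/home/myname/domains/myname.nl/public_html/data/test2.html"
tmp="$target.part"   # hypothetical temporary filename

# 1. Check that the remote file exists at all.
if ! wget --spider --quiet "$url"; then
    echo "remote file not found; keeping existing copy" >&2
    exit 1
fi

# 2. Download to the temporary file so the existing copy stays intact.
wget --quiet -O "$tmp" "$url"

# 3. Replace the existing file only if the download is non-empty.
if [ -s "$tmp" ]; then
    mv "$tmp" "$target"
else
    rm -f "$tmp"   # discard a zero-byte download
    echo "downloaded file was empty; keeping existing copy" >&2
fi

The [ -s "$tmp" ] test is the key step: it succeeds only when the file exists and is larger than zero bytes.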
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

wget to check an URL

Hi all, I wrote a script which starts a WebLogic server and waits until it's loaded to deploy several apps. The way I checked was something like: while ; do wget --spider <URL>:<port>/console > /dev/null 2>&1; rc=$?; done This works perfectly because it's an HTML site and when the server is... (2 Replies)
Discussion started by: AlbertGM
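
The wait loop described there might be sketched as follows (assuming bash; the host, port, and retry interval are placeholders, with 7001 being a common WebLogic console port):

Code:
#!/bin/bash
# Poll the console page until the server starts answering.
until wget --spider http://localhost:7001/console > /dev/null 2>&1; do
    sleep 5   # arbitrary retry interval
done
echo "server is up; deploying applications"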

2. Shell Programming and Scripting

wget skips certain files.

I am trying to use wget to automate downloading of some mp3/wav files. However, I can't get it to follow the link to the mp3s. This is the line (it is not really the website): wget -prl 1 http://website.com/alarms However, if I right-click and copy the link on the webpage in Firefox, then... (4 Replies)
Discussion started by: Narnie

3. Shell Programming and Scripting

how to limit files downloaded by wget

I am trying to download a page and retrieve only wav and mp3 files via wget. The website is: Alarm Sounds | Free Sound Effects | Alarm Sound Clips | Sound Bites. My command is: wget -rl 2 -e robots=off -A wav,mp3 http://soundbible.com/tags-alarm.html When not using the -A wav,mp3... (2 Replies)
Discussion started by: Narnie

4. Shell Programming and Scripting

Help with WGET and renaming downloaded files :(

Hi everybody, I would greatly appreciate some expertise in this matter. I am trying to find an efficient way to batch download files from a website and rename each file with the URL it originated from (from the CLI). (i.e. instead of xyz.zip, the output file would be http://www.abc.com/xyz.zip) A... (10 Replies)
Discussion started by: o0110o
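
A sketch of one way to get that naming scheme (assuming bash and a hypothetical urls.txt with one URL per line; slashes are mapped to underscores because / cannot appear in a filename):

Code:
#!/bin/bash
# Download each URL and name the local file after the URL itself.
while read -r url; do
    name=$(printf '%s' "$url" | tr '/' '_')   # e.g. http:__www.abc.com_xyz.zip
    wget -O "$name" "$url"
done < urls.txt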

5. UNIX for Dummies Questions & Answers

Problem with wget no check certificate.

Hi, I'm trying to install some libraries; when running the makefile I get an error from the "wget --no-check-certificate" option. I had a look at the help and the option wasn't listed. Anyone know what I'm missing? (0 Replies)
Discussion started by: davcra

6. Shell Programming and Scripting

Check if wget finishes

How can you check if wget finishes? I have this code to store the source of a website URL in a variable: source=$(wget --timeout=15 -qO - $site) But this gets stuck sometimes; e.g., one site had a virus and it stopped the script. How can I check if wget finishes? I tried to run... (5 Replies)
Discussion started by: vanessafan99
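
For reference, the exit status of the wget inside a command substitution is visible in $? immediately afterwards, so a check might look like this (a sketch, assuming bash; the URL is a placeholder, and --tries=1 stops wget from retrying after a timeout):

Code:
#!/bin/bash
site="http://www.example.com/"                      # placeholder URL
source=$(wget --timeout=15 --tries=1 -qO - "$site") # fetch page into variable
rc=$?
if [ $rc -ne 0 ]; then
    echo "wget failed or timed out (exit code $rc)" >&2
    exit 1
fi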

7. Shell Programming and Scripting

Files download using wget

Hi, I need to implement the logic below to download files daily from a URL. * Need to check if it is yesterday's file (YYYY-DD-MM.dat) * If present, then download from the URL (sample_url/2013-01-28.dat) * Need to implement wait logic if it is not present * If it is still not able to find the file... (1 Reply)
Discussion started by: rakesh5300
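
The logic in that post might be sketched like this (assuming bash and GNU date; the base URL, retry count, and wait interval are made up for illustration):

Code:
#!/bin/bash
base="http://sample_url"                   # placeholder base URL
file="$(date -d yesterday +%Y-%d-%m).dat"  # yesterday's file, YYYY-DD-MM as in the post

# Retry up to 10 times, waiting 10 minutes between attempts.
for attempt in $(seq 1 10); do
    if wget --spider --quiet "$base/$file"; then
        wget --quiet "$base/$file"         # the file is there; download it
        exit 0
    fi
    sleep 600
done
echo "$file never appeared on the server" >&2
exit 1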

8. Shell Programming and Scripting

Wget - how to ignore files in immediate directory?

I am trying to recursively save a remote FTP server but exclude the files immediately under the directory directory1: wget -r -N ftp://user:pass@hostname/directory1 I want to keep these, which may have more files under them: directory1/dir1/file.jpg directory1/dir2/file.jpg... (16 Replies)
Discussion started by: vanessafan99

9. Shell Programming and Scripting

Check wget return code

Hello, check this script: it jumps to the else part in both cases (whether the files exist or not). wget $MIRROR/kkk.zip && wget $MIRROR/jjj.zip RC="$?" if ] then echo -e "$RED Ooops, Download links are broken...! $RESET" else echo -e "$GREEN Everything is fine, Cheers ... $RESET" fi (4 Replies)
Discussion started by: nimafire
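
Part of the problem in that snippet is that $? after wget a && wget b reflects only the last command that actually ran, so checking each download separately is more reliable (a sketch, assuming bash and that $MIRROR is already set):

Code:
#!/bin/bash
# Capture each wget's exit status individually.
wget "$MIRROR/kkk.zip"; rc1=$?
wget "$MIRROR/jjj.zip"; rc2=$?

if [ $rc1 -ne 0 ] || [ $rc2 -ne 0 ]; then
    echo "Ooops, download links are broken...!" >&2
else
    echo "Everything is fine, cheers..."
fi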

10. UNIX for Advanced & Expert Users

CMD to check status of the server using Wget

Hi All, Using wget I'm able to get the status of the server, but only when the server is completely down or up. The problem in the script is this: suppose the server hangs; I mean, the server is taking a long time to log in. For example, normally the server takes 3 seconds to login... (3 Replies)
Discussion started by: manohar2013
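
To catch a server that is up but hanging, one approach is to time the request and treat a slow answer as a failure (a sketch, assuming bash; the URL and the 10-second limit are placeholders, and the 3-second baseline comes from the post):

Code:
#!/bin/bash
url="http://server.example.com/login"   # placeholder URL
start=$(date +%s)
wget --spider --quiet --timeout=10 --tries=1 "$url"
status=$?
elapsed=$(( $(date +%s) - start ))

if [ $status -ne 0 ]; then
    echo "server down or no response within 10 seconds" >&2
elif [ $elapsed -gt 3 ]; then
    echo "server up but slow: answered in ${elapsed}s (normally ~3s)" >&2
else
    echo "server up, answered in ${elapsed}s"
fi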
httpindex(1)                     General Commands Manual                    httpindex(1)

NAME
       httpindex - HTTP front-end for SWISH++ indexer

SYNOPSIS
       wget [ options ] URL... 2>&1 | httpindex [ options ]

DESCRIPTION
       httpindex is a front-end for index++(1) to index files copied from remote
       servers using wget(1). The files (in a copy of the remote directory
       structure) can be kept, deleted, or replaced with their descriptions
       after indexing.

OPTIONS
   wget Options
       The wget(1) options that are required are: -A, -nv, -r, and -x; the ones
       that are highly recommended are: -l, -nh, -t, and -w. (See the EXAMPLE.)

   httpindex Options
       httpindex accepts the same short options as index++(1) except for -H,
       -I, -l, -r, -S, and -V. The following options are unique to httpindex:

       -d     Replace the text of local copies of retrieved files with their
              descriptions after they have been indexed. This is useful for
              displaying file descriptions in search results without having to
              keep complete copies of the remote files, thus saving filesystem
              space. (See the extract_description() function in WWW(3) for
              details about how descriptions are extracted.)

       -D     Delete the local copies of retrieved files after they have been
              indexed. This prevents your local filesystem from filling up with
              copies of remote files.

EXAMPLE
       To index all HTML and text files on a remote web server, keeping
       descriptions locally:

              wget -A html,txt -linf -t2 -rxnv -nh -w2 http://www.foo.com 2>&1 |
              httpindex -d -e'html:*.html,text:*.txt'

       Note that you need to redirect wget(1)'s output from standard error to
       standard output in order to pipe it to httpindex.

EXIT STATUS
       Exits with a value of zero only if indexing completed successfully;
       non-zero otherwise.

CAVEATS
       In addition to those for index++(1), httpindex does not correctly handle
       the use of multiple -e, -E, -m, or -M options (because the Perl script
       uses the standard Getopt::Std package for processing command-line
       options, which doesn't support them). The last of any of those options
       "wins." The work-around is to pass multiple values, separated by commas,
       to a single one of those options. For example, instead of:

              httpindex -e'html:*.html' -e'text:*.txt'

       do this:

              httpindex -e'html:*.html,text:*.txt'

SEE ALSO
       index++(1), wget(1), WWW(3)

AUTHOR
       Paul J. Lucas <pauljlucas@mac.com>

SWISH++                          August 2, 2005                             httpindex(1)