Using wget and check for non-zero files
Post 302223969 by trey85stang on Tuesday 12th of August 2008 02:17:11 AM
Quote:
Originally Posted by weatherboys
Thanks for the quick reply!

So the script will now look like:

Code:
wget --spider -v http://www.test.nl/test.html
wget http://www.test.nl/test.html -O /home/myname/domains/myname.nl/public_html/data/test2.html --quiet

But it still fetches files even when they are zero bytes, and it overwrites existing files. That's not what I want: I want to check that the file exists (using your spider option), and I only want to allow overwriting when the file to be copied is not empty.

Can you help me out?!
Can you post the output of wget --spider -v http://www.test.nl/test.html? Perhaps you just need to redirect that output to a text file, run a few awk lines over it, then check the result with an if statement to decide whether to download the file or not.
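
Something like this, for instance. This is only a rough sketch: it assumes wget prints its usual "Length: <bytes>" line in the --spider -v output, and the URL and destination path are simply the ones from the quoted post.

Code:
#!/bin/sh
# Rough sketch of the idea above: spider first, then download only
# if the remote file exists and reports a non-zero length. Assumes
# wget prints a "Length: <bytes>" line when run with --spider -v.
url="http://www.test.nl/test.html"
dest="/home/myname/domains/myname.nl/public_html/data/test2.html"

spider_out=$(wget --spider -v "$url" 2>&1)
if [ $? -ne 0 ]; then
    echo "Remote file not found; nothing downloaded." >&2
    exit 1
fi

# Pull the byte count out of a line like "Length: 5,471 (5.3K) [text/html]".
length=$(printf '%s\n' "$spider_out" | awk '/^Length:/ { gsub(",", "", $2); print $2; exit }')

if [ -n "$length" ] && [ "$length" -gt 0 ] 2>/dev/null; then
    wget --quiet "$url" -O "$dest"    # non-empty, so overwriting is wanted
else
    echo "Remote file is empty or its length is unknown; not overwriting." >&2
fi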

edit: sorry for posting this... didn't realize it was a year-old thread. Not sure how I even came across it.
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

wget to check a URL

Hi all, I wrote a script which starts a Weblogic server and waits until it's loaded to deploy several apps. The way I checked was something like: while ; do wget --spider <URL>:<port>/console > /dev/null 2>&1; rc=$?; done. This works perfectly because it's an HTML site and when the server is... (2 Replies)
Discussion started by: AlbertGM
2 Replies
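
A rough sketch of the polling loop that excerpt describes; the host, port, and 10-second interval are placeholder assumptions, not values from the thread.

Code:
#!/bin/sh
# Keep spidering the Weblogic console until it answers, then deploy.
# Host, port, and the sleep interval are hypothetical placeholders.
until wget --spider "http://localhost:7001/console" > /dev/null 2>&1
do
    sleep 10
done
echo "Console is up; deploying applications..."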

2. Shell Programming and Scripting

wget skips certain files.

I am trying to use wget to automate downloading of some mp3/wav files. However, I can't get it to follow the link to the mp3s. This is the line (it is not really the website): wget -prl 1 http://website.com/alarms However, if I right-click and copy the link on the webpage in Firefox, then... (4 Replies)
Discussion started by: Narnie
4 Replies
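
If the links are being skipped because of robots.txt or because they point at a different host, a variation along these lines may help. This is only a guess at the cause; the URL is the placeholder from the post.

Code:
# Recurse one level, ignore robots.txt, follow links onto other
# hosts, and keep only audio files. URL is the post's placeholder.
wget -r -l 1 -e robots=off -H -A mp3,wav http://website.com/alarms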

3. Shell Programming and Scripting

how to limit files downloaded by wget

I am trying to download a page and retrieve only wav and mp3 files via wget. The website is: Alarm Sounds | Free Sound Effects | Alarm Sound Clips | Sound Bites. My command is: wget -rl 2 -e robots=off -A wav,mp3 http://soundbible.com/tags-alarm.html When not using the -A wav,mp3... (2 Replies)
Discussion started by: Narnie
2 Replies
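
One common wrinkle with -A: wget still fetches the HTML pages so it can find the links, then deletes them afterwards because they don't match the accept list, so seeing pages go by is normal. If the audio files sit deeper than expected, or on another host, raising the level and spanning hosts is worth a try; level 3 and -H here are guesses, not a confirmed fix.

Code:
# Accept only audio files; recurse a little deeper and allow other
# hosts in case the media files live elsewhere. Level 3 is a guess.
wget -r -l 3 -H -e robots=off -A wav,mp3 http://soundbible.com/tags-alarm.html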

4. Shell Programming and Scripting

Help with WGET and renaming downloaded files :(

Hi everybody, I would greatly appreciate some expertise in this matter. I am trying to find an efficient way to batch download files from a website and rename each file with the URL it originated from (from the CLI). (i.e. instead of xyz.zip, the output file would be http://www.abc.com/xyz.zip) A... (10 Replies)
Discussion started by: o0110o
10 Replies
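
A sketch of one way to do that from a list of URLs. urls.txt is a hypothetical input file, and the slashes are replaced with underscores since "/" cannot appear in a filename.

Code:
#!/bin/sh
# Download each URL in urls.txt (hypothetical input, one URL per
# line) and name the local copy after the URL itself, with "/"
# replaced by "_" because slashes are not legal in filenames.
while IFS= read -r url
do
    name=$(printf '%s\n' "$url" | sed 's|/|_|g')
    wget -q "$url" -O "$name"
done < urls.txt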

5. UNIX for Dummies Questions & Answers

Problem with wget no check certificate.

Hi, I'm trying to install some libraries; when running the makefile I get an error from the "wget --no check certificate" option. I had a look at the help and the option wasn't listed. Anyone know what I'm missing? (0 Replies)
Discussion started by: davcra
0 Replies
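
For the record, the option is spelled --no-check-certificate (with hyphens), and very old wget builds do not have it at all, which would explain its absence from the help output.

Code:
# Correct spelling of the option; requires a wget build with TLS
# support. The URL here is only an example.
wget --no-check-certificate https://example.com/library.tar.gz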

6. Shell Programming and Scripting

Check if wget finishes

How can you check if wget finishes? I have this code to store the source of a website URL in a variable: source=$(wget --timeout=15 -qO - $site) but this gets stuck sometimes; e.g., one site had a virus and it stopped the script. How can I check if wget finishes? I tried to run... (5 Replies)
Discussion started by: vanessafan99
5 Replies
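
One hedged way to bound it: --timeout only caps each network operation and wget retries by default, so limit the tries and, where the coreutils timeout(1) utility is available, wrap the whole command. The 30-second ceiling is arbitrary.

Code:
#!/bin/sh
# --timeout caps each network operation, --tries=1 stops retries,
# and timeout(1) (GNU coreutils; availability is an assumption)
# kills the whole fetch after 30 seconds. $site is from the post.
site="http://www.example.com/"
source=$(timeout 30 wget --timeout=15 --tries=1 -qO - "$site")
if [ $? -ne 0 ]; then
    echo "wget did not finish cleanly for $site" >&2
fi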

7. Shell Programming and Scripting

Files download using wget

Hi, I need to implement the below logic to download files daily from a URL. * Need to check if it is yesterday's file (YYYY-DD-MM.dat). * If present, then download from the URL (sample_url/2013-01-28.dat). * Need to implement wait logic if not present. * If it is still not able to find the file... (1 Reply)
Discussion started by: rakesh5300
1 Replies
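
A sketch of that wait-and-retry logic. The date format follows the sample URL (YYYY-MM-DD), and the retry count and sleep interval are arbitrary assumptions.

Code:
#!/bin/sh
# Try to fetch yesterday's file, waiting and retrying if it is not
# published yet. "sample_url" is the placeholder host from the post;
# 12 attempts x 300 seconds are arbitrary choices.
file="$(date -d yesterday +%Y-%m-%d).dat"   # GNU date; BSD date uses -v-1d
tries=12
while [ "$tries" -gt 0 ]
do
    if wget -q "http://sample_url/$file"; then
        echo "Downloaded $file"
        break
    fi
    tries=$((tries - 1))
    sleep 300
done
[ "$tries" -gt 0 ] || echo "Gave up waiting for $file" >&2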

8. Shell Programming and Scripting

Wget - how to ignore files in immediate directory?

I am trying to recursively save a remote FTP server but exclude the files immediately under a directory, directory1: wget -r -N ftp://user:pass@hostname/directory1 I want to keep these, which may have more files under them: directory1/dir1/file.jpg directory1/dir2/file.jpg... (16 Replies)
Discussion started by: vanessafan99
16 Replies
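
wget's -X/--exclude-directories works on whole directories rather than on depth, so one hedged workaround is to mirror everything and then delete only the files sitting directly under directory1. The credentials are the placeholders from the post, and hostname/ is the save directory wget creates by default.

Code:
#!/bin/sh
# Mirror the tree, then remove only the files that sit immediately
# under directory1, keeping its subdirectories intact.
# (-maxdepth and -delete are GNU find extensions.)
wget -r -N "ftp://user:pass@hostname/directory1"
find hostname/directory1 -maxdepth 1 -type f -delete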

9. Shell Programming and Scripting

Check wget return code

Hello, check this script: it jumps to the else part in both cases (whether the files exist or not). wget $MIRROR/kkk.zip && wget $MIRROR/jjj.zip RC="$?" if ] then echo -e "$RED Ooops, Download links are broken...! $RESET" else echo -e "$GREEN Everything is fine, Cheers ... $RESET" fi (4 Replies)
Discussion started by: nimafire
4 Replies
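
Hard to say without the full test (the forum ate the brackets), but the && chaining is a likely culprit: if the first wget fails, the second never runs, and $? only reflects the last command that did run. Checking each download separately makes the result unambiguous.

Code:
#!/bin/sh
# Check each download on its own so a failure in either is caught.
# $MIRROR is the variable from the post.
rc=0
wget "$MIRROR/kkk.zip" || rc=1
wget "$MIRROR/jjj.zip" || rc=1
if [ "$rc" -ne 0 ]; then
    echo "Ooops, download links are broken...!" >&2
else
    echo "Everything is fine, cheers..."
fi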

10. UNIX for Advanced & Expert Users

CMD to check status of the server using Wget

Hi all, using wget I'm able to get the status of the server, but only when the server is completely down or up. The problem in the script is: suppose the server hangs, i.e. it is taking a long time to log in; for example, normally the server takes 3 seconds to log in... (3 Replies)
Discussion started by: manohar2013
3 Replies
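
A sketch that treats "slow" the same as "down" by giving the request a hard deadline. The URL and the 10-second threshold are assumptions.

Code:
#!/bin/sh
# Consider the server unhealthy if it does not answer within 10
# seconds, which catches a hung server as well as a dead one.
url="http://server.example.com/login"
if wget --spider --timeout=10 --tries=1 -q "$url"; then
    echo "Server answered within 10 seconds."
else
    echo "Server is down or responding too slowly." >&2
fi
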
URIFIND(1p)                 User Contributed Perl Documentation                 URIFIND(1p)

NAME
    urifind - find URIs in a document and dump them to STDOUT

SYNOPSIS
    $ urifind file

DESCRIPTION
    urifind is a simple script that finds URIs in one or more files (using
    "URI::Find") and outputs them to STDOUT. That's it.

    To find all the URIs in file1, use:

        $ urifind file1

    To find the URIs in multiple files, simply list them as arguments:

        $ urifind file1 file2 file3

    urifind will read from "STDIN" if no files are given or if a filename
    of "-" is specified:

        $ wget http://www.boston.com/ -O - | urifind

    When multiple files are listed, urifind prefixes each found URI with
    the file from which it came:

        $ urifind file1 file2
        file1: http://www.boston.com/index.html
        file2: http://use.perl.org/

    This can be turned on for single files with the "-p" ("prefix") switch:

        $ urifind -p file3
        file3: http://fsck.com/rt/

    It can also be turned off for multiple files with the "-n" ("no
    prefix") switch:

        $ urifind -n file1 file2
        http://www.boston.com/index.html
        http://use.perl.org/

    By default, URIs will be displayed in the order found; to sort them
    ascii-betically, use the "-s" ("sort") option. To reverse sort them,
    use the "-r" ("reverse") flag ("-r" implies "-s").

        $ urifind -s file1 file2
        http://use.perl.org/
        http://www.boston.com/index.html
        mailto:webmaster@boston.com

        $ urifind -r file1 file2
        mailto:webmaster@boston.com
        http://www.boston.com/index.html
        http://use.perl.org/

    Finally, urifind supports limiting the returned URIs by scheme or by
    arbitrary pattern, using the "-S" option (for schemes) and the "-P"
    option. Both "-S" and "-P" can be specified multiple times:

        $ urifind -S mailto file1
        mailto:webmaster@boston.com

        $ urifind -S mailto -S http file1
        mailto:webmaster@boston.com
        http://www.boston.com/index.html

    "-P" takes an arbitrary Perl regex. It might need to be protected from
    the shell:

        $ urifind -P 's?html?' file1
        http://www.boston.com/index.html

        $ urifind -P '.org' -S http file4
        http://www.gnu.org/software/wget/wget.html

    Add a "-d" to have urifind dump the regexes generated from "-S" and
    "-P" to "STDERR". "-D" does the same but exits immediately:

        $ urifind -P '.org' -S http -D
        $scheme = '^(http):'
        @pats = ('^(http):', '.org')

    To remove duplicates from the results, use the "-u" ("unique") switch.

OPTION SUMMARY
    -s          Sort results.
    -r          Reverse sort results (implies -s).
    -u          Return unique results only.
    -n          Don't include the filename in output.
    -p          Include the filename in output (0 by default, but 1 if
                multiple files are included on the command line).
    -P $re      Print only lines matching regex '$re' (may be specified
                multiple times).
    -S $scheme  Only this scheme (may be specified multiple times).
    -h          Help summary.
    -v          Display version and exit.
    -d          Dump compiled regexes for "-S" and "-P" to "STDERR".
    -D          Same as "-d", but exit after dumping.

AUTHOR
    darren chamberlain <darren@cpan.org>

COPYRIGHT
    (C) 2003 darren chamberlain

    This library is free software; you may distribute it and/or modify it
    under the same terms as Perl itself.

SEE ALSO
    URI::Find

perl v5.14.2                          2012-04-08                          URIFIND(1p)