09-27-2012
How about just letting it print instead of continually cramming everything into backticks? You don't always have to do that.
Also, how about checking their return codes instead of grepping their output? Every process you run returns an exit code indicating success or failure, and wget is no exception. If it fails to download a page, it will tell you so directly. There's no need to grep for 'error' in the output.
If you're downloading multiple pages and want your script to see which ones succeeded, wget has the -nv option, which reports success or failure for each file in a simple line-by-line list.
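For example, a minimal sketch of checking the exit code rather than grepping (the URL is just a placeholder):

wget -nv 'http://site/page.html'
STATUS=$?
if [ "$STATUS" -ne 0 ]
then
        # wget exits nonzero on any failure, so no text matching is needed
        echo "wget failed with exit code $STATUS" >&2
fi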
The real problem is that they're not one stream but two: stdout is used for data, stderr is used for errors.
If you want them both to go to stdout: wget ... 2>&1
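For example, both of the following send wget's messages (which normally go to stderr) to the same place as stdout; the URL and log file name are placeholders:

wget -nv 'http://site/page.html' > wget.log 2>&1     # both streams into one file
wget -nv 'http://site/page.html' 2>&1 | less         # both streams through a pipe

Note the order in the first form: stdout is redirected to the file first, and only then is stderr pointed at wherever stdout now goes.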
10 More Discussions You Might Find Interesting
1. Shell Programming and Scripting
We are trying to invoke an https service from our unix script using the curl command. The service is not getting invoked because it is SSL configured. Bypassing certificate verification (using curl -k) does not work.
curl -k https://site
curl -k -x IP:Port https://site
curl -k -x IP:443 https://id:pwd@site
... (0 Replies)
Discussion started by: dineshbabu01
2. Shell Programming and Scripting
I need a proxy that would enable me to use CLI curl/wget with another IP address.
How do I find a paid proxy server that supports curl/wget? (1 Reply)
Discussion started by: locoroco
3. Shell Programming and Scripting
Hi,
I'm trying to write a script to download Red Hat's errata digest.
It comes in txt.gz format, and I can get it easily with Firefox.
HOWEVER: the output is VERY strange when downloading it in a script. It seems I'm getting a file of the same size - but partly text and partly binary! It... (5 Replies)
Discussion started by: jstilby
4. Shell Programming and Scripting
Hello,
I am wondering whether anyone knows of a method, using curl/wget or otherwise, whereby I could specify the IP address of the server I wish to query for a website.
Something similar to editing /etc/hosts, but that can be done directly from the command line. I have looked through the man pages... (4 Replies)
Discussion started by: colinireland
5. Shell Programming and Scripting
Hi
I need a shell script that will download a zip file every second from an HTTP server, but I can't use either curl or wget.
Can anyone help me go about this task???
Thanks!! (1 Reply)
Discussion started by: rubber08
6. Shell Programming and Scripting
Hi,
My script needs to crawl data from a third-party site. Currently it is written with wget. The third-party site has a shared interface with different IP addresses.
My wget works with all of the IP addresses except one, whereas curl is able to hit that IP address and comes out... (2 Replies)
Discussion started by: sathyaonnuix
7. Shell Programming and Scripting
Experts,
I log in to a 3rd party site and pull some valuable information with my credentials. I pass my credentials via --post-data in wget.
Now my account is locked. I want my wget to alert me that the account is locked. How can I achieve this?
My idea is to get the source page HTML from the... (2 Replies)
Discussion started by: sathyaonnuix
8. Shell Programming and Scripting
Hello,
What I am trying to do is get the HTML data of a website automatically.
First I decided to do it manually, and via the terminal I entered the code below:
$ wget http://www.***.*** -q -O code.html
Unfortunately the code.html file was empty.
When I entered the code below it gave Error 303-304
$... (1 Reply)
Discussion started by: baris35
9. Shell Programming and Scripting
I'm using this command to post data to a remote host:
wget --post-data="My Data" http://<my-ip>:80 -O /dev/null -q
and
curl --data "My Data" http://<my-ip>:80
However, when I run the above, I see the following in my access log on the remote host:
Wget:
10.10.10.10 - - "POST /... (1 Reply)
Discussion started by: SkySmart
10. Web Development
What can I use instead of wget/curl when I need to log in to websites that use JavaScript?
Wget and curl don't handle JavaScript. (6 Replies)
Discussion started by: locoroco
LEARN ABOUT DEBIAN
jigdo-lite
JIGDO-LITE(1) JIGDO-LITE(1)
NAME
jigdo-lite - Download jigdo files using wget
SYNOPSIS
jigdo-lite [ URL ]
DESCRIPTION
See jigdo-file(1) for an introduction to Jigsaw Download.
Given the URL of a `.jigdo' file, jigdo-lite downloads the large file (e.g. a CD image) that has been made available through that URL.
wget(1) is used to download the necessary pieces of administrative data (contained in the `.jigdo' file and a corresponding `.template'
file) as well as the many pieces that the large file is made from. The jigdo-file(1) utility is used to reconstruct the large file from the
pieces.
`.jigdo' files that contain references to Debian mirrors are treated specially: When such a file is recognized, you are asked to select one
mirror out of a list of all Debian mirrors.
If URL is not given on the command line, the script prompts for a location to download the `.jigdo' file from. The following command line
options are recognized:
-h --help
Output short summary of command syntax.
-v --version
Output version number.
--scan FILES
Do not ask for "Files to scan", use this path.
--noask
Do not ask any questions, instead behave as if the user had pressed Return at all prompts. This can be useful when running jigdo-lite from cron jobs or in other non-interactive environments.
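A hypothetical non-interactive run, for example from a cron job, might look like this (the URL and scan path are illustrative only, not taken from this manual page):

jigdo-lite --noask --scan /mnt/old-cd http://example.org/images/disc1.jigdo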
SEE ALSO
jigdo-file(1), jigdo-mirror(1), wget(1) (or `info wget')
CD images for Debian Linux can be downloaded with jigdo <URL:http://www.debian.org/CD/jigdo-cd/>.
AUTHOR
Jigsaw Download <URL:http://atterer.net/jigdo/> was written by Richard Atterer <jigdo atterer.net>, to make downloading of CD ROM images
for the Debian Linux distribution more convenient.
19 May 2006 JIGDO-LITE(1)