Full Discussion: Issues with wget
Post 302564222 by raghu.iv85 in UNIX for Advanced & Expert Users, Thursday 13 October 2011, 06:13:57 AM
The URL which I used for testing is definitely correct.

Could anyone let me know the exact reason for this error, and how to fix it, please?

Thanks

Regards,
VRN
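
The original error message isn't quoted above, so here is a general diagnostic sketch rather than a specific fix (the URL is a placeholder): asking wget for the server's response headers usually reveals whether the problem is a redirect, an authentication failure, or a plain 403/404.

    # Show the server's response headers without downloading the body
    wget --spider -S "http://example.com/path/to/file"

    # Full debug trace when the headers alone don't explain the failure
    wget -d -O /dev/null "http://example.com/path/to/file" 2>&1 | less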
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

wget -r

I have noticed a lot of expensive books appearing online, so I have decided to copy them to CD. I was going to write a program in Java to do this, but remembered the GNU wget program some of you were talking about, instead of spending two hours or so writing a program to do this.... (1 Reply)
Discussion started by: photon
1 Reply
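
For an offline copy like the one described above, a recursive fetch is usually enough. A minimal sketch, assuming the goal is a browsable local copy suitable for burning to CD (the URL is a placeholder):

    # Mirror the tree, stay below the start directory, and fix links for offline reading
    wget -r -np -k -p "http://example.com/books/"

Here -r recurses, -np (--no-parent) keeps wget from wandering up the tree, -k rewrites links to work locally, and -p also fetches page requisites such as images and stylesheets.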

2. Shell Programming and Scripting

wget help

I am trying to FTP files/dirs with wget. I am having an issue where the path always takes me to my home dir, even when I specify something else. For example: wget -m ftp://USER:PASS@IP_ADDRESS/Path/on/remote/box ...but if that path on the remote box isn't in my home dir, it doesn't change to... (0 Replies)
Discussion started by: djembeplayer
0 Replies
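
The behaviour described is standard: paths in FTP URLs are taken relative to the login directory. Per RFC 1738, a leading slash can be encoded as %2F to force an absolute path, a convention wget understands; a sketch reusing the placeholders from the post:

    # Relative to the FTP login directory (the default)
    wget -m 'ftp://USER:PASS@IP_ADDRESS/Path/on/remote/box'

    # Absolute from the server's root
    wget -m 'ftp://USER:PASS@IP_ADDRESS/%2FPath/on/remote/box'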

3. Shell Programming and Scripting

Help with wget

Hi, I need the temperature hourly from a web page, and I'm using wget to fetch it. I would like to save the downloaded page in a file called page. I check the file every time I run wget, but it isn't saved there; instead a wx.php file is created.... Each time I run it... a new wx.php file is... (2 Replies)
Discussion started by: vadharah
2 Replies
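
A fresh wx.php (then wx.php.1, and so on) per run is wget's default naming: it uses the remote file name and avoids clobbering existing files. Writing to a fixed local name is the usual fix; a minimal sketch (the URL is a placeholder):

    # Always (over)write the same local file, quietly
    wget -q -O page "http://example.com/wx.php"

-O (--output-document) sends the download to the named file regardless of the remote name, which also makes the command safe to run from an hourly cron job.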

4. Shell Programming and Scripting

wget

Hi, I want to download some files using wget and save them in a specified directory. Is there any way to do that? Please suggest. (1 Reply)
Discussion started by: mnmonu
1 Reply
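
wget's -P (--directory-prefix) option does exactly this; a short sketch with placeholder names:

    # Save the file under /data/downloads instead of the current directory
    wget -P /data/downloads "http://example.com/file.tar.gz"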

5. UNIX for Dummies Questions & Answers

Wget

...... (1 Reply)
Discussion started by: hoo
1 Reply

6. Shell Programming and Scripting

WGET help!

Hi friends, I have a URL like this: https://www.unix.com/help/ In this help directory, I have more than 300 directories, each of which contains a file or files. So, the 300 directories are like this: http://unix.com/help/ dir1 file1 dir2 file2 dir3 file3_1 file3_2... (4 Replies)
Discussion started by: jacobs.smith
4 Replies
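
A recursive fetch restricted to the help/ subtree should collect all 300 directories in one pass; a sketch based on the URL in the post, with the flattened local layout being an assumption about what the poster wants:

    # Recurse below /help/ only; drop the hostname and the leading 'help' component locally
    wget -r -np -nH --cut-dirs=1 -R "index.html*" https://www.unix.com/help/

-np keeps the crawl inside /help/, -nH suppresses the www.unix.com/ directory, and --cut-dirs=1 strips the help path component so dir1, dir2, ... land directly in the current directory.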

7. Red Hat

Wget

If I run the following command: wget -r --no-parent --reject "index.html*" 10.11.12.13/backups/ then a local directory named 10.11.12.13/backups containing the website data is created. What I want is to have the data placed in a local directory called $HOME/backups. Thanks for... (1 Reply)
Discussion started by: popeye
1 Reply
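
Adding -nH (--no-host-directories) and a -P prefix gives the layout asked for; a sketch built from the command in the post:

    # The remote /backups/ tree lands in $HOME/backups instead of ./10.11.12.13/backups
    wget -r --no-parent --reject "index.html*" -nH -P "$HOME" 10.11.12.13/backups/

-nH drops the 10.11.12.13/ directory and -P "$HOME" prefixes what remains, so the remote backups/ directory is created directly under $HOME.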

8. UNIX for Dummies Questions & Answers

Wget help

How can I download only *.zip and *.rar files from a website <index> that has multiple directories under the root parent directory? I need wget to crawl every directory and download only zip and rar files. Is there any way I could do it? (7 Replies)
Discussion started by: galford
7 Replies
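
wget's accept list (-A) covers this; a minimal sketch (the URL is a placeholder):

    # Crawl the whole tree but keep only .zip and .rar files
    wget -r -np -A 'zip,rar' "http://example.com/"

Note that wget still has to download the intermediate index pages to discover links; any page that doesn't match the -A list is deleted after it has been scanned.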

9. Shell Programming and Scripting

Wget and gz

Can wget be used to go to a site and be piped into a .gz extraction command? wget ftp://ftp.ncbi.nlm.nih.gov/pub/clinvar/vcf_GRCh37 | gunzip -d clinvar_20150603.vcf.gz (1 Reply)
Discussion started by: cmccabe
1 Reply
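
The command in the post has two problems: the URL points at the directory rather than a file, and gunzip is handed a filename, so it ignores the pipe entirely. A sketch of the intended pipeline, assuming the file named in the post lives in that directory:

    # Stream the archive to stdout and decompress on the fly
    wget -q -O - ftp://ftp.ncbi.nlm.nih.gov/pub/clinvar/vcf_GRCh37/clinvar_20150603.vcf.gz \
        | gunzip -c > clinvar_20150603.vcf

-O - sends the download to stdout and gunzip -c reads from stdin, so no intermediate .gz file is written to disk.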

10. Shell Programming and Scripting

Wget - working in browser but cannot download from wget

Hi, I need to download a zip file from the US government link below. https://www.sam.gov/SAMPortal/extractfiledownload?role=WW&version=SAM&filename=SAM_PUBLIC_MONTHLY_20160207.ZIP I only have the wget utility installed on the server. When I use the command below, I get a 403 error... (2 Replies)
Discussion started by: Prasannag87
2 Replies
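
A 403 from wget where a browser succeeds usually means the server is filtering on request headers, most often the User-Agent. A first thing to try (the UA string here is just an example):

    # Present a browser-like User-Agent; some servers reject wget's default one
    wget --user-agent="Mozilla/5.0" \
        'https://www.sam.gov/SAMPortal/extractfiledownload?role=WW&version=SAM&filename=SAM_PUBLIC_MONTHLY_20160207.ZIP'

If that isn't enough, the server may also require cookies or a Referer header (--header='Referer: ...'), both of which wget can supply.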
WMGRABIMGAE(1)                    General Commands Manual                    WMGRABIMGAE(1)

NAME
       WMGRABIMGAE - dockable WWW image monitor

SYNOPSIS
       wmGrabImage [-h] [-display <Display>] -url <Image URL> [-http <URL>] [-c] [-delay <Time>]

DESCRIPTION
       wmGrabImage is a WindowMaker DockApp that maintains a small thumbnail copy of your
       favorite image from the WWW. The image to monitor is specified via the
       "-url <Image URL>" command-line option, and it gets updated approximately every
       5 minutes. The update interval can be overridden via the "-delay <Time>"
       command-line option (Time is in seconds).

       Each of the three mouse buttons can be double-clicked, with the following effects:

       Left Mouse    Brings up the full-sized image in xv.
       Middle Mouse  Sends a URL (specified via the -http <URL> command-line option) to an
                     already running netscape process, or to a new netscape process if none
                     is running.
       Right Mouse   Updates the image immediately.

OPTIONS
       -h                Display the list of command-line options.
       -display <Display>
                         Use an alternate X display.
       -url <Image URL>  The URL of the WWW image to monitor.
       -http <URL>       The URL to send to netscape via a middle double-click.
       -c                Center the image vertically within the icon.
       -delay <Time>     The time between updates. The default is about 5 minutes.

FILES
       The original-sized image and the thumbnail XPM image are both stored in
       ~/.wmGrabImage/, which is created if it doesn't already exist.

SEE ALSO
       wget and the ImageMagick convert utility.

BUGS
       Who knows? It's still beta, though. (Let me know if you find any.) Oldish versions
       of the ImageMagick convert utility have a memory leak; if you have that problem,
       upgrade to the latest version.

AUTHOR
       Michael G. Henderson <mghenderson@lanl.gov>

16 December 1998                                                             WMGRABIMGAE(1)
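
Going by the SYNOPSIS above, a typical invocation might look like this (the image URL and interval are placeholders):

    # Dock a thumbnail of a webcam image, refreshed every 10 minutes
    wmGrabImage -url http://example.com/webcam.jpg -delay 600 &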