I'd like to point out a few useful options I have used when the connection was not stable:
-t to specify the number of retries
-c to continue a download where it was aborted before
For example:
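A minimal sketch of how those options combine (the URL is a placeholder, not one from this thread):

```shell
# -t 5   retry up to 5 times on a flaky connection
# -c     continue a partially downloaded file instead of restarting it
# -T 30  treat a 30-second stall (DNS, connect, or read) as a failure
wget -t 5 -c -T 30 http://example.com/large-file.iso
```

If the transfer dies partway through, re-running the same command with -c picks up from the bytes already on disk.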
Hi All,
I think the wget command will not download any directories, but please confirm. If it can download directories, please let me know how to do it.
Thank you. (1 Reply)
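For what it's worth, wget can fetch directory trees via recursive retrieval; a hedged sketch with a placeholder URL:

```shell
# -r    recurse into linked pages and subdirectories
# -np   never ascend above the starting directory
# -l 3  limit recursion depth to 3 levels
wget -r -np -l 3 http://example.com/pub/docs/
```

This mirrors the remote layout under a local directory named after the host.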
Hi,
I ran this command (wget) and it ran fine on some servers,
but on some servers it didn't run.
......
How do I get it to run on those servers?
Thanks
:( (1 Reply)
I'm using the "wget" command to get the date from Yahoo.com. So this is what I use on Solaris:
/usr/sfw/bin/wget --timeout=3 -S Yahoo!
This works well when my computer is connected to the Net. But when it's not, this command just hangs. I thought putting timeout=3 would make this... (2 Replies)
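One likely reason for the hang is retrying: --timeout limits each attempt, but wget then retries. Capping the attempts as well should make it fail fast when offline; a hedged sketch (URL is a placeholder):

```shell
# --timeout=3 bounds the DNS, connect, and read phases of each attempt;
# --tries=1 stops wget from retrying when the first attempt fails.
# -S prints the server's response headers (on stderr).
/usr/sfw/bin/wget --timeout=3 --tries=1 -S -O /dev/null http://www.yahoo.com/
```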
I was recently reading the wget manual and there was an option "--bind-address", and I read about TCP/IP binding, but I don't understand one thing: what is the use of a binding address in wget?
Can anyone help me with this? (6 Replies)
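As I understand it, --bind-address only matters on multihomed hosts (machines with more than one local IP): it chooses which local address the outgoing connection originates from. A hedged sketch, with a placeholder address and URL:

```shell
# Make the request originate from the interface that owns 192.0.2.10,
# e.g. when routing rules or a firewall treat the two interfaces differently.
wget --bind-address=192.0.2.10 http://example.com/file.txt
```

On a host with a single IP it has no visible effect.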
I need to get the current date off a remote site, such as Google or Yahoo.
Does anyone know how to use the wget command on a Solaris 10 system to do this? I recall from a long time ago that using "wget" will get a bunch of info off a site, and then you can extract the date from all of that info.
... (6 Replies)
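One hedged approach: `wget -S` prints the response headers (on stderr), and HTTP servers normally include a `Date:` header, which can be grepped out. The fetch itself needs a network, so the parsing step is shown below on an illustrative captured sample:

```shell
# Live version (commented out; needs network access):
# headers=$(wget --timeout=3 --tries=1 -S -O /dev/null http://www.google.com/ 2>&1)

# Illustrative sample of what a server typically returns:
headers="  HTTP/1.1 200 OK
  Date: Mon, 02 Jan 2006 15:04:05 GMT
  Content-Type: text/html"

# Pull out the value of the Date header.
date_line=$(printf '%s\n' "$headers" | grep -i '^ *Date:' | sed 's/^ *Date: *//')
echo "$date_line"   # Mon, 02 Jan 2006 15:04:05 GMT
```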
Hi All,
While using the command below I am getting some unusual characters in the Release.txt file. How can I remove them, or stop them from going into the Release.txt file?
wget -q http://m0010v.prod.wspan.com/nggfmonatl/Default.aspx
cat Default.aspx | egrep -in "EFS|HOTFIX" | awk -F/ '{print $(NF-1)}' | cut -d... (1 Reply)
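Those "unusual characters" are often carriage returns or other non-printable bytes in the downloaded page. A hedged sketch of filtering them with tr, using a simulated input line in place of the real Default.aspx:

```shell
# Simulated line containing a carriage return and a control byte.
dirty=$(printf 'EFS/build\r/123\001')

# Delete (-d) everything not in (-c) the set of printable chars, tabs, newlines.
clean=$(printf '%s' "$dirty" | tr -cd '[:print:]\t\n')
echo "$clean"   # EFS/build/123
```

Inserting this `tr` stage into the existing pipeline, before writing Release.txt, should keep the stray bytes out of the file.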
Hi,
I am using the "wget" command to hit a URL, i.e. a servlet URL.
I can trigger the servlet using wget, but when the servlet is not responding this command retries automatically until it gets a positive response from the server.
So this script runs for more than 8 hrs to get the positive... (2 Replies)
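wget's default is to retry failed requests up to 20 times, and stalled connections can stretch each attempt out, which is how a script ends up running for hours. Capping both is the usual fix; a hedged sketch with a placeholder URL:

```shell
# -t 3   give up after 3 attempts instead of the default 20
# -T 60  treat any 60-second stall (DNS, connect, or read) as a failure
wget -t 3 -T 60 http://example.com/servlet/MyServlet || echo "servlet did not respond"
```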
Hello friends,
I've been working on a Solaris server.
I need to test the responses of a web service using the wget command: whether the response is successful, how quick it is, etc.
I have a script like this; I modified it, and I tried to redirect the output of the time command to total.txt but I couldn't manage, I... (4 Replies)
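A common stumbling block here: in bash, `time` reports on stderr, and you have to redirect the whole compound command, not the command inside it. A hedged sketch of the pattern, timing a local command so the redirection itself is the point:

```shell
# The braces group the command so that 2> captures time's report
# (written to stderr) into total.txt.
{ time sleep 1 ; } 2> total.txt

# With wget it would look like (placeholder URL):
# { time wget -q http://example.com/service ; } 2>> total.txt

grep -c real total.txt   # the "real" line is the elapsed wall-clock time
```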
If there were 3 files put in a folder on /Desktop/Test and then transferred to a site,
would
gzip -r /Desktop/Test zip them so that
wget --http-user cmccabe --http-passwd xxxx*** https://something.sharefile.com/login.aspx -O - | tar -zxf - could be used to connect to the site, log in,... (6 Replies)
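One caveat worth noting: `gzip -r` compresses each file in the folder individually (f1.gz, f2.gz, ...) and does not produce a single archive that `tar -zxf -` can unpack. For that you would build a compressed tarball first; a sketch with placeholder file names:

```shell
# Build a sample folder with 3 files.
mkdir -p Test
printf 'a' > Test/f1
printf 'b' > Test/f2
printf 'c' > Test/f3

# Pack the whole folder into one gzip-compressed tar archive.
tar -czf Test.tar.gz Test

# Listing it shows the directory entry plus the three files,
# exactly what "| tar -zxf -" on the receiving side expects.
tar -tzf Test.tar.gz | sort
```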
Hi,
I need help downloading an Excel file from the internet and converting it to a text file.
wget http://spreadsheetpage.com/downloads/xl/wordfrequency.xls (4 Replies)
Discussion started by: raghur77
httpindex(1)                   General Commands Manual                   httpindex(1)

NAME
httpindex - HTTP front-end for SWISH++ indexer
SYNOPSIS
wget [ options ] URL... 2>&1 | httpindex [ options ]
DESCRIPTION
httpindex is a front-end for index++(1) to index files copied from remote servers using wget(1). The files (in a copy of the remote
directory structure) can be kept, deleted, or replaced with their descriptions after indexing.
OPTIONS
wget Options
The wget(1) options that are required are: -A, -nv, -r, and -x; the ones that are highly recommended are: -l, -nh, -t, and -w. (See the
EXAMPLE.)
httpindex Options
httpindex accepts the same short options as index++(1) except for -H, -I, -l, -r, -S, and -V.
The following options are unique to httpindex:
-d Replace the text of local copies of retrieved files with their descriptions after they have been indexed. This is useful to display
file descriptions in search results without having to keep complete copies of the remote files, thus saving filesystem space. (See
the extract_description() function in WWW(3) for details about how descriptions are extracted.)
-D Delete the local copies of retrieved files after they have been indexed. This prevents your local filesystem from filling up with
copies of remote files.
EXAMPLE
To index all HTML and text files on a remote web server keeping descriptions locally:
wget -A html,txt -linf -t2 -rxnv -nh -w2 http://www.foo.com 2>&1 |
httpindex -d -e'html:*.html,text:*.txt'
Note that you need to redirect wget(1)'s output from standard error to standard output in order to pipe it to httpindex.
EXIT STATUS
Exits with a value of zero only if indexing completed successfully; non-zero otherwise.
CAVEATS
In addition to those for index++(1), httpindex does not correctly handle the use of multiple -e, -E, -m, or -M options (because the Perl
script uses the standard GetOpt::Std package for processing command-line options, which doesn't). The last of any of those options ``wins.''
The work-around is to give multiple values, separated by commas, to a single one of those options. For example, if you want
to do:
httpindex -e'html:*.html' -e'text:*.txt'
do this instead:
httpindex -e'html:*.html,text:*.txt'
SEE ALSO
index++(1), wget(1), WWW(3)

AUTHOR
Paul J. Lucas <pauljlucas@mac.com>
SWISH++ August 2, 2005 httpindex(1)