Full Discussion: Wget
Operating Systems > Linux > Red Hat > Wget | Post 302823805 by Yoda, Thursday, June 20, 2013, 12:04:35 AM
Check the wget manual and look for the --directory-prefix option.
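A minimal sketch of that option in use (the "downloads" directory name and the example.com URL are placeholders, not from the thread):

```shell
# Save downloads into a chosen directory instead of the current one.
# "downloads" and the example.com URL are placeholders.
mkdir -p downloads
# -P is the short form of --directory-prefix; "|| true" lets the
# sketch finish even when run without network access.
wget -q -P downloads "https://example.com/" || true
ls downloads
```

wget creates the prefix directory itself if it does not already exist, so the mkdir above is only there to make the final listing work even when the download fails.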

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

wget -r

I have noticed a lot of expensive books appearing online, so I have decided to copy them to CD. I was going to write a program in Java to do this, but remembered the GNU wget program some of you guys were talking about, instead of spending two hours or so writing a program to do this.... (1 Reply)
Discussion started by: photon

2. Shell Programming and Scripting

wget help

I am trying to FTP files/dirs with wget. I am having an issue where the path always takes me to my home dir, even when I specify something else. For example: wget -m ftp://USER:PASS@IP_ADDRESS/Path/on/remote/box ...but if that path on the remote box isn't in my home dir, it doesn't change to... (0 Replies)
Discussion started by: djembeplayer

3. Shell Programming and Scripting

Help with wget

Hi, I need the temperature hourly from a web page. I'm using wget to get the web page, and I would like to save the downloaded page in a file called page. I check the file every time I run the wget function, but it's not saving; instead it creates a wx.php file.... Each time I run it... a new wx.php file is... (2 Replies)
Discussion started by: vadharah

4. Shell Programming and Scripting

wget

Hi, I want to download some files using wget and save them in a specified directory. Is there any way to do that? Please suggest. (1 Reply)
Discussion started by: mnmonu

5. Shell Programming and Scripting

wget help?

Can someone please help in understanding this shell script? wget --progress=dot:mega --cut-dirs=4 -r -c -nH -np --reject index.html*,icons/*.gif \ http://*****.oz.xxxxx.com:<portnum>/omcsm/releases/dew/${UPGRADE_VERSION}/ (1 Reply)
Discussion started by: dnam9917

6. UNIX for Dummies Questions & Answers

Wget

...... (1 Reply)
Discussion started by: hoo

7. Shell Programming and Scripting

WGET help!

Hi Friends, I have a URL like this: https://www.unix.com/help/ In this help directory, I have more than 300 directories, each containing one or more files. So, the 300 directories are like this: http://unix.com/help/ dir1 file1 dir2 file2 dir3 file3_1 file3_2... (4 Replies)
Discussion started by: jacobs.smith

8. UNIX for Dummies Questions & Answers

Wget help

How can I download only *.zip and *.rar files from a website <index> that has multiple directories under the root parent directory? I need wget to crawl every directory and download only zip and rar files. Is there any way I could do it? (7 Replies)
Discussion started by: galford

9. Shell Programming and Scripting

Wget and gz

Can wget be used to go to a site and pipe the result into a .gz extraction command? wget ftp://ftp.ncbi.nlm.nih.gov/pub/clinvar/vcf_GRCh37 | gunzip -d clinvar_20150603.vcf.gz (1 Reply)
Discussion started by: cmccabe
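For what it's worth, a pipe like the one quoted above only works when wget writes the archive to stdout with -O -; a hedged sketch (the exact filename under the NCBI directory is not given in the post, so <file> stays a placeholder):

```shell
# wget writes to a file by default; to pipe into gunzip, send the
# download to stdout with -O - (or the shorthand -qO-):
#   wget -qO- "ftp://ftp.ncbi.nlm.nih.gov/pub/clinvar/vcf_GRCh37/<file>.gz" | gunzip -c > out.vcf
# The same decompress-on-a-pipe pattern, demonstrated locally
# without a network fetch:
printf 'demo line\n' | gzip -c | gunzip -c
```

Note that gunzip -d expects a filename argument, so it cannot be combined with a pipe the way the quoted command tries to; gunzip -c (or zcat) reads the compressed stream from stdin instead.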

10. Shell Programming and Scripting

Wget - working in browser but cannot download from wget

Hi, I need to download a zip file from the below US govt link. https://www.sam.gov/SAMPortal/extractfiledownload?role=WW&version=SAM&filename=SAM_PUBLIC_MONTHLY_20160207.ZIP I only have the wget utility installed on the server. When I use the below command, I am getting error 403... (2 Replies)
Discussion started by: Prasannag87
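A 403 from wget while a browser succeeds often means the server is filtering on request headers; a speculative sketch (the browser-like User-Agent string is an assumption, not a guaranteed fix for this particular site):

```shell
# Some servers reject wget's default User-Agent; sending a
# browser-like one is a common (not guaranteed) workaround for 403
# responses. The URL is from the post above; "|| true" keeps the
# sketch from aborting when run without network access.
wget --user-agent="Mozilla/5.0" \
     "https://www.sam.gov/SAMPortal/extractfiledownload?role=WW&version=SAM&filename=SAM_PUBLIC_MONTHLY_20160207.ZIP" || true
```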
URIFIND(1p)                User Contributed Perl Documentation                URIFIND(1p)

NAME
    urifind - find URIs in a document and dump them to STDOUT

SYNOPSIS
    $ urifind file

DESCRIPTION
    urifind is a simple script that finds URIs in one or more files (using
    "URI::Find") and outputs them to STDOUT. That's it.

    To find all the URIs in file1, use:

        $ urifind file1

    To find the URIs in multiple files, simply list them as arguments:

        $ urifind file1 file2 file3

    urifind will read from "STDIN" if no files are given or if a filename of
    "-" is specified:

        $ wget http://www.boston.com/ -O - | urifind

    When multiple files are listed, urifind prefixes each found URI with the
    file from which it came:

        $ urifind file1 file2
        file1: http://www.boston.com/index.html
        file2: http://use.perl.org/

    This can be turned on for single files with the "-p" ("prefix") switch:

        $ urifind -p file3
        file3: http://fsck.com/rt/

    It can also be turned off for multiple files with the "-n" ("no prefix")
    switch:

        $ urifind -n file1 file2
        http://www.boston.com/index.html
        http://use.perl.org/

    By default, URIs will be displayed in the order found; to sort them
    ascii-betically, use the "-s" ("sort") option. To reverse sort them, use
    the "-r" ("reverse") flag ("-r" implies "-s").

        $ urifind -s file1 file2
        http://use.perl.org/
        http://www.boston.com/index.html
        mailto:webmaster@boston.com

        $ urifind -r file1 file2
        mailto:webmaster@boston.com
        http://www.boston.com/index.html
        http://use.perl.org/

    Finally, urifind supports limiting the returned URIs by scheme or by
    arbitrary pattern, using the "-S" option (for schemes) and the "-P"
    option. Both "-S" and "-P" can be specified multiple times:

        $ urifind -S mailto file1
        mailto:webmaster@boston.com

        $ urifind -S mailto -S http file1
        mailto:webmaster@boston.com
        http://www.boston.com/index.html

    "-P" takes an arbitrary Perl regex. It might need to be protected from
    the shell:

        $ urifind -P 's?html?' file1
        http://www.boston.com/index.html

        $ urifind -P '.org' -S http file4
        http://www.gnu.org/software/wget/wget.html

    Add a "-d" to have urifind dump the regexes generated from "-S" and "-P"
    to "STDERR". "-D" does the same but exits immediately:

        $ urifind -P '.org' -S http -D
        $scheme = '^(http):'
        @pats = ('^(http):', '.org')

    To remove duplicates from the results, use the "-u" ("unique") switch.

OPTION SUMMARY
    -s          Sort results.
    -r          Reverse sort results (implies -s).
    -u          Return unique results only.
    -n          Don't include the filename in output.
    -p          Include the filename in output (0 by default, but 1 if
                multiple files are included on the command line).
    -P $re      Print only lines matching regex '$re' (may be specified
                multiple times).
    -S $scheme  Only this scheme (may be specified multiple times).
    -h          Help summary.
    -v          Display version and exit.
    -d          Dump compiled regexes for "-S" and "-P" to "STDERR".
    -D          Same as "-d", but exit after dumping.

AUTHOR
    darren chamberlain <darren@cpan.org>

COPYRIGHT
    (C) 2003 darren chamberlain

    This library is free software; you may distribute it and/or modify it
    under the same terms as Perl itself.

SEE ALSO
    URI::Find

perl v5.14.2                       2012-04-08                       URIFIND(1p)
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.