Shell Programming and Scripting: wget using wildcards using http
Post 69217 by vgersh99, Wednesday 13 April 2005, 09:42 AM
Quote:
Originally Posted by hcclnoodles
Hi there, probably a really simple question, but I want to download all .rpm files from a web repository which happens to be HTTP and not FTP.

I've tried using wget, but as far as I can see it doesn't allow for wildcards (i.e. wget http://address/*.rpm).

Does anybody know how I can get all these files in one hit?

PS - I've tried rsync as well, but to no avail.
Quote:
Originally Posted by man wget
-g on/off
--glob=on/off
    Turn FTP globbing on or off. By default, globbing will be
    turned on if the URL contains a globbing character (an
    asterisk, e.g.). Globbing means you may use the special
    characters (wildcards) to retrieve more files from the same
    directory at once, like wget ftp://gnjilux.cc.fer.hr/*.msg.
    Globbing currently works only on UNIX FTP servers.
Try quoting your URL so that it does not get expanded by the shell:
wget ..... 'http://address/*.rpm'
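Note that, per the man-page excerpt above, wget's globbing only works on FTP servers, so quoting alone will not make an HTTP wildcard URL work. Over HTTP the usual workaround is recursive retrieval with an accept list. A minimal sketch, assuming `address` is the placeholder host from the post (not a real server) and that the .rpm files are linked from an index page:

```shell
# Quoting keeps the shell from expanding the asterisk before wget sees it;
# wget's -g/--glob then handles the pattern -- but only for FTP URLs:
wget 'ftp://address/pub/*.rpm'

# HTTP has no directory-listing protocol, so globbing cannot work there.
# Instead, recurse one level from the index page and accept only .rpm files:
#   -r    recursive retrieval
#   -l1   limit recursion depth to 1
#   -nd   do not recreate the remote directory tree locally
#   -A    comma-separated accept list (suffixes or glob patterns)
wget -r -l1 -nd -A '*.rpm' 'http://address/repo/'
```

The single quotes matter in both cases: without them, the shell may try to expand `*.rpm` against local filenames before wget ever sees the URL.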
 

URIFIND(1p)              User Contributed Perl Documentation             URIFIND(1p)

NAME
    urifind - find URIs in a document and dump them to STDOUT

SYNOPSIS
    $ urifind file

DESCRIPTION
    urifind is a simple script that finds URIs in one or more files (using
    "URI::Find"), and outputs them to STDOUT. That's it.

    To find all the URIs in file1, use:

        $ urifind file1

    To find the URIs in multiple files, simply list them as arguments:

        $ urifind file1 file2 file3

    urifind will read from "STDIN" if no files are given or if a filename of
    "-" is specified:

        $ wget http://www.boston.com/ -O - | urifind

    When multiple files are listed, urifind prefixes each found URI with the
    file from which it came:

        $ urifind file1 file2
        file1: http://www.boston.com/index.html
        file2: http://use.perl.org/

    This can be turned on for single files with the "-p" ("prefix") switch:

        $ urifind -p file3
        file1: http://fsck.com/rt/

    It can also be turned off for multiple files with the "-n" ("no prefix")
    switch:

        $ urifind -n file1 file2
        http://www.boston.com/index.html
        http://use.perl.org/

    By default, URIs will be displayed in the order found; to sort them
    ascii-betically, use the "-s" ("sort") option. To reverse sort them, use
    the "-r" ("reverse") flag ("-r" implies "-s").

        $ urifind -s file1 file2
        http://use.perl.org/
        http://www.boston.com/index.html
        mailto:webmaster@boston.com

        $ urifind -r file1 file2
        mailto:webmaster@boston.com
        http://www.boston.com/index.html
        http://use.perl.org/

    Finally, urifind supports limiting the returned URIs by scheme or by
    arbitrary pattern, using the "-S" option (for schemes) and the "-P"
    option. Both "-S" and "-P" can be specified multiple times:

        $ urifind -S mailto file1
        mailto:webmaster@boston.com

        $ urifind -S mailto -S http file1
        mailto:webmaster@boston.com
        http://www.boston.com/index.html

    "-P" takes an arbitrary Perl regex. It might need to be protected from
    the shell:

        $ urifind -P 's?html?' file1
        http://www.boston.com/index.html

        $ urifind -P '.org' -S http file4
        http://www.gnu.org/software/wget/wget.html

    Add a "-d" to have urifind dump the regexen generated from "-S" and "-P"
    to "STDERR". "-D" does the same but exits immediately:

        $ urifind -P '.org' -S http -D
        $scheme = '^(http):'
        @pats = ('^(http):', '.org')

    To remove duplicates from the results, use the "-u" ("unique") switch.

OPTION SUMMARY
    -s          Sort results.
    -r          Reverse sort results (implies -s).
    -u          Return unique results only.
    -n          Don't include filename in output.
    -p          Include filename in output (0 by default, but 1 if multiple
                files are included on the command line).
    -P $re      Print only lines matching regex '$re' (may be specified
                multiple times).
    -S $scheme  Only this scheme (may be specified multiple times).
    -h          Help summary.
    -v          Display version and exit.
    -d          Dump compiled regexes for "-S" and "-P" to "STDERR".
    -D          Same as "-d", but exit after dumping.

AUTHOR
    darren chamberlain <darren@cpan.org>

COPYRIGHT
    (C) 2003 darren chamberlain

    This library is free software; you may distribute it and/or modify it
    under the same terms as Perl itself.

SEE ALSO
    URI::Find

perl v5.14.2                        2012-04-08                        URIFIND(1p)
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.