Full Discussion: WGET help!
Post 302706397 by jacobs.smith on Wednesday 26th of September 2012, 04:01:01 PM
I googled around and found this:


Code:
wget -r -l 0 -U Mozilla -t 1 -nd  -A file_extension "url.com" -e robots=off
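For anyone skimming, here is roughly what each of those switches does (worth double-checking against man wget on your system); the URL and the pdf suffix below are only placeholders:

Code:
# -r             : recursive retrieval
# -l 0           : recursion depth 0 = unlimited
# -U Mozilla     : send "Mozilla" as the User-Agent header
# -t 1           : give up on a file after a single try
# -nd            : no directory hierarchy; save every file into the current directory
# -A pdf         : accept only files with this suffix (a comma-separated list also works)
# -e robots=off  : run "robots=off" as a .wgetrc command, i.e. ignore robots.txt
wget -r -l 0 -U Mozilla -t 1 -nd -A pdf "http://example.com/" -e robots=off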

 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

wget -r

I have noticed a lot of expensive books appearing online, so I have decided to copy them to CD. I was going to write a Java program to do this, but remembered the GNU wget program some of you were talking about. Instead of spending two hours or so writing a program to do this.... (1 Reply)
Discussion started by: photon
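For a job like that, a common starting point (only a sketch; the URL, depth, and options depend on the site being copied) is a recursive fetch that rewrites links for offline reading:

Code:
# -r   : recurse into linked pages
# -l 5 : limit recursion depth (raise as needed)
# -np  : never ascend to the parent directory
# -k   : convert links so the local copy browses correctly offline
# -p   : also fetch the images/CSS needed to render each page
wget -r -l 5 -np -k -p "http://example.com/book/"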

2. Shell Programming and Scripting

wget help

I am trying to FTP files/dirs with wget. I am having an issue where the path always takes me to my home dir even when I specify something else. For example: wget -m ftp://USER:PASS@IP_ADDRESS/Path/on/remote/box ...but if that path on the remote box isn't in my home dir it doesn't change to... (0 Replies)
Discussion started by: djembeplayer
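One likely explanation: in an FTP URL the path is interpreted relative to the login directory. If the data really lives elsewhere, the usual trick (per RFC 1738, worth verifying against your server) is to encode the leading slash of an absolute path as %2F; the path below is a placeholder:

Code:
# The leading %2F makes the path absolute (rooted at "/") instead of relative to the FTP home dir
wget -m "ftp://USER:PASS@IP_ADDRESS/%2Fabsolute/path/on/remote/box/"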

3. Shell Programming and Scripting

Help with wget

Hi, I need the temperature hourly from a web page, and I'm using wget to fetch it. I would like to save the downloaded page in a file called page. I check the file every time I run wget, but it isn't saving there; instead it creates a wx.php file....Each time I run it...a new wx.php file is... (2 Replies)
Discussion started by: vadharah
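The fix being asked about here is almost certainly wget's -O option, which writes the response to a name you choose instead of the remote file name (the URL shown is a placeholder for the actual weather page):

Code:
# -O page : save the fetched document as ./page, overwriting it on each run
wget -O page "http://example.com/wx.php"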

4. Shell Programming and Scripting

wget

Hi, I want to download some files using wget and save them in a specified directory. Is there any way to do that? Please suggest. (1 Reply)
Discussion started by: mnmonu
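wget's -P (--directory-prefix) option does exactly this; the directory and URL below are placeholders:

Code:
# -P sets the directory under which downloaded files are saved
wget -P /path/to/dir "http://example.com/file.tar.gz"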

5. Shell Programming and Scripting

wget help?

Can someone please help in understanding this shell script? wget --progress=dot:mega --cut-dirs=4 -r -c -nH -np --reject index.html*,icons/*.gif \ http://*****.oz.xxxxx.com:<portnum>/omcsm/releases/dew/${UPGRADE_VERSION}/ (1 Reply)
Discussion started by: dnam9917
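A rough annotation of those options (the masked URL stays exactly as posted; see man wget for the authoritative wording):

Code:
# --progress=dot:mega : dot-style progress meter tuned for large downloads
# --cut-dirs=4        : drop the first 4 remote directory components when building local paths
# -r                  : recursive retrieval
# -c                  : continue (resume) partially downloaded files
# -nH                 : do not create a host-name directory locally
# -np                 : never ascend to the parent directory
# --reject ...        : skip files matching these patterns (index.html*, icons/*.gif)
wget --progress=dot:mega --cut-dirs=4 -r -c -nH -np \
     --reject 'index.html*,icons/*.gif' \
     "http://*****.oz.xxxxx.com:<portnum>/omcsm/releases/dew/${UPGRADE_VERSION}/"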

6. UNIX for Dummies Questions & Answers

Wget

...... (1 Reply)
Discussion started by: hoo

7. Red Hat

Wget

If I run the following command: wget -r --no-parent --reject "index.html*" 10.11.12.13/backups/ ...a local directory named 10.11.12.13/backups containing the web site data is created. What I want is for the data to be placed in a local directory called $HOME/backups. Thanks for... (1 Reply)
Discussion started by: popeye
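One way to get that layout (a sketch; --cut-dirs=1 assumes backups/ is the only remote path component to strip):

Code:
# -nH          : don't create the 10.11.12.13/ host directory locally
# --cut-dirs=1 : drop the leading "backups/" remote component
# -P           : save everything under $HOME/backups
wget -r --no-parent --reject "index.html*" -nH --cut-dirs=1 -P "$HOME/backups" http://10.11.12.13/backups/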

8. UNIX for Dummies Questions & Answers

Wget help

How can I download only *.zip and *.rar files from a website <index> that has multiple directories under the root parent directory? I need wget to crawl every directory and download only zip and rar files. Is there any way I could do it? (7 Replies)
Discussion started by: galford
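A recursive accept-list crawl is the usual answer here (sketch only; the URL is a placeholder and the depth/robots options depend on the site):

Code:
# -r           : recurse through the linked directories
# -np          : stay below the starting directory
# -nd          : put all matched files in the current directory
# -A 'zip,rar' : keep only these suffixes; other pages fetched for crawling are removed afterwards
wget -r -np -nd -A 'zip,rar' "http://example.com/index/"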

9. Shell Programming and Scripting

Wget and gz

Can wget be used to go to a site and be piped into a .gz extraction command? wget ftp://ftp.ncbi.nlm.nih.gov/pub/clinvar/vcf_GRCh37 | gunzip -d clinvar_20150603.vcf.gz (1 Reply)
Discussion started by: cmccabe
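As posted, that command fetches a directory listing and hands gunzip a local filename that was never downloaded. One way to stream the compressed file straight through gunzip (the exact filename under vcf_GRCh37/ is assumed here, so adjust it to the real listing):

Code:
# -q   : quiet
# -O - : write the download to stdout so it can be piped
wget -qO- "ftp://ftp.ncbi.nlm.nih.gov/pub/clinvar/vcf_GRCh37/clinvar_20150603.vcf.gz" | gunzip -c > clinvar_20150603.vcf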

10. Shell Programming and Scripting

Wget - works in browser but cannot download with wget

Hi, I need to download a zip file from the US govt link below. https://www.sam.gov/SAMPortal/extractfiledownload?role=WW&version=SAM&filename=SAM_PUBLIC_MONTHLY_20160207.ZIP I only have the wget utility installed on the server. When I use the below command, I get a 403 error... (2 Replies)
Discussion started by: Prasannag87
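A 403 from a URL that works in a browser often means the server is rejecting non-browser request headers; with no way to inspect the server, a first guess is to send a browser-like User-Agent (and quote the URL so the shell doesn't split it at the ampersands):

Code:
# -U sets the User-Agent header; some servers refuse wget's default identity
wget -U "Mozilla/5.0" "https://www.sam.gov/SAMPortal/extractfiledownload?role=WW&version=SAM&filename=SAM_PUBLIC_MONTHLY_20160207.ZIP"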
WWW::RobotRules(3)          User Contributed Perl Documentation          WWW::RobotRules(3)

NAME
    WWW::RobotRules - database of robots.txt-derived permissions

SYNOPSIS
    use WWW::RobotRules;
    my $rules = WWW::RobotRules->new('MOMspider/1.0');

    use LWP::Simple qw(get);

    {
        my $url = "http://some.place/robots.txt";
        my $robots_txt = get $url;
        $rules->parse($url, $robots_txt) if defined $robots_txt;
    }

    {
        my $url = "http://some.other.place/robots.txt";
        my $robots_txt = get $url;
        $rules->parse($url, $robots_txt) if defined $robots_txt;
    }

    # Now we can check if a URL is valid for those servers
    # whose "robots.txt" files we've gotten and parsed:
    if ($rules->allowed($url)) {
        $c = get $url;
        ...
    }

DESCRIPTION
    This module parses /robots.txt files as specified in "A Standard for Robot Exclusion", at
    <http://www.robotstxt.org/wc/norobots.html>. Webmasters can use the /robots.txt file to
    forbid conforming robots from accessing parts of their web site.

    The parsed files are kept in a WWW::RobotRules object, and this object provides methods to
    check if access to a given URL is prohibited. The same WWW::RobotRules object can be used
    for one or more parsed /robots.txt files on any number of hosts.

    The following methods are provided:

    $rules = WWW::RobotRules->new($robot_name)
        This is the constructor for WWW::RobotRules objects. The first argument given to new()
        is the name of the robot.

    $rules->parse($robots_txt_url, $content, $fresh_until)
        The parse() method takes as arguments the URL that was used to retrieve the /robots.txt
        file, and the contents of the file.

    $rules->allowed($uri)
        Returns TRUE if this robot is allowed to retrieve this URL.

    $rules->agent([$name])
        Get/set the agent name. NOTE: Changing the agent name will clear the robots.txt rules
        and expire times out of the cache.

ROBOTS.TXT
    The format and semantics of the "/robots.txt" file are as follows (this is an edited
    abstract of <http://www.robotstxt.org/wc/norobots.html>):

    The file consists of one or more records separated by one or more blank lines. Each record
    contains lines of the form

        <field-name>: <value>

    The field name is case insensitive. Text after the '#' character on a line is ignored
    during parsing. This is used for comments. The following <field-names> can be used:

    User-Agent
        The value of this field is the name of the robot the record is describing access
        policy for. If more than one User-Agent field is present the record describes an
        identical access policy for more than one robot. At least one field needs to be
        present per record. If the value is '*', the record describes the default access
        policy for any robot that has not matched any of the other records.

        The User-Agent fields must occur before the Disallow fields. If a record contains a
        User-Agent field after a Disallow field, that constitutes a malformed record. This
        parser will assume that a blank line should have been placed before that User-Agent
        field, and will break the record into two. All the fields before the User-Agent field
        will constitute a record, and the User-Agent field will be the first field in a new
        record.

    Disallow
        The value of this field specifies a partial URL that is not to be visited. This can be
        a full path, or a partial path; any URL that starts with this value will not be
        retrieved.

    Unrecognized records are ignored.

ROBOTS.TXT EXAMPLES
    The following example "/robots.txt" file specifies that no robots should visit any URL
    starting with "/cyberworld/map/" or "/tmp/":

        User-agent: *
        Disallow: /cyberworld/map/ # This is an infinite virtual URL space
        Disallow: /tmp/ # these will soon disappear

    This example "/robots.txt" file specifies that no robots should visit any URL starting
    with "/cyberworld/map/", except the robot called "cybermapper":

        User-agent: *
        Disallow: /cyberworld/map/ # This is an infinite virtual URL space

        # Cybermapper knows where to go.
        User-agent: cybermapper
        Disallow:

    This example indicates that no robots should visit this site further:

        # go away
        User-agent: *
        Disallow: /

    This is an example of a malformed robots.txt file.

        # robots.txt for ancientcastle.example.com
        # I've locked myself away.
        User-agent: *
        Disallow: /
        # The castle is your home now, so you can go anywhere you like.
        User-agent: Belle
        Disallow: /west-wing/ # except the west wing!
        # It's good to be the Prince...
        User-agent: Beast
        Disallow:

    This file is missing the required blank lines between records. However, the intention is
    clear.

SEE ALSO
    LWP::RobotUA, WWW::RobotRules::AnyDBM_File

COPYRIGHT
    Copyright 1995-2009, Gisle Aas
    Copyright 1995, Martijn Koster

    This library is free software; you can redistribute it and/or modify it under the same
    terms as Perl itself.

perl v5.16.2                              2012-02-18                    WWW::RobotRules(3)