Full Discussion: Please Help Guys Important
Post 302177983 by Franklin52 on Monday 24th of March 2008 08:49:43 AM
zanetti321,

Please read the forum rules carefully and stop bumping your question!

https://www.unix.com/unix-dummies-que...om-forums.html

Regards
 

10 More Discussions You Might Find Interesting

1. UNIX for Advanced & Expert Users

I need you guys' help

Hi. I just accidentally wiped out my hard drive when I installed Sun Solaris. This is the last thing I remember: it asked me if I wanted to delete the partition table, and I said yes ("I'm a retard"); I thought Solaris couldn't see the XP partition table. Could anyone help me recover my data back, is it really... (21 Replies)
Discussion started by: souldier

2. BSD

Thank you guys for the advice...

Thank you guys for the advice; I could finally connect to the internet with FreeBSD. This is what I did: I did everything the handbook said and still couldn't connect, then I installed a new Ethernet card that supports FreeBSD and did everything the manual said about how to install the... (0 Replies)
Discussion started by: nobody

3. UNIX for Dummies Questions & Answers

Hi again guys

Actually, I have an important question about Unix/Linux. I'm working with Visual Basic and Visual C++, and I heard from someone that I can't open them while using Unix/Linux. Is that right? If yes, what's the solution? If no... thanks :) :D :D Thanks, my friends. (1 Reply)
Discussion started by: M_Hafez

4. AIX

Thanks guys

Hi guys, I would like to inform you that I have cleared the IBM Certified Specialist - p5 and pSeries Administration and Support for AIX 5L V5.3 exam with 89%, thanks to you all. Manu (0 Replies)
Discussion started by: b_manu78

5. Shell Programming and Scripting

Help guys!

Hello guys, I have written the following script to do a certain job. I have more than 300 files, all .pdb and .out files, and the files are (1,3,5,7,11,13,15,17,21,.......787,791,793,795).pdb / .out. But the way I created the for loop, my script works on only one file at a time. But that is not... (2 Replies) (see the sketch below)
Discussion started by: chuchu
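The thread's actual script is truncated above, but the usual fix is to glob the whole set of files in one loop instead of naming one file per run. A minimal Perl sketch under that assumption; the per-pair work is a placeholder:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Process every NNN.pdb / NNN.out pair in the current directory.
    # glob() pulls in all matches in one pass, so the loop covers all
    # ~300 files instead of a single one per invocation.
    for my $pdb (glob '*.pdb') {
        (my $out = $pdb) =~ s/\.pdb$/.out/;
        unless (-e $out) {
            warn "skipping $pdb: no matching $out\n";
            next;
        }
        # Placeholder for whatever per-pair work the real script does.
        print "processing $pdb with $out\n";
    }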

6. UNIX for Advanced & Expert Users

Hi guys...

I want a Bash shell script for taking a backup of all files created today and killing all of my processes still active in the evening, every day. (1 Reply) (see the sketch below)
Discussion started by: vinayraj
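The poster asked for Bash; to stay consistent with the Perl used elsewhere on this page, here is a hedged Perl sketch of the same idea. The source and backup directories are assumptions, "today" is computed from local midnight, and killing all of your own processes will also terminate your login shell (and this script), so treat this as an outline rather than something to run as-is:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use File::Copy  qw(copy);
    use File::Path  qw(make_path);
    use POSIX       qw(strftime);
    use Time::Local qw(timelocal);

    # Hypothetical locations; adjust both to your environment.
    my $src    = "$ENV{HOME}/work";
    my $backup = "$ENV{HOME}/backup/" . strftime('%Y-%m-%d', localtime);
    make_path($backup);

    # Local midnight, so "created today" means modified since 00:00.
    my @now      = localtime;
    my $midnight = timelocal(0, 0, 0, @now[3, 4, 5]);

    opendir my $dh, $src or die "cannot open $src: $!";
    while (my $f = readdir $dh) {
        my $path = "$src/$f";
        next unless -f $path && (stat $path)[9] >= $midnight;
        copy($path, "$backup/$f") or warn "copy $f failed: $!";
    }
    closedir $dh;

    # Signal each of this user's processes except the script itself.
    for my $pid (`ps -u $ENV{USER} -o pid=`) {
        $pid =~ s/\s+//g;
        next if !$pid || $pid == $$;
        kill 'TERM', $pid;
    }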

7. UNIX for Advanced & Expert Users

Please Help Me Guys

Dear all, I have a pattern which looks like this: 2 20080226_18:02:09.749 ISC-Libya Egypt-Cairo2 111 IAM 2913258040 218927157966 b 61 REL f 143 RLC :COMMA:NCI=10,FCI=6101,CPC=0A,TMR=00,OFI=80,USI: :COMMB:: :RELCAUSE:15: 2 20080226_18:02:11.629 ISC-Libya Egypt-Cairo2 170 IAM 93572641... (8 Replies)
Discussion started by: zanetti321

8. Shell Programming and Scripting

Please help me guys...

Hi all! I need to write a script which reads a file and tries to insert those values into the DB... File format: var1 var2 var3 var4 var5 var6. Now I want to read from the above file and try to insert like... insert into table1 values( var1, var2, var3 ); in a loop to... (2 Replies) (see the sketch below)
Discussion started by: games_icon
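A sketch of that loop using Perl's DBI with bind placeholders, which sidesteps quoting problems when the values come from a file. The DSN, credentials, file name, and three-column table are all assumptions, since the thread is truncated:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # Hypothetical connection details -- replace with your real DSN.
    my $dbh = DBI->connect('dbi:mysql:dbname=test', 'user', 'password',
                           { RaiseError => 1, AutoCommit => 0 });

    # Prepare once, execute per line; placeholders handle quoting.
    my $sth = $dbh->prepare('INSERT INTO table1 VALUES (?, ?, ?)');

    open my $fh, '<', 'values.txt' or die "cannot open values.txt: $!";
    while (my $line = <$fh>) {
        my ($var1, $var2, $var3) = split ' ', $line;
        next unless defined $var3;    # skip short or blank lines
        $sth->execute($var1, $var2, $var3);
    }
    close $fh;

    $dbh->commit;
    $dbh->disconnect;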

9. Cybersecurity

Hey guys

Hey guys, new geek here; sorry, I didn't see an intro section. But I do have a question and hope to make my stay here permanent. I am interested in IT security, and I really want to learn. I was hoping that for whatever questions I had, you guys could lead me through the narrowest path with a broad selection... (1 Reply)
Discussion started by: abeja

10. Shell Programming and Scripting

Help me guys

How do I print only the first line for each repeated pair of two fields? I have files like:
USA|Tony|12:25:22:431
USA|John|14:22:42:981
USA|John|08:22:12:349
France|Adam|14:22:42:981
Italy|Tony|18:22:42:212
Italy|Tony|04:22:42:212
Italy|Tony|08:22:42:212
to make output like: ... (9 Replies) (see the sketch below)
Discussion started by: teefa
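The task reduces to printing a line only the first time its first two pipe-delimited fields are seen together. A small Perl sketch consistent with the sample data above:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Keep the first occurrence of each (field1, field2) pair,
    # e.g. the first "Italy|Tony" line wins; later ones are dropped.
    my %seen;
    while (my $line = <>) {
        my ($country, $name) = (split /\|/, $line)[0, 1];
        print $line unless $seen{"$country|$name"}++;
    }

The same idea fits in a one-liner: perl -F'\|' -ane 'print unless $seen{"$F[0]|$F[1]"}++' file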
WWW::RobotRules(3)					User Contributed Perl Documentation					WWW::RobotRules(3)

NAME
    WWW::RobotRules - database of robots.txt-derived permissions

SYNOPSIS
     use WWW::RobotRules;
     my $rules = WWW::RobotRules->new('MOMspider/1.0');

     use LWP::Simple qw(get);

     {
       my $url = "http://some.place/robots.txt";
       my $robots_txt = get $url;
       $rules->parse($url, $robots_txt) if defined $robots_txt;
     }

     {
       my $url = "http://some.other.place/robots.txt";
       my $robots_txt = get $url;
       $rules->parse($url, $robots_txt) if defined $robots_txt;
     }

     # Now we can check if a URL is valid for those servers
     # whose "robots.txt" files we've gotten and parsed:
     if($rules->allowed($url)) {
         $c = get $url;
         ...
     }

DESCRIPTION
    This module parses /robots.txt files as specified in "A Standard for
    Robot Exclusion", at <http://www.robotstxt.org/wc/norobots.html>.
    Webmasters can use the /robots.txt file to forbid conforming robots
    from accessing parts of their web site.

    The parsed files are kept in a WWW::RobotRules object, and this object
    provides methods to check if access to a given URL is prohibited. The
    same WWW::RobotRules object can be used for one or more parsed
    /robots.txt files on any number of hosts.

    The following methods are provided:

    $rules = WWW::RobotRules->new($robot_name)
        This is the constructor for WWW::RobotRules objects. The first
        argument given to new() is the name of the robot.

    $rules->parse($robot_txt_url, $content, $fresh_until)
        The parse() method takes as arguments the URL that was used to
        retrieve the /robots.txt file, and the contents of the file.

    $rules->allowed($uri)
        Returns TRUE if this robot is allowed to retrieve this URL.

    $rules->agent([$name])
        Get/set the agent name. NOTE: Changing the agent name will clear
        the robots.txt rules and expire times out of the cache.

ROBOTS.TXT
    The format and semantics of the "/robots.txt" file are as follows
    (this is an edited abstract of
    <http://www.robotstxt.org/wc/norobots.html>):

    The file consists of one or more records separated by one or more
    blank lines. Each record contains lines of the form

      <field-name>: <value>

    The field name is case insensitive. Text after the '#' character on a
    line is ignored during parsing. This is used for comments. The
    following <field-names> can be used:

    User-Agent
        The value of this field is the name of the robot the record is
        describing access policy for. If more than one User-Agent field is
        present the record describes an identical access policy for more
        than one robot. At least one field needs to be present per record.
        If the value is '*', the record describes the default access
        policy for any robot that has not matched any of the other
        records.

        The User-Agent fields must occur before the Disallow fields. If a
        record contains a User-Agent field after a Disallow field, that
        constitutes a malformed record. This parser will assume that a
        blank line should have been placed before that User-Agent field,
        and will break the record into two. All the fields before the
        User-Agent field will constitute a record, and the User-Agent
        field will be the first field in a new record.

    Disallow
        The value of this field specifies a partial URL that is not to be
        visited. This can be a full path, or a partial path; any URL that
        starts with this value will not be retrieved.

    Unrecognized records are ignored.

ROBOTS.TXT EXAMPLES
    The following example "/robots.txt" file specifies that no robots
    should visit any URL starting with "/cyberworld/map/" or "/tmp/":

      User-agent: *
      Disallow: /cyberworld/map/ # This is an infinite virtual URL space
      Disallow: /tmp/ # these will soon disappear

    This example "/robots.txt" file specifies that no robots should visit
    any URL starting with "/cyberworld/map/", except the robot called
    "cybermapper":

      User-agent: *
      Disallow: /cyberworld/map/ # This is an infinite virtual URL space

      # Cybermapper knows where to go.
      User-agent: cybermapper
      Disallow:

    This example indicates that no robots should visit this site further:

      # go away
      User-agent: *
      Disallow: /

    This is an example of a malformed robots.txt file.

      # robots.txt for ancientcastle.example.com
      # I've locked myself away.
      User-agent: *
      Disallow: /
      # The castle is your home now, so you can go anywhere you like.
      User-agent: Belle
      Disallow: /west-wing/ # except the west wing!
      # It's good to be the Prince...
      User-agent: Beast
      Disallow:

    This file is missing the required blank lines between records.
    However, the intention is clear.
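    To make the method list above concrete, here is a minimal,
    self-contained sketch that feeds parse() a robots.txt held in a
    string (so it runs offline) and then queries allowed(). The robot
    name, host, and paths are invented for illustration:

      #!/usr/bin/perl
      use strict;
      use warnings;
      use WWW::RobotRules;

      # Hypothetical robot name, for illustration only.
      my $rules = WWW::RobotRules->new('ExampleBot/1.0');

      my $robots_url = 'http://www.example.com/robots.txt';
      my $robots_txt = <<'EOT';
      User-agent: *
      Disallow: /cyberworld/map/   # infinite virtual URL space
      Disallow: /tmp/
      EOT

      # parse() takes the URL the file came from and its contents.
      $rules->parse($robots_url, $robots_txt);

      # allowed() checks a full URL against the rules for its host.
      for my $url ('http://www.example.com/index.html',
                   'http://www.example.com/tmp/scratch.txt') {
          printf "%s => %s\n", $url,
              ($rules->allowed($url) ? 'allowed' : 'disallowed');
      }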
SEE ALSO
    LWP::RobotUA, WWW::RobotRules::AnyDBM_File
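    Of the SEE ALSO modules, LWP::RobotUA wraps this rule handling
    inside a normal user agent, fetching and honoring each host's
    /robots.txt automatically. A hedged sketch, assuming the module's
    positional constructor (agent name, then contact address); the URL
    is hypothetical:

      use strict;
      use warnings;
      use LWP::RobotUA;

      # Agent name and contact address are required; both made up here.
      my $ua = LWP::RobotUA->new('ExampleBot/1.0', 'webmaster@example.com');
      $ua->delay(1);    # wait at least 1 minute between requests per host

      # robots.txt fetching, parsing and checking happen automatically.
      my $response = $ua->get('http://www.example.com/');
      print $response->is_success ? $response->decoded_content
                                  : $response->status_line, "\n";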
COPYRIGHT
    Copyright 1995-2009, Gisle Aas
    Copyright 1995, Martijn Koster

    This library is free software; you can redistribute it and/or modify
    it under the same terms as Perl itself.

perl v5.18.2                      2012-02-18                WWW::RobotRules(3)