Help needed for writing a script - Post 302149282 by matrixmadhan on Wednesday 5th of December 2007 01:09:21 PM
What is this?

Quote:
This site seems to have a policy of frowning on people who post 'homework'. So, don't expect much help from anyone.
Please read the rules of the forum!

https://www.unix.com/unix-dummies-que...om-forums.html
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

need help writing a script

Hello everyone. Well, I will get right to the point. I am new to Perl and trying to learn it as much as I can. I have been assigned the task of writing a perl script to extract information from firewall logs. Like I said, I am new to Perl and I am having a tough time because I think what I am... (3 Replies)
Discussion started by: tarballed
3 Replies

2. Shell Programming and Scripting

Help needed in writing awk script for xml source

Hi, I am not able to work out an approach for converting an XML file to a flat file using awk. Can anyone help me out? The input XML is like this:
<outer>
<field1>one</field1>
<field2>two</field2>
<field3>three<Error Code=777 Description=12345/></field3>
<field4>four</field4>
</outer>
... (2 Replies) (A hedged awk sketch follows this item.)
Discussion started by: naren_0101bits
2 Replies
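
For the XML-to-flat-file question above, here is a minimal awk sketch. The input file name (input.xml) and the pipe delimiter are assumptions, not from the thread; it simply strips the tags from each <fieldN> line and prints one delimited record per <outer> block.

    awk '
    /<outer>/   { n = 0; next }                   # start of a record
    /<\/outer>/ {                                 # end of a record: print it
        out = ""
        for (i = 1; i <= n; i++)
            out = out (i > 1 ? "|" : "") val[i]
        print out
        next
    }
    /<field/ {
        line = $0
        gsub(/<[^>]*>/, "", line)                 # drop every tag, keep the text
        val[++n] = line
    }
    ' input.xml

On the sample above this prints one|two|three|four; nested tags such as the <Error .../> element are removed along with the field tags.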

3. Shell Programming and Scripting

needed help in writing a script!

Hi all, I need help writing a script for the following question. Please help. Non-recursive shell script that accepts any number of arguments and prints them in reverse order. (For example, if the script is named rargs, then executing rargs A B C should produce C B A on... (5 Replies) (A sketch follows this item.)
Discussion started by: wrapster
5 Replies
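
A minimal non-recursive sketch for the rargs example above: build the reversed list in a single variable and print it once at the end.

    #!/bin/sh
    # rargs: print the arguments in reverse order, without recursion.
    out=""
    for arg in "$@"; do
        out="$arg $out"          # prepend, so the last argument ends up first
    done
    printf '%s\n' "${out% }"     # strip the trailing space added by the loop

Running ./rargs A B C prints C B A.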

4. Shell Programming and Scripting

help needed in writing a script

When I run a command the output is something like this:
#3136 0.872914 01/17/08 22:06:36
#24817 1.231532 01/18/08 05:00:44
#15371 1.291679 01/18/08 03:00:08
#21279 2.130480 01/18/08 04:03:16
#7835 27.892056 01/18/08 00:01:32
I need to check if any value in the second column is... (5 Replies) (An awk sketch follows this item.)
Discussion started by: cybersandex
5 Replies
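
For the column check above, an awk filter is usually enough. The threshold of 10 and the placeholder some_command are assumptions, since the post is cut off before it names the actual limit.

    some_command | awk -v limit=10 '
    $2 > limit { print "value " $2 " on line " NR " exceeds " limit; bad = 1 }
    END        { exit bad }      # non-zero exit status if any line was over the limit
    '

With the sample output above and a limit of 10, the 27.892056 line would be reported and the pipeline would exit 1.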

5. UNIX for Dummies Questions & Answers

Need help writing this script

Here is the script I am trying to write, along with the answer I wrote. Please help me understand why it doesn't work. Create an executable script file called "newname" that will perform the following: 1. Rename a file upon the user's request. If the file exists, prompt the user for... (1 Reply) (A sketch follows this item.)
Discussion started by: wiggles
1 Replies
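
Only the first requirement survives the truncation above, so this newname sketch covers just that part; anything beyond it is deliberately left out rather than guessed at.

    #!/bin/sh
    # newname: rename a file at the user's request.
    printf 'File to rename: '
    read -r old
    if [ ! -e "$old" ]; then
        echo "$old does not exist" >&2    # nothing to rename
        exit 1
    fi
    printf 'New name: '
    read -r new
    mv -- "$old" "$new"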

6. Shell Programming and Scripting

Help me in writing the script

Hi, I have written a script which converts a given hexadecimal value to a binary value in Perl. But now, the problem is that I should read every bit of it (if it's 10101010, I should read the value in each position, and if the value in that position is 1 I should print a string and should exit if it's... (1 Reply) (A sketch follows this item.)
Discussion started by: prakashreddy
1 Replies
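
The thread asks for Perl, but to stay consistent with the shell examples in this list, here is the same bit-walking idea expressed in bash arithmetic; the input value and the printed message are placeholders, not taken from the thread.

    #!/bin/bash
    hex=AA                            # example input: 10101010 in binary
    value=$((16#$hex))                # convert the hex string to an integer
    for (( i = 7; i >= 0; i-- )); do  # walk the bits from high to low
        if (( (value >> i) & 1 )); then
            echo "bit $i is set"      # placeholder for the string the poster wants printed
        fi
    done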

7. Shell Programming and Scripting

Please help me in writing my script

Hello all, I have a script used to search for strings in a set of 5 similar pattern files in the log dir. So here it goes. The input parameter is a part of the file name. During script execution, the script should match the input parameter to the original files with the same... (0 Replies)
Discussion started by: baraghun
0 Replies

8. Shell Programming and Scripting

Help in writing script

I need some help doing some actions on files in a library. I want to get the n last files, and print to the screen their name, date, and how many times a specific string appears in each file. How can I do this?... (6 Replies) (A sketch follows this item.)
Discussion started by: eee
6 Replies
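
A rough sketch for the request above. The positional parameters (directory, how many recent files, the string to count) are assumptions; note that grep -c counts matching lines rather than total occurrences, and that filenames containing spaces are not handled.

    #!/bin/sh
    dir=$1 count=$2 pattern=$3
    for name in $(ls -t "$dir" | head -n "$count"); do       # newest files first
        file=$dir/$name
        mtime=$(ls -ld "$file" | awk '{print $6, $7, $8}')    # date fields from ls -l
        hits=$(grep -c "$pattern" "$file")                    # matching lines in the file
        printf '%s\t%s\t%s\n' "$name" "$mtime" "$hits"
    done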

9. Shell Programming and Scripting

Help needed in writing a menu driven script

Hi, I was wondering if someone could help me write a shell script in Linux that backs up/restores data to anywhere I choose, but it needs to be menu driven? Thanks, I'm new to Linux/Unix but liking it so far... just hoping to get to grips with the scripts! :) (7 Replies) (A sketch follows this item.)
Discussion started by: Nicole
7 Replies
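
A minimal menu-driven sketch for the backup/restore request. The paths are prompted for rather than hard-coded, since the thread gives no layout, and tar is only one assumption about the desired backup format.

    #!/bin/sh
    while true; do
        echo "1) Backup  2) Restore  3) Quit"
        printf 'Choice: '
        read -r choice
        case $choice in
            1)  printf 'Directory to back up: '; read -r src
                printf 'Archive to create: ';    read -r dest
                tar -czf "$dest" "$src" ;;
            2)  printf 'Archive to restore: ';   read -r src
                printf 'Restore into: ';         read -r dest
                tar -xzf "$src" -C "$dest" ;;
            3)  break ;;
            *)  echo "Unknown choice" ;;
        esac
    done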

10. HP-UX

Please help me writing this script

I work on a production server. I have to check one folder named "spool" and delete files under it which are more than 5 minutes old. I do it manually with two commands:
touch -t YYMMDDHHMMSS /tmp/timeinfo
find /spool ! -newer /tmp/timeinfo -exec rm -rf {} \;
I want to... (4 Replies) (A one-command alternative is sketched after this item.)
Discussion started by: manalisharmabe
4 Replies
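
If the find on that box supports -mmin (an assumption; classic HP-UX find does not have it), the touch/find pair above collapses to a single command:

    find /spool -type f -mmin +5 -exec rm -f {} \;

The -type f test keeps the command from removing the spool directory itself, which the original -exec rm -rf could do; if -mmin is unavailable, the original two-command approach remains the portable fallback.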
WWW::RobotRules(3)          User Contributed Perl Documentation          WWW::RobotRules(3)

NAME
    WWW::RobotRules - database of robots.txt-derived permissions

SYNOPSIS
     use WWW::RobotRules;
     my $rules = WWW::RobotRules->new('MOMspider/1.0');

     use LWP::Simple qw(get);

     {
       my $url = "http://some.place/robots.txt";
       my $robots_txt = get $url;
       $rules->parse($url, $robots_txt) if defined $robots_txt;
     }

     {
       my $url = "http://some.other.place/robots.txt";
       my $robots_txt = get $url;
       $rules->parse($url, $robots_txt) if defined $robots_txt;
     }

     # Now we can check if a URL is valid for those servers
     # whose "robots.txt" files we've gotten and parsed:
     if ($rules->allowed($url)) {
         $c = get $url;
         ...
     }

DESCRIPTION
    This module parses /robots.txt files as specified in "A Standard for Robot Exclusion",
    at <http://www.robotstxt.org/wc/norobots.html>.

    Webmasters can use the /robots.txt file to forbid conforming robots from accessing
    parts of their web site.

    The parsed files are kept in a WWW::RobotRules object, and this object provides methods
    to check if access to a given URL is prohibited. The same WWW::RobotRules object can be
    used for one or more parsed /robots.txt files on any number of hosts.

    The following methods are provided:

    $rules = WWW::RobotRules->new($robot_name)
        This is the constructor for WWW::RobotRules objects. The first argument given to
        new() is the name of the robot.

    $rules->parse($robot_txt_url, $content, $fresh_until)
        The parse() method takes as arguments the URL that was used to retrieve the
        /robots.txt file, and the contents of the file.

    $rules->allowed($uri)
        Returns TRUE if this robot is allowed to retrieve this URL.

    $rules->agent([$name])
        Get/set the agent name. NOTE: Changing the agent name will clear the robots.txt
        rules and expire times out of the cache.

ROBOTS.TXT
    The format and semantics of the "/robots.txt" file are as follows (this is an edited
    abstract of <http://www.robotstxt.org/wc/norobots.html>):

    The file consists of one or more records separated by one or more blank lines. Each
    record contains lines of the form

      <field-name>: <value>

    The field name is case insensitive. Text after the '#' character on a line is ignored
    during parsing. This is used for comments. The following <field-names> can be used:

    User-Agent
        The value of this field is the name of the robot the record is describing access
        policy for. If more than one User-Agent field is present the record describes an
        identical access policy for more than one robot. At least one field needs to be
        present per record. If the value is '*', the record describes the default access
        policy for any robot that has not matched any of the other records.

        The User-Agent fields must occur before the Disallow fields. If a record contains a
        User-Agent field after a Disallow field, that constitutes a malformed record. This
        parser will assume that a blank line should have been placed before that User-Agent
        field, and will break the record into two. All the fields before the User-Agent
        field will constitute a record, and the User-Agent field will be the first field in
        a new record.

    Disallow
        The value of this field specifies a partial URL that is not to be visited. This can
        be a full path, or a partial path; any URL that starts with this value will not be
        retrieved.

    Unrecognized records are ignored.

ROBOTS.TXT EXAMPLES
    The following example "/robots.txt" file specifies that no robots should visit any URL
    starting with "/cyberworld/map/" or "/tmp/":

      User-agent: *
      Disallow: /cyberworld/map/ # This is an infinite virtual URL space
      Disallow: /tmp/            # these will soon disappear

    This example "/robots.txt" file specifies that no robots should visit any URL starting
    with "/cyberworld/map/", except the robot called "cybermapper":

      User-agent: *
      Disallow: /cyberworld/map/ # This is an infinite virtual URL space

      # Cybermapper knows where to go.
      User-agent: cybermapper
      Disallow:

    This example indicates that no robots should visit this site further:

      # go away
      User-agent: *
      Disallow: /

    This is an example of a malformed robots.txt file:

      # robots.txt for ancientcastle.example.com
      # I've locked myself away.
      User-agent: *
      Disallow: /
      # The castle is your home now, so you can go anywhere you like.
      User-agent: Belle
      Disallow: /west-wing/ # except the west wing!
      # It's good to be the Prince...
      User-agent: Beast
      Disallow:

    This file is missing the required blank lines between records. However, the intention
    is clear.

SEE ALSO
    LWP::RobotUA, WWW::RobotRules::AnyDBM_File

COPYRIGHT
      Copyright 1995-2009, Gisle Aas
      Copyright 1995, Martijn Koster

    This library is free software; you can redistribute it and/or modify it under the same
    terms as Perl itself.

perl v5.16.2                              2012-02-18                      WWW::RobotRules(3)