10-31-2017
Bumping up posts or double posting is not permitted in these forums.
Please read the
rules, which you agreed to when you registered, if you have not already done so.
You may receive an infraction for this. If so, don't worry; just try to follow the rules more carefully. The infraction will expire in the near future.
Thank You.
The UNIX and Linux Forums.
10 More Discussions You Might Find Interesting
1. UNIX for Dummies Questions & Answers
We currently pull files off a mainframe via FTP and save them as text files on our server; this is done by a script. The text file then gets zipped (using ZIP). This all works fine, but it doesn't appear that ZIP (the free version) has any way to password... (2 Replies)
Discussion started by: dsimpg1
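For what it's worth, Info-ZIP's zip can take a password from a script via its -P flag (classic zip encryption, which is weak by modern standards). A minimal Perl sketch, assuming zip is on the PATH; the file name and password here are hypothetical:

    #!/usr/bin/env perl
    # Hypothetical sketch: password-protect the FTP'd text file with Info-ZIP's zip.
    use strict;
    use warnings;

    my $password = 'secret';                  # hypothetical password
    my $textfile = 'mainframe_extract.txt';   # hypothetical file name

    # -P supplies the password non-interactively (-e would prompt instead)
    system('zip', '-P', $password, "$textfile.zip", $textfile) == 0
        or die "zip failed: $?\n";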
2. UNIX for Dummies Questions & Answers
Hi,
I am trying to print from an HP-UX machine to a Toshiba printer that is password protected. How can I print?
Thanks.
Anuj (1 Reply)
Discussion started by: Anuj
3. AIX
Can it be done? I've read in a few places that the crypt program no longer exists on AIX... if it's doable, please tell me how. (2 Replies)
Discussion started by: rdudejr
4. Shell Programming and Scripting
Hi All,
I want to make my script password protected,
i.e., if somebody runs my script, it should prompt for a password.
Can somebody help me accomplish this?
Thanks in Advance :b: (11 Replies)
Discussion started by: achararun
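A minimal sketch of such a prompt in Perl, assuming a Unix tty (stty is used to hide the typed password); the expected password is hypothetical, and anyone who can read the script can of course read it too:

    #!/usr/bin/env perl
    # Hypothetical sketch: refuse to run unless the correct password is entered.
    use strict;
    use warnings;

    my $expected = 'secret';        # hypothetical; hard-coding a password is insecure

    $| = 1;                         # flush the prompt immediately
    print 'Password: ';
    system('stty', '-echo');        # stop echoing keystrokes
    chomp(my $entered = <STDIN>);
    system('stty', 'echo');         # restore echo
    print "\n";

    die "Wrong password\n" unless $entered eq $expected;
    # ... the protected part of the script continues here ...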
5. UNIX for Advanced & Expert Users
I need to convert a password-protected Excel file residing on a UNIX server to a comma-separated file. For this I need to open the Excel file on the UNIX box, but the box doesn't prompt for a password; instead the file opens in its encrypted form.
I could manually FTP the Excel file to a local... (2 Replies)
Discussion started by: Devivish
6. Shell Programming and Scripting
Hi Friends,
I need your help once again.
I have 77 “password protected” WinZip files on a Linux/UNIX server. I want to decrypt them with an automated script. The password for every file is the same: mhd*tt.
Please help me.
Usually I unzip them manually, one by one, as follows:
unzip <file name> ... (6 Replies)
Discussion started by: anushree.a
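A minimal sketch of the loop being asked for, assuming Info-ZIP's unzip is on the PATH (its -P flag supplies the password without prompting) and using the shared password quoted in the post:

    #!/usr/bin/env perl
    # Hypothetical sketch: extract every .zip in the current directory with one shared password.
    use strict;
    use warnings;

    my $password = 'mhd*tt';        # the shared password from the post
    for my $zip (glob '*.zip') {
        # -P passes the password on the command line, so no prompt appears
        system('unzip', '-P', $password, $zip) == 0
            or warn "unzip failed for $zip: $?\n";
    }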
7. Shell Programming and Scripting
Hi,
I want to password-protect the Excel files present in my UNIX directory. I have an application that will invoke a shell script with a parameter passed to it. The purpose of the shell script would be to pop up a box for entering a password. This password will be validated against a database entry... (1 Reply)
Discussion started by: anil029
8. Shell Programming and Scripting
I want to give my long scripts to a customer. The customer must not be able to read the scripts even if he has the password. The following command locks and unlocks the script, but the set +x is simply ignored.
The code:
read -p 'Script: ' S && C=$S.crypt H='eval "$((dd if=$0 bs=1 skip=//|gpg... (7 Replies)
Discussion started by: frad
9. Shell Programming and Scripting
Hi Gurus,
I need to encrypt the DB passwords that are stored in a configuration file (.txt), as below:
stage_db_pwd=ABC
this is test line
content_db_pwd=123def
This is test line 2
stg_db_name=xyz
I want to encrypt all the password fields (identified by "pwd") in the same... (3 Replies)
Discussion started by: ashishpanchal85
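One possible approach, sketched under the assumption that OpenSSL 1.1.1+ is installed and that a passphrase is supplied in a CFG_KEY environment variable (the variable name and the *pwd* key pattern are assumptions, not from the post); each matching value is replaced by a base64-encoded AES-256-CBC ciphertext:

    #!/usr/bin/env perl
    # Hypothetical sketch: encrypt the value of every *pwd* line in a config file.
    use strict;
    use warnings;

    die "set the CFG_KEY environment variable first\n" unless defined $ENV{CFG_KEY};

    while (my $line = <>) {
        if ($line =~ /^(\w*pwd\w*)=(.*)$/) {
            my ($key, $val) = ($1, $2);
            local $ENV{PLAIN} = $val;   # hand the value to the child via the environment
            my $enc = qx{printf '%s' "\$PLAIN" | openssl enc -aes-256-cbc -pbkdf2 -a -A -pass env:CFG_KEY};
            chomp $enc;
            print "$key=$enc\n";        # e.g. stage_db_pwd=<base64 ciphertext>
        } else {
            print $line;                # non-password lines pass through unchanged
        }
    }

Run as, say, perl encrypt_pwds.pl config.txt > config.enc (file names hypothetical); decryption would use openssl enc -d with the same passphrase.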
10. Shell Programming and Scripting
All,
I have a requirement to send a password-protected Excel file in an email from a UNIX/Linux box without zipping it. Any help would be appreciated.
Thanks. (8 Replies)
Discussion started by: Durgesh Gupta
LEARN ABOUT REDHAT
WWW::RobotRules
WWW::RobotRules(3) User Contributed Perl Documentation WWW::RobotRules(3)
NAME
WWW::RobotRules - Parse robots.txt files
SYNOPSIS
    require WWW::RobotRules;
    my $robotsrules = WWW::RobotRules->new('MOMspider/1.0');

    use LWP::Simple qw(get);

    $url = "http://some.place/robots.txt";
    my $robots_txt = get $url;
    $robotsrules->parse($url, $robots_txt);

    $url = "http://some.other.place/robots.txt";
    $robots_txt = get $url;
    $robotsrules->parse($url, $robots_txt);

    # Now we are able to check if a URL is valid for those servers that
    # we have obtained and parsed "robots.txt" files for.
    if ($robotsrules->allowed($url)) {
        $c = get $url;
        ...
    }
DESCRIPTION
This module parses /robots.txt files as specified in "A Standard for Robot Exclusion", described at
<http://info.webcrawler.com/mak/projects/robots/norobots.html>. Webmasters can use the /robots.txt file to forbid conforming robots access
to parts of their web site.
The parsed file is kept in the WWW::RobotRules object, and this object provides methods to check if access to a given URL is prohibited.
The same WWW::RobotRules object can parse multiple /robots.txt files.
The following methods are provided:
$rules = WWW::RobotRules->new($robot_name)
This is the constructor for WWW::RobotRules objects. The first argument given to new() is the name of the robot.
$rules->parse($robot_txt_url, $content, $fresh_until)
The parse() method takes as arguments the URL that was used to retrieve the /robots.txt file, and the contents of the file.
$rules->allowed($uri)
Returns TRUE if this robot is allowed to retrieve this URL.
$rules->agent([$name])
Get/set the agent name. NOTE: Changing the agent name will clear the robots.txt rules and expire times out of the cache.
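A short example of the methods working together (the robot name and host below are placeholders, not part of the API):

    use strict;
    use warnings;
    use WWW::RobotRules;
    use LWP::Simple qw(get);

    my $rules = WWW::RobotRules->new('MyBot/1.0');       # hypothetical robot name

    my $robots_url = 'http://example.com/robots.txt';    # hypothetical host
    my $robots_txt = get($robots_url);
    $rules->parse($robots_url, $robots_txt) if defined $robots_txt;

    print "agent: ", $rules->agent, "\n";                # query the current agent name
    print "ok to fetch\n" if $rules->allowed('http://example.com/some/page.html');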
ROBOTS.TXT
The format and semantics of the "/robots.txt" file are as follows (this is an edited abstract of
<http://info.webcrawler.com/mak/projects/robots/norobots.html>):
The file consists of one or more records separated by one or more blank lines. Each record contains lines of the form
<field-name>: <value>
The field name is case insensitive. Text after the '#' character on a line is ignored during parsing. This is used for comments. The
following <field-names> can be used:
User-Agent
The value of this field is the name of the robot the record describes the access policy for. If more than one User-Agent field is
present, the record describes an identical access policy for more than one robot. At least one field needs to be present per record. If
the value is '*', the record describes the default access policy for any robot that has not matched any of the other records.
Disallow
The value of this field specifies a partial URL that is not to be visited. This can be a full path or a partial path; any URL that
starts with this value will not be retrieved.
ROBOTS.TXT EXAMPLES
The following example "/robots.txt" file specifies that no robots should visit any URL starting with "/cyberworld/map/" or "/tmp/":
User-agent: *
Disallow: /cyberworld/map/ # This is an infinite virtual URL space
Disallow: /tmp/ # these will soon disappear
This example "/robots.txt" file specifies that no robots should visit any URL starting with "/cyberworld/map/", except the robot called
"cybermapper":
User-agent: *
Disallow: /cyberworld/map/ # This is an infinite virtual URL space
# Cybermapper knows where to go.
User-agent: cybermapper
Disallow:
This example indicates that no robots should visit this site further:
# go away
User-agent: *
Disallow: /
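To check the second example programmatically, its record can be fed straight to the module (the host below is a placeholder):

    use strict;
    use warnings;
    use WWW::RobotRules;

    my $rules = WWW::RobotRules->new('cybermapper');
    my $robots_txt = join "\n",
        'User-agent: *',
        'Disallow: /cyberworld/map/',
        '',
        'User-agent: cybermapper',
        'Disallow:',
        '';
    $rules->parse('http://example.com/robots.txt', $robots_txt);

    # cybermapper's own record has an empty Disallow, so access is allowed
    print $rules->allowed('http://example.com/cyberworld/map/here')
        ? "allowed\n" : "denied\n";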
SEE ALSO
LWP::RobotUA, WWW::RobotRules::AnyDBM_File
libwww-perl-5.65 2001-04-20 WWW::RobotRules(3)