08-25-2017
Seriously: DON'T register another account if one of your
threads is blocked / closed for rule violations. If your request is homework / classwork, follow the rules and post as required in your other thread.
If it is not homework, explain the background of your request, e.g. the company or project you work for. If you don't, this thread will be closed as well.
10 More Discussions You Might Find Interesting
1. Shell Programming and Scripting
I'm trying to create a program that performs a variety of duties. One of the duties is deleting a user if the user name exists in the /etc/passwd file.
How do I make that happen? Those of you who know shell programming, please tell me what I should do after the shell reads... (4 Replies)
Discussion started by: TRUEST
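A minimal sketch of one way to do this, assuming root privileges and the userdel(8) utility; the user variable is illustrative:

#!/bin/sh
# Remove a user account if the name appears in /etc/passwd.
user="$1"
if grep -q "^${user}:" /etc/passwd; then
    userdel -r "$user"    # -r also removes the user's home directory
    echo "User $user deleted."
else
    echo "User $user not found in /etc/passwd." >&2
fi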
2. IP Networking
Hi, I have a bit of C code which I'm trying to use as a relay between Apache and an SCGI cluster.
An example of the problem code is below:

while ((n = recv(scgiSock, local_data, MAX_LENGTH, 0)) > 0)
{
    time(&t2);
    time_now = t2 - t1;          /* seconds elapsed since t1 */
    if (time_now > TIMEOUT)
    ... (2 Replies)
Discussion started by: fishman2001
3. Shell Programming and Scripting
Hi,
This may be useful to anyone who wants to check for files in a directory:

#chmod 777 /cronacle/tools/teradata/opo/bin/file_check.sh
SUBJECT="File Not Found"
SUBJECT1="File Found"
#RECIPIENT=Madhu.Reddy@ge.com
cd /cronacle/tools/teradata/opo/bin
file_list=attach.sh
if
... (3 Replies)
Discussion started by: ksmbabu
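A fuller sketch of the same idea, assuming the mailx utility is available; the recipient address is a placeholder:

#!/bin/sh
# Check whether a file exists in a directory and notify by mail.
RECIPIENT=user@example.com
dir=/cronacle/tools/teradata/opo/bin
file_list=attach.sh
if [ -f "$dir/$file_list" ]; then
    echo "$file_list is present in $dir" | mailx -s "File Found" "$RECIPIENT"
else
    echo "$file_list is missing from $dir" | mailx -s "File Not Found" "$RECIPIENT"
fi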
4. Shell Programming and Scripting
How can I check if a file exists in a shell script? Basically, I want to check whether a file Test_msgs has been created today. If it has been, append data to it; otherwise, create it. I have written the following but it does not work.
todaysdate=$(date +%d%m%Y)
timenow=$(date +%H%M%S)... (4 Replies)
Discussion started by: gugs
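A hedged sketch of the logic being asked for, assuming GNU date (whose -r option prints a file's modification date, used here as a stand-in for creation time):

#!/bin/sh
# Append to Test_msgs if it was last modified today; otherwise start it fresh.
todaysdate=$(date +%d%m%Y)
msgfile=Test_msgs
if [ -f "$msgfile" ] && [ "$(date -r "$msgfile" +%d%m%Y)" = "$todaysdate" ]; then
    echo "new data" >> "$msgfile"    # file exists and is from today: append
else
    echo "new data" > "$msgfile"     # missing or stale: create/overwrite
fi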
5. Shell Programming and Scripting
Hi All,
Please help me in solving this assignment!
I need a Unix script that checks whether a text file exists in all directories and subdirectories; if the text file exists, display the directory path, else display that it does not exist.
Example: kamal.txt is the file that I want to search for; if the... (5 Replies)
Discussion started by: G.K.K
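A minimal sketch using find(1); the starting directory is an assumption and kamal.txt is the name from the post:

#!/bin/sh
# Search every directory and subdirectory under $startdir for kamal.txt.
startdir=.
found=$(find "$startdir" -type f -name kamal.txt 2>/dev/null)
if [ -n "$found" ]; then
    echo "$found"                    # each matching path, one per line
else
    echo "kamal.txt does not exist"
fi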
6. Shell Programming and Scripting
How can I check if a file exists in csh? I know there is "-e $file" but I do not know exactly how to use it.
I have tried the below but I'm getting "Bad : modifier in $ ( ).":

foreach f ($AfullnameLst)
    if (-e $f) then
        echo "$f: file exists"
    endif
end (6 Replies)
Discussion started by: kristinu
7. Shell Programming and Scripting
Hi All,
What is the difference between -f and -e?
Regards,
ch33ry (1 Reply)
Discussion started by: ch33ry
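In POSIX shells, -e tests whether a path exists at all, while -f tests whether it exists and is a regular file. A quick illustration:

#!/bin/sh
# /tmp exists but is a directory, so -e succeeds while -f fails.
[ -e /tmp ] && echo "-e: /tmp exists"
[ -f /tmp ] || echo "-f: /tmp is not a regular file"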
8. Shell Programming and Scripting
Hey, I am new to scripting and was wondering what is wrong with this if statement. I want to check if a file exists and, if it does, unzip it. I programmed it as follows:
if ; then
gunzip *_filename.gz
fi
Thanks in advance!
Please use code tags next time for your code and data. (10 Replies)
Discussion started by: mostarac2487
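The test itself was evidently stripped by the forum (hence the moderator's note about code tags). A hedged sketch of one working form; note that a plain [ -f *_filename.gz ] breaks when the glob matches more than one file:

#!/bin/sh
# Unzip *_filename.gz only if at least one matching file exists;
# ls exits non-zero when the glob matches nothing.
if ls ./*_filename.gz > /dev/null 2>&1; then
    gunzip ./*_filename.gz
fi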
9. Shell Programming and Scripting
Hi All,
I am facing a problem while checking for the existence of a file over ssh!
Basically, I want to ssh and check if a file exists: if the file exists, return 1; if the file does not exist, return 0 (or any value).
I am using the below code:
file_avail=`ssh username@host "if ]; then exit 1;... (10 Replies)
Discussion started by: galaxy_rocky
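A minimal sketch of the usual approach: let the remote test's exit status travel back through ssh. The host name and file path are placeholders:

#!/bin/sh
# ssh returns the exit status of the remote command, so test -f
# can be checked directly on the local side.
if ssh username@host "test -f /path/to/file"; then
    file_avail=1    # file exists on the remote host
else
    file_avail=0
fi
echo "file_avail=$file_avail"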
10. Shell Programming and Scripting
In several scripts that process files matched by a name pattern, I needed to add a check for file existence. Just to illustrate, let's say I need to process all N??? files:
/tmp$ touch N100 N101
/tmp$ l ?10
-rw-rw-r-- 1 moss group 0 Apr 19 11:22 N100
-rw-rw-r-- 1 moss group ... (10 Replies)
Discussion started by: migurus
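A common sketch for testing whether a glob matches anything before processing, relying on the shell rather than parsing ls output; the N??? pattern is taken from the post:

#!/bin/sh
# An unmatched glob stays literal in POSIX sh, so testing the first
# expansion result tells us whether any N??? file actually exists.
set -- N???
if [ -e "$1" ]; then
    for f in N???; do
        echo "processing $f"    # placeholder for the real work
    done
else
    echo "no N??? files found" >&2
fi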
LEARN ABOUT REDHAT
WWW::RobotRules(3) User Contributed Perl Documentation WWW::RobotRules(3)
NAME
WWW::RobotRules - Parse robots.txt files
SYNOPSIS
require WWW::RobotRules;
my $robotsrules = WWW::RobotRules->new('MOMspider/1.0');

use LWP::Simple qw(get);

my $url = "http://some.place/robots.txt";
my $robots_txt = get $url;
$robotsrules->parse($url, $robots_txt);

$url = "http://some.other.place/robots.txt";
$robots_txt = get $url;
$robotsrules->parse($url, $robots_txt);

# Now we are able to check if a URL is valid for those servers that
# we have obtained and parsed "robots.txt" files for.
if ($robotsrules->allowed($url)) {
    $c = get $url;
    ...
}
DESCRIPTION
This module parses a /robots.txt file as specified in "A Standard for Robot Exclusion", described at
<http://info.webcrawler.com/mak/projects/robots/norobots.html>. Webmasters can use the /robots.txt file to disallow conforming robots
access to parts of their web site.
The parsed file is kept in the WWW::RobotRules object, and this object provides methods to check if access to a given URL is prohibited.
The same WWW::RobotRules object can parse multiple /robots.txt files.
The following methods are provided:
$rules = WWW::RobotRules->new($robot_name)
This is the constructor for WWW::RobotRules objects. The first argument given to new() is the name of the robot.
$rules->parse($robot_txt_url, $content, $fresh_until)
The parse() method takes as arguments the URL that was used to retrieve the /robots.txt file, and the contents of the file.
$rules->allowed($uri)
Returns TRUE if this robot is allowed to retrieve this URL.
$rules->agent([$name])
Get/set the agent name. NOTE: Changing the agent name will clear the robots.txt rules and expire times out of the cache.
ROBOTS.TXT
The format and semantics of the "/robots.txt" file are as follows (this is an edited abstract of
<http://info.webcrawler.com/mak/projects/robots/norobots.html>):
The file consists of one or more records separated by one or more blank lines. Each record contains lines of the form
<field-name>: <value>
The field name is case insensitive. Text after the '#' character on a line is ignored during parsing. This is used for comments. The
following <field-names> can be used:
User-Agent
The value of this field is the name of the robot the record is describing access policy for. If more than one User-Agent field is
present the record describes an identical access policy for more than one robot. At least one field needs to be present per record. If
the value is '*', the record describes the default access policy for any robot that has not matched any of the other records.
Disallow
The value of this field specifies a partial URL that is not to be visited. This can be a full path or a partial path; any URL that
starts with this value will not be retrieved.
ROBOTS.TXT EXAMPLES
The following example "/robots.txt" file specifies that no robots should visit any URL starting with "/cyberworld/map/" or "/tmp/":
User-agent: *
Disallow: /cyberworld/map/ # This is an infinite virtual URL space
Disallow: /tmp/ # these will soon disappear
This example "/robots.txt" file specifies that no robots should visit any URL starting with "/cyberworld/map/", except the robot called
"cybermapper":
User-agent: *
Disallow: /cyberworld/map/ # This is an infinite virtual URL space
# Cybermapper knows where to go.
User-agent: cybermapper
Disallow:
This example indicates that no robots should visit this site further:
# go away
User-agent: *
Disallow: /
SEE ALSO
LWP::RobotUA, WWW::RobotRules::AnyDBM_File
libwww-perl-5.65 2001-04-20 WWW::RobotRules(3)