07-07-2006
Before you start moving stuff around, check for files that may be causing the space issue, such as log files, core files, and tar files.
See the "no space" article on aplawrence.com.
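A rough sketch of how you might hunt down the usual suspects before moving anything; the paths and the size threshold are only illustrative, and option spellings can differ slightly between Unix variants:
# Largest top-level directories, staying on the root filesystem
du -xsk /* 2>/dev/null | sort -rn | head -20
# Files larger than ~100 MB on the root filesystem (size given in 512-byte blocks for portability)
find / -xdev -type f -size +204800 -exec ls -l {} \; 2>/dev/null
# Common culprits: old logs, core dumps, stray tarballs
ls -l /var/log /tmp /core* 2>/dev/null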
10 More Discussions You Might Find Interesting
1. Shell Programming and Scripting
Hello,
I am trying to monitor disk space for each node (mount point) on the machine. I am able to get all the individual nodes except for the '/' node. For example:
df -k:
bash-2.05b# df -k
Filesystem 1K-blocks Used Available Use% Mounted on
/dev/xxx 4127108 2415340 1502120 62% /
/dev/yyy ... (3 Replies)
Discussion started by: chiru_h
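A minimal sketch of pulling the usage figure for every filesystem, '/' included, out of df -k with awk; note that df can wrap long device names onto a second line, which breaks naive column parsing, so -P (POSIX single-line output) is used here:
# Mount point and use% for every filesystem
df -Pk | awk 'NR > 1 { gsub(/%/, "", $5); print $6, $5 }'
# Just the root filesystem, selected by its mount point rather than its device name
df -Pk / | awk 'NR > 1 { print $5 }'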
2. UNIX for Dummies Questions & Answers
Hello everyone -
Please forgive me if I violate the forum's etiquette, as this is my very first post. I'm posting this in both the dummies and the advanced sections in the hope of getting any responses.
I stumbled on this forum while frantically looking for an answer to a dumb, ignorant thing I did... (2 Replies)
Discussion started by: kevindoman
3. UNIX for Advanced & Expert Users
Hello everyone -
Please forgive me if I violate the forum's etiquette, as this is my very first post. I'm posting this in both the dummies and the advanced sections in the hope of getting any responses.
I stumbled on this forum while frantically looking for an answer to a dumb, ignorant thing I did... (5 Replies)
Discussion started by: kevindoman
4. Linux
We are intending to protect a set of user-specified files using LVM mirroring, where the protected space on which the user files are stored is mirrored on an LV on a different disk. Our problem is that a user with a custom layout has installed Linux with 2 partitions for swap and / and there is... (0 Replies)
Discussion started by: kickdgrass
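A rough sketch of adding a mirror leg to an existing logical volume with LVM; vg0, protected and /dev/sdb1 are hypothetical names, a spare disk or partition is assumed, and this does not cover the harder case in the post where the only partitions are swap and /:
pvcreate /dev/sdb1              # hypothetical second disk/partition for the mirror leg
vgextend vg0 /dev/sdb1          # add it to the volume group holding the LV to protect
lvconvert -m 1 vg0/protected    # convert the LV to a two-way mirror
lvs -a -o +devices              # verify both legs and watch the Cpy%Sync column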
5. Linux
Hi OS Experts
I would like to increase the root partition using space from another partition so that I can save more documents in Home and Desktop. Is this possible without formatting the root partition? If so, please explain.
Here is the output of df -h:
Filesystem Size Used Avail Use% Mounted on
/dev/sda9... (8 Replies)
Discussion started by: Akshay Hegde
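A rough sketch of the online-grow case, which only applies when / sits on LVM with free space in its volume group; in the df output above / looks like a plain partition (/dev/sda9), and a plain partition generally cannot be enlarged in place without repartitioning from rescue media. The volume names below are hypothetical:
vgs                              # check for free extents in the volume group
lvextend -L +5G /dev/vg0/root    # hypothetical VG/LV names
resize2fs /dev/vg0/root          # grow an ext3/ext4 filesystem while it is mounted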
6. Shell Programming and Scripting
Hi,
Please guide me on how to add the nodev option for the /dev/shm partition.
I am new to scripting and am looking to do this via the command line.
Thanks
Litu (13 Replies)
Discussion started by: Litu1988
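A minimal sketch of the usual approach: add the option to the tmpfs line in /etc/fstab so it survives reboots, then remount to apply it immediately. Back up fstab first and keep whatever other options your system already uses:
# /etc/fstab line (edit the existing tmpfs entry rather than adding a duplicate):
#   tmpfs   /dev/shm   tmpfs   defaults,nodev   0 0
mount -o remount,nodev /dev/shm    # apply without a reboot
mount | grep /dev/shm              # verify that nodev now appears in the options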
7. UNIX for Advanced & Expert Users
Hello,
I have rebooted the RHEL VM, but after rebooting it is not showing all the partitions mounted at the OS level. If I execute the fdisk -l command, I am able to see the same disk. Below is the fdisk output:
# fdisk -l
Disk /dev/sda: 107.6 GB, 107639996416 bytes
255 heads, 63... (1 Reply)
Discussion started by: purushottamaher
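A rough sketch of the first things worth checking when fdisk -l still sees the disk but the filesystems do not come up after a reboot; these are stock RHEL commands and assume the mounts were meant to be in /etc/fstab:
cat /etc/fstab       # are the missing filesystems listed, with the right devices or UUIDs?
blkid                # compare against the UUIDs/labels the partitions actually carry
mount -a             # try to mount everything in fstab and note any errors
dmesg | tail -50     # look for device or filesystem errors from this boot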
8. AIX
Hi all,
I have about 5-6 daemons specific to my application running in the background. I am trying to write a script to stop them. Usually, I run them as a non-root ID, which is fine. But for some reason the client insists on using root.
I do have sudo.
I just tried something like this
... (4 Replies)
Discussion started by: jeffs42885
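A rough sketch of stopping a known list of daemons from a script via sudo; the daemon names are hypothetical placeholders, and ps -eo pid,comm works on AIX as well as Linux:
#!/bin/sh
# Hypothetical daemon names -- replace with the application's real processes
for d in appd1 appd2 appd3; do
    pids=$(ps -eo pid,comm | awk -v name="$d" '$2 == name { print $1 }')
    if [ -n "$pids" ]; then
        sudo kill $pids && echo "stopped $d ($pids)"   # SIGTERM first; escalate only if needed
    else
        echo "$d is not running"
    fi
done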
9. Red Hat
For instance, the root partition is full, so I don't need to know about /ABC/XYZ when /ABC/XYZ is a separate mount point. (But /ABC isn't.)
Can I run a du command or similar and just look at the contents affecting the space on that mount point (/)? (2 Replies)
Discussion started by: psychocandy
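A minimal sketch using GNU du's -x (one file system), which keeps the scan on the filesystem you start in and skips separate mount points such as /ABC/XYZ while still descending into ordinary directories like /ABC:
# Per-directory totals for the root filesystem only, largest first
du -xk --max-depth=1 / 2>/dev/null | sort -rn | head -20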
10. UNIX for Beginners Questions & Answers
Hello everyone,
I am having an issue here with CentOS release 6.6 (Final): it shows all of the space used up, but I can't tell where the space went.
Seemingly I am using up 100%, according to
df -h
Filesystem Size Used Avail Use% Mounted on... (27 Replies)
Discussion started by: DannyBoyCentOS
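A rough sketch of the usual checks on CentOS 6 when df -h says the disk is full but du cannot account for it; lsof assumes the package is installed, and /dev/sda1 below is a hypothetical device name:
# What du can actually see on the root filesystem
du -xsk /* 2>/dev/null | sort -rn | head -20
# Deleted-but-still-open files, a classic cause of df and du disagreeing
# (for example a rotated log that a daemon never reopened)
lsof +L1 2>/dev/null | head -20
# Blocks reserved for root (ext3/ext4 reserve 5% by default)
tune2fs -l /dev/sda1 | grep -i 'reserved block count'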
LEARN ABOUT REDHAT
www::robotrules
WWW::RobotRules(3) User Contributed Perl Documentation WWW::RobotRules(3)
NAME
WWW::RobotRules - Parse robots.txt files
SYNOPSIS
require WWW::RobotRules;
my $robotsrules = new WWW::RobotRules 'MOMspider/1.0';
use LWP::Simple qw(get);
# Fetch and parse the robots.txt of each server we plan to visit
my $url = "http://some.place/robots.txt";
my $robots_txt = get $url;
$robotsrules->parse($url, $robots_txt);
$url = "http://some.other.place/robots.txt";
$robots_txt = get $url;
$robotsrules->parse($url, $robots_txt);
# Now we are able to check if a URL is valid for those servers that
# we have obtained and parsed "robots.txt" files for.
if ($robotsrules->allowed($url)) {
    my $c = get $url;
    ...
}
DESCRIPTION
This module parses a /robots.txt file as specified in "A Standard for Robot Exclusion", described at
<http://info.webcrawler.com/mak/projects/robots/norobots.html>. Webmasters can use the /robots.txt file to disallow conforming robots
access to parts of their web site.
The parsed file is kept in the WWW::RobotRules object, and this object provides methods to check if access to a given URL is prohibited.
The same WWW::RobotRules object can parse multiple /robots.txt files.
The following methods are provided:
$rules = WWW::RobotRules->new($robot_name)
This is the constructor for WWW::RobotRules objects. The first argument given to new() is the name of the robot.
$rules->parse($robot_txt_url, $content, $fresh_until)
The parse() method takes as arguments the URL that was used to retrieve the /robots.txt file, and the contents of the file.
$rules->allowed($uri)
Returns TRUE if this robot is allowed to retrieve this URL.
$rules->agent([$name])
Get/set the agent name. NOTE: Changing the agent name will clear the robots.txt rules and expire times out of the cache.
ROBOTS.TXT
The format and semantics of the "/robots.txt" file are as follows (this is an edited abstract of
<http://info.webcrawler.com/mak/projects/robots/norobots.html>):
The file consists of one or more records separated by one or more blank lines. Each record contains lines of the form
<field-name>: <value>
The field name is case insensitive. Text after the '#' character on a line is ignored during parsing. This is used for comments. The
following <field-names> can be used:
User-Agent
The value of this field is the name of the robot the record is describing access policy for. If more than one User-Agent field is
present, the record describes an identical access policy for more than one robot. At least one field needs to be present per record. If
the value is '*', the record describes the default access policy for any robot that has not matched any of the other records.
Disallow
The value of this field specifies a partial URL that is not to be visited. This can be a full path or a partial path; any URL that
starts with this value will not be retrieved.
ROBOTS.TXT EXAMPLES
The following example "/robots.txt" file specifies that no robots should visit any URL starting with "/cyberworld/map/" or "/tmp/":
User-agent: *
Disallow: /cyberworld/map/ # This is an infinite virtual URL space
Disallow: /tmp/ # these will soon disappear
This example "/robots.txt" file specifies that no robots should visit any URL starting with "/cyberworld/map/", except the robot called
"cybermapper":
User-agent: *
Disallow: /cyberworld/map/ # This is an infinite virtual URL space
# Cybermapper knows where to go.
User-agent: cybermapper
Disallow:
This example indicates that no robots should visit this site further:
# go away
User-agent: *
Disallow: /
SEE ALSO
LWP::RobotUA, WWW::RobotRules::AnyDBM_File
libwww-perl-5.65 2001-04-20 WWW::RobotRules(3)