HP-UX: /var partition full, need help
Post 302602112 by Peasant on Sunday, 26 February 2012, 03:29:24 AM
Looks like a classic "so many services, so little disk" scenario.

You should check whether you can purge the Data Protector (DP) database (carefully, and with a plan).
See if you actually need files like 'OldServerTrace.txt'.
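
Before purging anything, it helps to see exactly what is eating the space. A minimal sketch using stock HP-UX tools (the size threshold is an arbitrary example):

    # Summarise disk usage per directory under /var, largest first
    # (-x stays on the /var filesystem, -k reports kilobytes)
    du -kx /var | sort -rn | head -20

    # List individual files over roughly 50 MB on the /var filesystem
    # (-size counts 512-byte blocks, so 102400 blocks = 50 MB)
    find /var -xdev -type f -size +102400 -exec ls -l {} \;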

I have a similar situation on an inherited DP cell manager, with a grossly oversized DP database and no window to do a proper purge (constant backups, tape replications).

To fix this situation properly, you need some downtime and a good plan.

As an emergency measure, I would present some extra disk space (if possible), mount it, and symlink the most heavily used part of /var onto it, as sketched below.
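
A minimal sketch of that emergency relocation, assuming LVM with VxFS. The volume group vg01, the 4 GB size, and the directory /var/opt/omni/db40 are placeholders; substitute your own volume group and whichever directory is actually consuming /var:

    # Create and mount a spill-over filesystem (names/sizes are examples)
    lvcreate -L 4096 -n lvol_varspill /dev/vg01   # 4 GB logical volume
    newfs -F vxfs /dev/vg01/rlvol_varspill        # VxFS on the raw device
    mkdir /var_spill
    mount /dev/vg01/lvol_varspill /var_spill

    # Stop whatever writes to the directory first, then relocate it
    cp -Rp /var/opt/omni/db40 /var_spill/db40
    mv /var/opt/omni/db40 /var/opt/omni/db40.old  # keep until verified
    ln -s /var_spill/db40 /var/opt/omni/db40
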
Also, implement some kind of log rotation (logrotate is available from the HP-UX Porting Centre):
Porting And Archive Centre For HP-UX
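
If you do install logrotate, here is a minimal sketch of a rotation stanza; the path is the stock HP-UX syslog location, and the weekly/8-generation policy is an arbitrary example:

    # /etc/logrotate.d/syslog -- rotate the HP-UX syslog weekly, keep 8
    /var/adm/syslog/syslog.log {
        weekly
        rotate 8
        compress
        missingok
        notifempty
        # syslogd keeps the file open, so copy-then-truncate instead of rename
        copytruncate
    }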
 

10 More Discussions You Might Find Interesting

1. HP-UX

i-node full on /var

Can anyone tell me how I would troubleshoot when /var becomes full of inodes? This is on an HP-UX 11.11 system, where used is 92%, ifree is 1891, and iuse is 88%. Thanks. (3 Replies)
Discussion started by: catwomen

2. UNIX for Dummies Questions & Answers

Full Partition?

Hi everyone, I think I've filled up one of the partitions on my drive. I suspect that one of the applications I've been running has been spitting out junk files to this partition, most of which can be deleted. The problem is that I have no idea how to look at what's on that partition and... (2 Replies)
Discussion started by: Choppy

3. AIX

/var 100% full

What should I do if the /var filesystem in AIX is completely full? (2 Replies)
Discussion started by: kkhan

4. Filesystems, Disks and Memory

partition out /var

Hi. Suppose you were the systems administrator of a mail server that services approximately 3,000 users: 2,000 users access their email via a POP-3 service, while the remaining 1,000 users access their email via a Unix mail reader. Recently users have complained about the speed of disk access, so a new 10... (1 Reply)
Discussion started by: semaphore

5. BSD

Moving /var partition to USB stick

I am currently running DesktopBSD as a live-CD and need to have a large /var partition because it is currently too small. I have a USB stick which is BSD formatted, and would like to have the /var partition moved over to it. How can this be done? Could I for instance use a symlink? (1 Reply)
Discussion started by: figaro

6. AIX

/var filesystem is full

Hi, Is there a way to clear the temp files from /var/tmp? Is root access required to delete the files? Thanks, Narayan (2 Replies)
Discussion started by: narayanv

7. AIX

/var/spool/squeue gets full frequently

Hi, I'm new to AIX administration. Months ago I received mail every time a cron job was executed, but now I don't receive these mails, and /var/spool/squeue fills up frequently. I'd like to know more about this; what can I do? sendmail is up, because I executed ps -ef | grep... (5 Replies)
Discussion started by: fdeivis

8. Solaris

Install with /var in separate partition - Zfs / 10

This is my first time working with ZFS on Solaris 10. I am trying to set up /var in a separate partition from /. During the installation, I came across the ZFS settings where I selected disks 0 and 1 to be mirrored with ZFS. Next was the option to have /var and / on separate datasets. Is... (3 Replies)
Discussion started by: 6L71

9. UNIX for Dummies Questions & Answers

How can partition out /var with these two separate 10 gigabyte disks?

In my company, there is a mail server that services approximately 3,000 users. 2,000 users access their email via a POP-3 service, while the remaining 1,000 users access their email via a Unix mail reader. Recently users have complained about the speed of disk access, so a new 10 gigabyte disk has... (1 Reply)
Discussion started by: lemon_06

10. UNIX for Dummies Questions & Answers

/var/audit full

Hi, I have Solaris 10 (with multiple non-global zones running on it). Its /var is filling up to 100%, and I can see files being added to /var/audit. They are large in number, so even after clearing them, /var keeps filling. In the past 24 hours, 53,000 files were added. I am... (1 Reply)
Discussion started by: solaris_1977