06-08-2010
Configuring ACL on AIX 5.3
Hello All,
Let me get right to my problem.
I have a file with the following description:
==
root:/me01/tia/filetrans # ls -lrt DW_NUM_OF_ROWS_TSP.txt
-rwxrwxr-- 1 tiaoas oinstall 43 Jun 07 17:12 DW_NUM_OF_ROWS_TSP.txt
root:/me01/tia/filetrans # aclget DW_NUM_OF_ROWS_TSP.txt
*
* ACL_type AIXC
*
attributes:
base permissions
    owner(tiaoas):  rwx
    group(oinstall):  rwx
    others:  rwx
extended permissions
    enabled
    permit   rw-    u:tiaadm,g:oinstall
==
I wanted the user 'tiaadm:staff' to be able to run 'chmod 777 DW_NUM_OF_ROWS_TSP.txt', which is why I granted that user 'rw' as shown in the aclget output above. My real goal, though, is for 'tiaadm:staff' to be able to chmod any file beginning with 'DW', so setting an ACL on each individual file wasn't really the right option.
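For reference, this is roughly how I applied that per-file extended permission (a minimal sketch; the scratch-file path /tmp/dw.acl is just an example):
==
# dump the current AIXC ACL to a scratch file
aclget DW_NUM_OF_ROWS_TSP.txt > /tmp/dw.acl
# edit /tmp/dw.acl: set "extended permissions" to "enabled" and add
#     permit   rw-    u:tiaadm,g:oinstall
# then write the edited ACL back to the file
aclput -i /tmp/dw.acl DW_NUM_OF_ROWS_TSP.txt
==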
Googling around, I found a suggestion to set the ACL on the parent directory holding the files that start with 'DW', so this is what I did:
==
aclget DW_NUM_OF_ROWS_TSP.txt | aclput -R /me01/tia/filetrans
==
This didn't solve it either: newly created files starting with 'DW' under the ~/filetrans directory still didn't let user 'tiaadm:staff' run 'chmod 777' on them.
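An easy way to see the problem (the file name here is just for illustration) is to create a fresh file and look at its ACL; the extended 'permit' entry for tiaadm was simply missing, since aclput -R copies AIXC ACLs onto existing files but AIXC ACLs are never inherited by new ones:
==
# create a new file and check whether it picked up the directory's ACL
touch /me01/tia/filetrans/DW_TEST.txt
aclget /me01/tia/filetrans/DW_TEST.txt
==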
Later, another forum suggested switching the filesystem to the version 2 extended attribute format (ea=v2), which supports NFS4 ACLs and their inheritance under directories, so this is what I did:
==
root:/ # chfs -a ea=v2 /me01
root:/me01/tia # aclconvert -t NFS4 filetrans
root:/me01/tia # aclget filetrans
*
* ACL_type NFS4
*
*
* Owner: tiaesg
* Group: oinstall
*
s:(OWNER@):     a       rwpRWxDaAdcCs
s:(OWNER@):     d       o
s:(GROUP@):     a       rwpRWxDadcs
s:(GROUP@):     d       Co
s:(EVERYONE@):  a       rwpRWxDadcs
s:(EVERYONE@):  d       Co
u:tiaadm:       a       rwpxo           fidi
g:oinstall:     a       rwpxo           fidi
==
As I understand it, the 'fidi' flags on the last two entries mean file-inherit and directory-inherit, so new files under filetrans should pick up the tiaadm ACE automatically. Yet even with all of the above in place, this is what I see:
==
tiaadm:/me01/tia/scripts% chmod 777 /me01/tia/filetrans/DW*
chmod: /me01/tia/filetrans//DW_NUM_OF_ROWS_TSP.txt: Operation not permitted.
==
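My only remaining guess is that the tiaadm entry needs more than 'rwpxo': a mode change presumably also needs the 'A' (write attributes) and/or 'C' (write ACL) bits. Something like this is what I plan to try next (the 'rwpxACo' mask is just my guess):
==
# guess: add 'A' (write attributes) and 'C' (write ACL) to the tiaadm ACE
aclget filetrans > /tmp/ft.acl
# in /tmp/ft.acl, change the tiaadm line to:
#     u:tiaadm:       a       rwpxACo         fidi
aclput -i /tmp/ft.acl filetrans
==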
Beyond that, I'm completely lost. Does AIX even allow a non-owner to chmod a file via ACL entries, or is chmod restricted to the file owner and root regardless? Could someone kindly help me get this sorted?
Cheers,
Souvik