05-17-2011
Ahh .. a simple modification and it's working fine..
.. I kept $0 after a[$1],b[$1]
Thank you both for your codes.. they work perfectly..
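For later readers: the thread doesn't show the final script, but this is the shape of a two-file awk merge where $0 is printed ahead of the stored lookup value. The original apparently used two lookup arrays (a and b) over three files; the sketch below is the one-array version of the same pattern, and the file names and sample data are assumptions, not from the thread:

```shell
# Build two small sample files keyed on column 1 (illustrative data only).
printf '1 aa\n2 bb\n' > file_a
printf '1 xx\n2 yy\n' > file_b

# First pass (NR==FNR) stores file_a's second column under its key;
# second pass prints the whole current record ($0) followed by the
# stored value -- i.e. $0 is kept in front of a[$1], as in the fix above.
awk 'NR==FNR {a[$1]=$2; next} $1 in a {print $0, a[$1]}' file_a file_b
```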
8 More Discussions You Might Find Interesting
1. Shell Programming and Scripting
I have files with a variable number of rows, each row having 2 columns separated by the delimiter "|".
File contains following records for example.
15|69
15|70
15|71
15|72
15|73
15|74
16|2
16|3
16|4
16|5
16|6
16|7
16|8
16|9
16|10
16|11
16|12 (4 Replies)
Discussion started by: Alex_P
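The question above is truncated, but one common task with data shaped like this is to collapse the second-column values under each first-column key. A hedged sketch, assuming that reading (the file name is made up):

```shell
# A subset of the pipe-delimited data from the post.
printf '15|69\n15|70\n15|71\n16|2\n16|3\n' > data.txt

# Accumulate column 2 under each column-1 key, then print one line per
# key; the final numeric sort makes the key order deterministic.
awk -F'|' '{vals[$1] = vals[$1] (vals[$1] ? "," : "") $2}
           END {for (k in vals) print k "|" vals[k]}' data.txt | sort -t'|' -k1,1n
```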
2. Shell Programming and Scripting
Hi everyone,
I once again got stuck with merging tables and was wondering if someone could help me out on that problem.
I have a number of tab delimited tables which I need to merge into one big one. All tables have the same header but a different number of rows (this could be changed if... (6 Replies)
Discussion started by: TuAd
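Assuming the goal is simply to stack tab-delimited tables that share a header, a sketch using the standard FNR/NR idiom (file names and contents are made up for illustration):

```shell
# Two sample tab-delimited tables sharing a header row.
printf 'id\tval\n1\ta\n2\tb\n' > t1.tsv
printf 'id\tval\n3\tc\n' > t2.tsv

# FNR==1 is true at the top of every file, NR==1 only at the very first
# line overall, so this keeps the header from the first file only.
awk 'FNR==1 && NR!=1 {next} {print}' t1.tsv t2.tsv
```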
3. Shell Programming and Scripting
Hi all,
I have a complex (beyond my biological expertise) problem at hand.
I need to merge multiple files into 1 big matrix. Please help me with some code.
Inp1
Ang_0 chr1 98 T A
Ang_0 chr1 352 G A
Ang_0 chr1 425 C T
Ang_0 chr2 ... (1 Reply)
Discussion started by: newbie83
4. Shell Programming and Scripting
Hi,
I have a file (sorted by sort) with 8 tab delimited columns. The first column contains duplicated fields and I need to merge all these identical lines.
My input file:
comp100002 aaa bbb ccc ddd eee fff ggg
comp100003 aba aba aba aba aba aba aba
comp100003 fff fff fff fff fff fff fff... (5 Replies)
Discussion started by: falcox
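A sketch of one way to merge lines that share the first column, assuming the trailing fields should simply be concatenated onto one line (the real data has 8 columns; the sample here is shortened):

```shell
# Shortened sample of the sorted input from the post.
printf 'comp100002 aaa bbb\ncomp100003 aba aba\ncomp100003 fff fff\n' > in.txt

# Blank out the key field so $0 becomes just the trailing fields,
# accumulate per key, then emit one merged line per key.
awk '{key=$1; $1=""; rows[key]=rows[key] $0}
     END {for (k in rows) print k rows[k]}' in.txt | sort
```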
5. Shell Programming and Scripting
Hi all,
In a directory of many files, I need to merge only files which do not have identical lines, and the resultant merged file should not be more than 50000 lines. Basically I need to combine all the text files in that directory into merged .txt files of 50000 lines each
... (2 Replies)
Discussion started by: pravfraz
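Assuming "merge without identical lines" means deduplicating across the files, a sketch combining sort -u with split (the chunk size is 2 here so the tiny sample actually splits; the post wants 50000, and all file names are invented):

```shell
# Sample stand-ins for the directory of text files in the post.
printf 'a\nb\nc\n' > f1.txt
printf 'b\nc\nd\n' > f2.txt

# Deduplicate across all files, then cut the result into fixed-size
# pieces named merged_aa, merged_ab, ... (use -l 50000 in the real case).
sort -u f1.txt f2.txt | split -l 2 - merged_

# List the resulting pieces.
ls merged_*
```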
6. Shell Programming and Scripting
Hi all,
I need some help to remove duplicates from a file before merging.
I have got 2 files:
file1 has data in format
4300 23456
4301 2357
the 4-byte values on the right-hand side are unique and are not repeated anywhere in the file
file 2 has data in same format but is not in... (10 Replies)
Discussion started by: roy121
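Since the right-hand values are unique within file1, one hedged approach is to deduplicate the concatenation on column 2 with the classic awk first-seen idiom (file contents beyond the posted sample are invented):

```shell
# Two files in the "key value" format from the post.
printf '4300 23456\n4301 2357\n' > file1
printf '4301 2357\n4302 999\n' > file2

# Keep each column-2 value the first time it appears; lines from file2
# whose value was already seen in file1 are dropped.
awk '!seen[$2]++' file1 file2
```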
7. UNIX for Dummies Questions & Answers
Hi,
Please excuse me; I have searched the UNIX forum and am unable to find what I expect.
My query is: I have 2 files of the same structure that share 1 similar field/column, and I need to merge the 2 tables/files based on that one matched field/column (that is, field 1),
file 1:... (5 Replies)
Discussion started by: karthikram
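Assuming both files are sorted on the shared first field, join(1) does exactly this kind of field-1 merge; a sketch with made-up data:

```shell
# Two files sharing field 1; join(1) requires both sorted on that field.
printf '1 a\n2 b\n' > f1
printf '1 x\n2 y\n' > f2

# Merge on field 1 of each file; output is the key followed by the
# remaining fields of f1, then those of f2.
join -1 1 -2 1 f1 f2
```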
8. Shell Programming and Scripting
Please help me solve the following. I have access to a Red Hat Linux cluster with 32 GB of RAM.
I have duplicate IDs for variable names; in the file, lines 1 and 2 are duplicates, 3, 4 and 5 are duplicates, and 6 and 7 are duplicates. My objective is to use only the first occurrence of these duplicates.
Lookup... (4 Replies)
Discussion started by: ritakadm
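For keeping only the first occurrence per ID, the streaming awk first-seen idiom is a natural fit on large files, since it holds just one array entry per distinct ID; a sketch with hypothetical data:

```shell
# IDs in column 1; the first and second lines share an ID, as do the
# third and fourth (mirroring the duplicate groups in the post).
printf 'id1 a\nid1 b\nid2 c\nid2 d\n' > lookup.txt

# Print a line only the first time its ID is seen -- a single pass.
awk '!seen[$1]++' lookup.txt
```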
FS_CLEANACL(1) AFS Command Reference FS_CLEANACL(1)
NAME
fs_cleanacl - Remove obsolete entries from an ACL
SYNOPSIS
fs cleanacl [-path <dir/file path>+] [-help]
fs cl [-p <dir/file path>+] [-h]
DESCRIPTION
The fs cleanacl command removes from the access control list (ACL) of each specified directory or file any entry that refers to a user or
group that no longer has a Protection Database entry. Such an entry appears on the ACL as an AFS user ID number (UID) rather than a name,
because without a Protection Database entry, the File Server cannot translate the UID into a name.
Cleaning access control lists in this way not only keeps them from becoming crowded with irrelevant information, but also prevents the new
possessor of a recycled AFS UID from obtaining access intended for the former possessor of the AFS UID. (Note that recycling UIDs is not
recommended in any case.)
OPTIONS
-path <dir/file path>+
Names each directory for which to clean the ACL (specifying a filename cleans its directory's ACL). If this argument is omitted, the
current working directory's ACL is cleaned.
Specify the read/write path to each directory, to avoid the failure that results from attempting to change a read-only volume. By
convention, the read/write path is indicated by placing a period before the cell name at the pathname's second level (for example,
/afs/.abc.com). For further discussion of the concept of read/write and read-only paths through the filespace, see the fs mkmount
reference page.
-help
Prints the online help for this command. All other valid options are ignored.
OUTPUT
If there are no obsolete entries on the ACL, the following message appears:
Access list for <path> is fine.
Otherwise, the output reports the resulting state of the ACL, following the header
Access list for <path> is now
At the same time, the following error message appears for each file in the cleaned directories:
fs: '<filename>': Not a directory
EXAMPLES
The following example illustrates the cleaning of the ACLs on the current working directory and two of its subdirectories. Only the second
subdirectory had obsolete entries on it.
% fs cleanacl -path . ./reports ./sources
Access list for . is fine.
Access list for ./reports is fine.
Access list for ./sources is now
Normal rights:
system:authuser rl
pat rlidwka
PRIVILEGE REQUIRED
The issuer must have the "a" (administer) permission on each directory's ACL (or the ACL of each file's parent directory); the directory's
owner and the members of the system:administrators group have the right implicitly, even if it does not appear on the ACL.
SEE ALSO
fs_listacl(1), fs_mkmount(1)
COPYRIGHT
IBM Corporation 2000. <http://www.ibm.com/> All Rights Reserved.
This documentation is covered by the IBM Public License Version 1.0. It was converted from HTML to POD by software written by Chas
Williams and Russ Allbery, based on work by Alf Wachsmann and Elizabeth Cassell.
OpenAFS 2012-03-26 FS_CLEANACL(1)