Extract duplicate fields in rows
Post 302147888 by summer_cherry on Thursday 29th of November 2007 12:47:17 AM
awk

hi

Code:
awk 'BEGIN{FS=" ;"}    # field separator taken from the original script; adjust to your data
{
	if (temp=="")
	{
		# nothing buffered yet: remember this line and its key field ($2)
		temp=$2
		t_line=$0
	}
	else if (temp==$2)
	{
		# key matches the buffered line: print the pair, then reset
		print t_line
		print $0
		temp=""
		t_line=""
	}
	else
	{
		# different key: this line becomes the new buffered candidate
		temp=$2
		t_line=$0
	}
}' filename
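
This runs line by line and only pairs up adjacent records whose second field matches; after printing a pair it resets, so a key that appears three times in a row prints only the first two occurrences. If matching lines can be far apart, sort the input on the key field first (something like sort -t';' -k2 filename, exact flags depending on your layout). A quick run on invented sample data, since the original poster's file isn't shown:

Code:
$ cat filename
a ;100 ;x
b ;100 ;y
c ;200 ;z
$ sh dup_pairs.sh      # the awk script above saved to a file; the name is invented
a ;100 ;x
b ;100 ;y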

 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

duplicate rows in a file

hi all, can anyone please let me know if there is a way to find duplicate rows in a file? I have a file that has hundreds of numbers (each on its own row). I want to find the numbers that are repeated in the file, e.g. 123434 534 5575 4746767 347624 5575. I want 5575. Please help. (3 Replies)
Discussion started by: infyanurag
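
For a file with one value per line, the classic way to list only the repeated values is sort piped into uniq -d; the file name below is invented and the numbers are taken from the post:

Code:
$ cat numbers.txt
123434
534
5575
4746767
347624
5575
$ sort numbers.txt | uniq -d
5575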

2. Shell Programming and Scripting

How to extract duplicate rows

I have searched the internet for duplicate row extraction. All I have seen is extracting good rows or eliminating duplicate rows. How do I extract duplicate rows from a flat file in UNIX? I'm using Korn shell on HP-UX. For example: FlatFile.txt ======== 123:456:678 123:456:678 123:456:876... (5 Replies)
Discussion started by: bobbygsk
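
If every copy of a duplicated row should be extracted, a two-pass awk does it without sorting. This is a generic sketch, not tested on HP-UX: the first pass counts each exact line, the second pass prints the lines seen more than once.

Code:
$ awk 'NR==FNR {c[$0]++; next} c[$0] > 1' FlatFile.txt FlatFile.txt
123:456:678
123:456:678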

3. HP-UX

How to get Duplicate rows in a file

Hi all, I have written a shell script whose output file contains SQL output. From that file, I want to extract the rows that have multiple entries (duplicate rows). For example, the output file will look like the following. ... (7 Replies)
Discussion started by: raghu.iv85

4. Shell Programming and Scripting

How to extract duplicate rows

Hi! I have a file as below: line1 line2 line2 line3 line3 line3 line4 line4 line4 line4 I would like to extract only the duplicate lines (not the unique, triplicate, or quadruplicate lines). Output will be as below: line2 line2 I would appreciate it if anyone can help. Thanks. (4 Replies)
Discussion started by: chromatin
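
Printing only the lines that occur exactly twice (and printing both copies) is a small variation on the usual counting idiom; a sketch against the sample above, with the file name assumed:

Code:
$ awk '{c[$0]++} END {for (l in c) if (c[l] == 2) {print l; print l}}' file
line2
line2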

5. Shell Programming and Scripting

Find duplicate based on 'n' fields and mark the duplicate as 'D'

Hi, in a file I have to mark duplicate records as 'D' and the latest record alone as 'C'. In the below file, I have to identify whether duplicate records exist based on Man_ID, Man_DT, Ship_ID, and I have to mark the record with the latest Ship_DT as "C" and the others as "D" (I have to create... (7 Replies)
Discussion started by: machomaddy
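
One hedged sketch for this kind of task: sort so the latest Ship_DT comes first within each key, then let awk tag the first record per key as C and the rest as D. Everything about the layout here (pipe-delimited file, key in fields 1-3, Ship_DT in field 5, file name data.txt) is an assumption, since the actual file isn't shown:

Code:
$ sort -t'|' -k1,3 -k5,5r data.txt |
  awk -F'|' '{key = $1 FS $2 FS $3        # assumed key: Man_ID, Man_DT, Ship_ID
              print $0 (seen[key]++ ? "|D" : "|C")}'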

6. Shell Programming and Scripting

Extract fields from different rows.

Hi, I have data like below. SID=D6EB96CC0 HID=9C246D6 CSource=xya Cappe=1 Versionc=3670 MAR1=STL MARS2=STL REQ_BUFFER_ENCODING=UTF-8 REQ_BUFFER_ORIG_ENCODING=UTF-8 RESP_BODY_ENCODING=UTF-8 CON_ID=2713 I want to select CSource=xya (18 Replies)
Discussion started by: chetan.c
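
Pulling one key=value pair out of rows like these takes a single grep or awk call; the file name is invented:

Code:
$ grep '^CSource=' session.txt
CSource=xya
$ awk -F'=' '$1 == "CSource" {print $2}' session.txt
xya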

7. Shell Programming and Scripting

Delete duplicate rows

Hi, This is a follow-up to my earlier post. him mno klm 20 76 . + . klm_mango unix_00000001; alp fdc klm 123 456 . + . klm_mango unix_0000103; her tkr klm 415 439 . + . klm_mango unix_00001043; abc tvr klm 20 76 . + . klm_mango unix_00000001; abc def klm 83 84 . + . klm_mango... (5 Replies)
Discussion started by: jacobs.smith
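
When whole lines repeat verbatim, the standard awk dedupe keeps the first copy of each line and drops the rest; whether the poster's duplicates are byte-for-byte identical is an assumption here, and the file names are invented:

Code:
$ awk '!seen[$0]++' data.txt > deduped.txt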

8. Shell Programming and Scripting

Extract and count number of Duplicate rows

Hi All, I need to extract duplicate rows from a file and write these bad records into another file, and I need a count of these bad records. I have a command awk '{s[$0]++} END { for (i in s) { if (s[i] > 1) { print i } } }' ${TMP_DUPE_RECS}>>${TMP_BAD_DATA_DUPE_RECS}... (5 Replies)
Discussion started by: Arun Mishra

9. Shell Programming and Scripting

Extract duplicate rows with conditions

Gents, can you help please. Input file: 5490921425 1 7 1310342 54909214251 5490921425 2 1 1 54909214252 5491120937 1 1 3 54911209371 5491120937 3 1 1 54911209373 5491320785 1 ... (4 Replies)
Discussion started by: jiam912

10. Shell Programming and Scripting

Extract and exclude rows based on duplicate values

Hello, I have a file like this: > cat examplefile ghi|NN603762|eee mno|NN607265|ttt pqr|NN613879|yyy stu|NN615002|uuu jkl|NN607265|rrr vwx|NN615002|iii yzA|NN618555|ooo def|NN190486|www BCD|NN628717|ppp abc|NN190486|qqq EFG|NN628717|aaa HIJ|NN628717|sss > I can sort the file by... (5 Replies)
Discussion started by: CHoggarth
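
A two-pass awk can separate the rows whose middle field repeats from the rows whose field is unique; the output file names are invented for this sketch:

Code:
$ awk -F'|' 'NR==FNR {c[$2]++; next}
             {print > (c[$2] > 1 ? "dups.txt" : "uniq.txt")}' examplefile examplefile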