Posted by dinplant on 12-11-2002, 07:35:33 PM
Finding duplicate data in a file

A program named LOGGEDON returns output such as:

Ref_num   IP Address    Logged on           User
12000     10.10.12.12   12-02-2002 11:00    john
12004     10.10.12.13   12-03-2002 14:00    mary
12012     10.10.12.14   12-03-2002 11:30    bob
12024     10.10.12.12   12-03-2002 09:00    john
12088     10.10.12.14   12-01-2002 21:00    bob

Another program, REMUSER, terminates a session identified by its "Ref_num".

e.g. REMUSER 12004 would kill mary's session.

Notice that the same IP Address can have more than one logged-on session. Whenever the same "IP Address" occurs more than once, I want to kill the oldest "Logged on" session.

i.e.

REMUSER 12000 <<< kills john's oldest session
REMUSER 12088 <<< kills bob's oldest session
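
One way to spot the duplicate IP addresses (step 2 below) is to extract the address column and let uniq -d report the repeated values. A minimal sketch, assuming the LOGGEDON output has already been captured to a temp file (the name /tmp/loggedon.txt is only illustrative) with its header line removed:

Code:
# print each IP address that occurs on more than one line
awk '{ print $2 }' /tmp/loggedon.txt | sort | uniq -d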

1. The script calls LOGGEDON and redirects its output to a temp file.

2. Each duplicate IP Address is found in the temp file.

3. REMUSER is called with the "Ref_num" of the oldest duplicate entry.
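
Putting that together, here is a rough sketch of such a script. It assumes LOGGEDON prints exactly the columns shown above (Ref_num, IP address, MM-DD-YYYY, HH:MM, user) and that REMUSER takes a single Ref_num argument; if an address appears more than twice, it removes every session except the newest, which matches the example for an address that appears twice.

Code:
#!/bin/sh
# Sketch only -- LOGGEDON and REMUSER are assumed to behave as described above.

TMP=$(mktemp) || exit 1
trap 'rm -f "$TMP"' EXIT

# 1. Capture LOGGEDON output in a temp file, dropping the header line.
LOGGEDON | tail -n +2 > "$TMP"

# 2. Find every IP address that appears more than once.
awk '{ print $2 }' "$TMP" | sort | uniq -d |
while read -r ip
do
    # 3. Sort this IP's sessions chronologically (year, month, day, time),
    #    keep the newest one, and pass every older Ref_num to REMUSER.
    #    The -k positions are character offsets inside the MM-DD-YYYY field;
    #    -b ignores the field's leading blanks when counting them.
    awk -v ip="$ip" '$2 == ip' "$TMP" |
        sort -b -k3.7,3.10n -k3.1,3.2n -k3.4,3.5n -k4,4 |
        sed '$d' |
        awk '{ print $1 }' |
        while read -r ref
        do
            REMUSER "$ref"
        done
done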
 
