Help finding non duplicates
Posted by chipblah84 in Shell Programming and Scripting on 06-01-2011 at 07:42 PM

I am writing a script to find filenames that are listed only once in an input file (i.e., non-duplicates) and then report those single files in another file. Here is the function I have so far:

Code:
function dups_filenames
{
    # file2/file1 hold the two previously read filenames, so the current,
    # previous, and one-before-previous entries can be compared.
    file2=""
    file1=""
    file=""
    dn=""
    ch=""
    pn=""

    # Each line of the file list is: filename  directory  checksum  pathname
    while read file dn ch pn
    do
        # If the previous filename differs from both its neighbours,
        # it occurred only once (the list is sorted by filename), so report it.
        if [[ $file != $file1 && $file1 != $file2 ]]; then
            echo "FILE \t\t\t\t\t CHECKSUM" >> "$dirs"_"$host"_singlefilelog
            echo "---- \t\t\t\t\t --------" >> "$dirs"_"$host"_singlefilelog
            printf "%-40s%-50s\n" $file1 $ch1 >> "$dirs"_"$host"_singlefilelog
            printf "%-20s%-20s\n" "PATH-> "$dn1 >> "$dirs"_"$host"_singlefilelog
            echo >> "$dirs"_"$host"_singlefilelog
        fi

        # Shift the window: remember the current line for the next iteration.
        file2=$file1
        file1=$file
        dn1=$dn
        ch1=$ch
        pn1=$pn
    done < "$dirs"_"$host"_filelists
}

"$dirs"_"$host"_singlefilelog = the output file

The above code does find single occurrences, but it does not report a single file when it is the last entry in the text file ("$dirs"_"$host"_filelists). If I printf "$file" instead of "$file1", the output instead misses the first file in the text file.
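
For comparison, a two-pass awk sketch of the same "report filenames that occur exactly once" idea, which sidesteps the look-behind bookkeeping. The field order (filename, directory, checksum, path) is assumed from the read loop above, and this is untested:

Code:
    # Pass 1 (NR == FNR): count how often each filename (field 1) appears.
    # Pass 2: print only the entries whose filename was seen exactly once.
    awk 'NR == FNR { seen[$1]++; next }
         seen[$1] == 1 { printf "%-40s%-50s\n  PATH-> %s\n\n", $1, $3, $2 }' \
        "$dirs"_"$host"_filelists "$dirs"_"$host"_filelists >> "$dirs"_"$host"_singlefilelog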

Any suggestions?
Thank you
 
