Shell Programming and Scripting
getting rid of duplicate files
Post 92324 by moxxx68 on Friday 9th of December 2005 09:59:47 AM

i have a bad problem with multiple occurrences of the same file in
different directories.. how this happened i am not sure! but I know
that i can use awk to scan multiple directory trees to find
occurrences of the same file... some of these files differ somewhat,
but that does not matter! the names of the files are the same and
the content is basically the same...
i have seen an awk script that can be run on the command line using
a syntax where var=file:r and dup=var++ and var < 1, or something to
that effect, but can not remember exactly how this works...
using the C shell,
i need to find occurrences of var and, if there is more than one,
remove them, leaving a single occurrence...
any examples or clues as to how to piece this together would be
appreciated since i don't use awk that often.
moxxx68
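
A minimal sketch of one way to do what is described above, using find and awk from a POSIX shell rather than csh; the starting directories are placeholders, and the list should be reviewed before anything is deleted.

Code:
find /path/to/tree1 /path/to/tree2 -type f |
awk -F/ '{
    name = $NF              # basename of each file
    if (seen[name]++)       # true from the second occurrence of a name onward
        print                # full path of the duplicate
}' > duplicates.txt
# after reviewing duplicates.txt, the listed files could be removed with e.g.
#   xargs rm -- < duplicates.txt
# (note: this breaks on file names containing spaces or newlines)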
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Renaming multiple files, to get rid of extension

I have a good script to rename multiple files, but what's the best way I can remove some text from multiple filenames? Say I have a directory with 35 files with a .XLS at the end, how can I rename them to remove the .XLS but keep everything the same, without having to mv manually. Thanks. (6 Replies)
Discussion started by: nj78
6 Replies
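
For what it is worth, a minimal POSIX-shell sketch of the rename described above (run in the directory holding the files; the .XLS suffix is taken from the post, and it is worth testing with echo before using mv):

Code:
for f in *.XLS; do
    mv -- "$f" "${f%.XLS}"    # strip the trailing .XLS from each name
done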

2. UNIX for Dummies Questions & Answers

Getting rid of files with no ownership

I am in the process of learning how to do system administration (just on my own Linux machine) and have been working with the find command. One of the things I tried was: find / -nouser -o -nogroup. I redirected the output of my find query into a text file, and when I did a wc -l on it, it... (1 Reply)
Discussion started by: kermit
1 Replies

3. Shell Programming and Scripting

Finding Duplicate files

How do you find and delete duplicate files? (1 Reply)
Discussion started by: Jicom4
1 Replies

4. Shell Programming and Scripting

Getting Rid of Having to Write to Flat Files

Ok, so i've been having to write to flat files lately and then making my script read information from the flat file and then work off of that. i don't want to keep doing that because i believe it creates a mess. i like to keep my work all in one script instead of having that one script... (7 Replies)
Discussion started by: SkySmart
7 Replies

5. Shell Programming and Scripting

Find duplicate files

What utility do you recommend for simply finding all duplicate files among all files? (4 Replies)
Discussion started by: kiasas
4 Replies

6. UNIX for Dummies Questions & Answers

how to get rid of last _ in the files name?

ex: I have a list of files in a folder. abc_def_geh_.txt abc_.txt abc_def_geh_12345_.txt ab134c_d345345ef_444geh_12345_.txt i need to rename all files to get rid of the _ before .txt. the result should look like this: abc_def_geh.txt abc.txt abc_def_geh_12345.txt... (2 Replies)
Discussion started by: lv99
2 Replies
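
A similar small sketch for this rename, assuming a POSIX shell and that every affected name ends in _.txt (test with echo first):

Code:
for f in *_.txt; do
    mv -- "$f" "${f%_.txt}.txt"    # abc_def_geh_.txt -> abc_def_geh.txt
done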

7. Shell Programming and Scripting

Remove duplicate files

Hi, In a directory, e.g. ~/corpus, there are a lot of files and subdirectories. Some of the files are named: 12345___PP___0902___AA.txt 12346___PP___0902___AA.txt 12347___PP___0902___AA.txt The amount of files varies. I need to keep the highest (12347___PP___0902___AA.txt) and remove... (5 Replies)
Discussion started by: corfuitl
5 Replies
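
Assuming the files sort correctly by their leading number and share the common suffix from the example names, a rough sketch (review the first command's output before running the second):

Code:
ls *___PP___0902___AA.txt | sort | sed '$d'               # everything except the highest
ls *___PP___0902___AA.txt | sort | sed '$d' | xargs rm --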

8. Shell Programming and Scripting

Duplicate files

Hi Gents, I have a file as seen below. 44571009 100 42381900 101 23482389 102 44571009 103 28849007 104 28765648 105 25689908 106 28765648 107 42381900 108 44571009 109 17298799 110 44571009 111 I would like to get something like this: 44571009 100 103 109 111 (3 Replies)
Discussion started by: jiam912
3 Replies
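
One way awk can build that grouping, sketched here (the order of keys is not guaranteed, hence the trailing sort; the input file name is a placeholder):

Code:
awk '{ list[$1] = list[$1] " " $2 }
     END { for (k in list) print k list[k] }' file | sort
# prints lines such as: 44571009 100 103 109 111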

9. Shell Programming and Scripting

Trying to get rid of a duplicate output line...

Hi folks, I'm trying to work on a script that will grab a router interface report and generate the numbers of "in use" and "un-used" ports per device. Right now, I've got a cut down of the report as follows: sing /usr/apps/siteName/etc/DCAFT-9K.cmds for send text Connecting using... (11 Replies)
Discussion started by: Marc G
11 Replies

10. Shell Programming and Scripting

Finds all duplicate files

Hi, How would you write a bash script that, given a directory as an argument, finds all duplicate files there (with the same contents, by bytewise comparison) and prints their names? (6 Replies)
Discussion started by: elior
6 Replies
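
A compact sketch of one common approach, using md5sum checksums as a stand-in for a literal byte-for-byte comparison (uniq -w and -D are GNU extensions; any reported matches can be confirmed with cmp):

Code:
#!/bin/sh
# usage: sh finddups.sh /some/directory   (script name is illustrative)
find "${1:-.}" -type f -exec md5sum {} + | sort | uniq -w32 -D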