Shell Programming and Scripting: getting rid of duplicate files
Post 92388 by moxxx68, 12-09-2005 04:58 PM
find ./path -type f -print | awk -F/ 'seen[$NF]++' | xargs mv --target-directory=test


This worked to a certain extent, although I am getting some error
messages from the cp command, and the basename ends up shared by
all files of the same kind (e.g. file.txt-{1,2,3,4}). I am also not
sure whether it really leaves exactly one copy of each file; when I
diffed two test directories against each other the result looked
very close. Please confirm whether any of the syntax I used is
correct (if any?).
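
In case it helps, here is the same idea written out as a small script (just a sketch; ./path and test are the placeholder directories from the one-liner above, and it only compares basenames, not file contents):

#!/bin/sh
# Sketch: move every file whose basename has already been seen into ./test,
# leaving the first occurrence where it is. Compares names only, not contents.
SRC=./path
DEST=./test

mkdir -p "$DEST"

find "$SRC" -type f -print |
awk -F/ 'seen[$NF]++' |
while IFS= read -r file; do
    # note: two duplicates with the same basename will collide in $DEST,
    # so the later one overwrites the earlier one there
    mv -- "$file" "$DEST"/
done

The while/read loop is only there to cope with spaces in file names, which the xargs version above would choke on.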
moxxx68
 
