Usage of find and cp with duplicate


 
# 1  
Old 06-14-2013

Hi All! I am trying to copy all files with the extension .sh into one folder; this is the command I am using:

Code:
find . -name \*.sh -print0 | xargs -I{} -0 cp -v {} Scripts/

The above command works fine, but I have some .sh files with the same base name in different directories, and I want to copy all of the .sh files, including those duplicates. If the base names of 2 or more files are the same, I want to rename them by appending a number before copying them to the destination. For example, with script.sh in directory foo and another script.sh in directory foo1, the destination directory Scripts should contain both scripts, named script_1.sh and script_2.sh.

Note: I am running the above command from my home directory so that all scripts in Desktop, Documents and the other folders get copied to the destination.
Please let me know how to modify the above command.
# 2  
Old 06-14-2013
Hi.

Some versions of cp will do a numbered backup for you:
Code:
#!/usr/bin/env bash

# @(#) s1	Demonstrate cp making "backup" copies of file that exist.

# Utility functions: print-as-echo, print-line-with-visual-space, debug.
# export PATH="/usr/local/bin:/usr/bin:/bin"
pe() { for _i;do printf "%s" "$_i";done; printf "\n"; }
pl() { pe;pe "-----" ;pe "$*"; }
db() { ( printf " db, ";for _i;do printf "%s" "$_i";done;printf "\n" ) >&2 ; }
db() { : ; }
C=$HOME/bin/context && [ -f $C ] && $C cp

FILE=data
N=${1-3}

pl " Input data file $FILE, making $N copies:"
cat $FILE

rm -rf d1
mkdir d1
pl " Initial content of directory d1:"
ls d1

pl " Content after making $N copies:"
for i in $( seq 1 $N )
do
  pe " Making copy $i"
  cp --backup=numbered $FILE d1
done
ls d1

exit 0

producing:
Code:
% ./s1

Environment: LC_ALL = C, LANG = C
(Versions displayed with local utility "version")
OS, ker|rel, machine: Linux, 2.6.26-2-amd64, x86_64
Distribution        : Debian GNU/Linux 5.0.8 (lenny) 
bash GNU bash 3.2.39
cp (GNU coreutils) 6.10

-----
 Input data file data, making 3 copies:
Now is the time.

-----
 Initial content of directory d1:

-----
 Content after making 3 copies:
 Making copy 1
 Making copy 2
 Making copy 3
data  data.~1~	data.~2~

See man cp for details.
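
Applied to your original pipeline, that could look something like this (a sketch assuming GNU cp; the naming differs a little from the script_1.sh / script_2.sh scheme you asked for, since colliding copies are kept as script.sh.~1~, script.sh.~2~, ... while the most recently copied one keeps the plain name):
Code:
# Hypothetical adaptation of the original command: base-name collisions in
# Scripts/ are preserved as numbered backups instead of being overwritten.
find . -name '*.sh' -print0 | xargs -0 -I{} cp -v --backup=numbered {} Scripts/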

Best wishes ... cheers, drl
# 3  
Old 06-14-2013
Could this help you?
Code:
 
# Number each occurrence of a base name (1 for the first, 2, 3, ... for
# later duplicates), then build a cp command from each numbered line.
# Note: paths containing spaces will break the awk '{print $1}' extraction.
find . -name '*.sh' -print | awk -F"/" '
    { if (!a[$NF]) { print $0" "1; a[$NF]++ }
      else         { a[$NF]++; print $0" "a[$NF] } }' | while read line
do
    Count=$(echo "$line" | awk '{print $NF}')
    OrigFilename=$(echo "$line" | awk '{print $1}')

    if [ $Count -gt 1 ]
    then
        bname=$(basename $OrigFilename ".sh")
        Newname=${bname}_${Count}".sh"
        echo "cp $OrigFilename /TAR/DIR/$Newname"    # duplicate: copy under a numbered name
    else
        echo "cp $OrigFilename /TAR/DIR/"            # first occurrence: keep the original name
    fi
done
# The cp commands are only echoed (a dry run); remove "echo" to copy for real.

# 4  
Old 06-14-2013
Quote:
Originally Posted by Akshay Hegde
I am trying to copy all files with extension .sh to one folder
Why flatten the hierarchy? Aside from complicating the code, you lose information.
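
For what it's worth, here is a sketch of a non-flattening copy (assuming GNU cp's --parents and -t options, and that Scripts/ already exists):
Code:
# Recreate each script's relative path under Scripts/ instead of flattening,
# so foo/script.sh and foo1/script.sh never collide; Scripts/ itself is
# pruned so already-copied files are not picked up again.
find . -path ./Scripts -prune -o -name '*.sh' -exec cp -v --parents -t Scripts/ {} +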

Regards,
Alister
# 5  
Old 06-14-2013
@alister

I was a bit scared to run the scripts, but I was able to copy the scripts with Praveen's script, and drl's code creates a backup of any file that already exists.

Last edited by Akshay Hegde; 06-14-2013 at 03:18 PM..
# 6  
Old 06-14-2013
It certainly slows things down to invoke cp once per file instead of a single 'cp many-files dir', so consider telling cp not to overwrite (-n), capturing stderr back on stdout with a wrapper ( ... ) 2>&1 |, and processing that output to do the up-version work only for the files that collided.
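
For instance, a rough sketch along those lines (assuming GNU find and cp, bash, and an existing ./Scripts directory; collisions are detected with cmp rather than by parsing cp's output, and the first copy keeps its original name while later ones get _2, _3, ...):
Code:
#!/usr/bin/env bash
dest=Scripts

# Pass 1: one fast bulk copy, never overwriting (-n), with the destination
# directory pruned from the search.
find . -path "./$dest" -prune -o -name '*.sh' -exec cp -n -t "$dest" {} +

# Pass 2: any source whose content differs from the same-named file already
# in $dest lost out in pass 1; copy it again under a numbered name.
# (Sources byte-for-byte identical to the copied file are simply skipped.)
find . -path "./$dest" -prune -o -name '*.sh' -print0 |
while IFS= read -r -d '' f
do
    base=${f##*/}
    if ! cmp -s "$f" "$dest/$base"; then
        n=2
        while [ -e "$dest/${base%.sh}_$n.sh" ]; do n=$((n + 1)); done
        cp -n "$f" "$dest/${base%.sh}_$n.sh"
    fi
done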