AIX find duplicate backup files
Posted by MadeInGermany on 01-24-2020
I guess you mean the file names?
Then consider sort and uniq to sort out the duplicates.
Show the duplicate file names:
Code:
$ sort your_example | uniq -d
server2_1-23-2020

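If you also want to know how many times each name occurs, uniq -c prefixes every line with a count. This is a small addition to the post; sorting numerically on that count puts the most repeated names first:
Code:
$ sort your_example | uniq -c | sort -rn
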
Show the non-duplicates:
Code:
$ sort your_example | uniq -u
server1_1-20-2020
server1_1-21-2020
server1_1-22-2020
server2_1-20-2020
server2_1-21-2020
server2_1-22-2020

Show each file name once:
Code:
$ sort your_example | uniq
$ sort -u your_example
server1_1-20-2020
server1_1-21-2020
server1_1-22-2020
server2_1-20-2020
server2_1-21-2020
server2_1-22-2020
server2_1-23-2020

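The same approach works if the names are not already collected in a file. A minimal sketch, assuming the backups sit in a hypothetical directory /backup and each backup is a single file named like server_date as in the example above:
Code:
# list the backup file names one per line, then report only the duplicated names
ls /backup | sort | uniq -d

# or, searching subdirectories too, compare only the base names of the files
find /backup -type f | awk -F/ '{print $NF}' | sort | uniq -d
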