I apologize Scrutinizer, I didn't know that.
But it still doesn't work!
It gives me zero results. I am running this in /bin/bash under Cygwin.
I also tried the exact same command in /bin/bash under Ubuntu, same thing - doesn't work.
Tried tcsh under Ubuntu, same thing - doesn't work.
csh, same thing - doesn't work.
I apologize for my rude answer; I really thought the \{ was the culprit and made the expression fail. But it has to be something else.
I have also tried replacing the 1,3 with 1..3 and 1.3 to test the Perl-style notation inside {}, but that doesn't work either.
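One thing worth checking here: brace expansion is done by the shell before the command ever runs, and the two notations behave differently per shell (csh/tcsh and POSIX sh accept {1,3} lists but not {1..3} ranges; bash supports both). A quick sanity check, independent of whatever command the braces were being passed to:

```shell
# Brace expansion happens in the shell, not in find/grep/etc.
echo file{1,3}.txt     # comma list  -> file1.txt file3.txt
echo file{1..3}.txt    # range (bash/zsh only) -> file1.txt file2.txt file3.txt
# Quoting suppresses expansion -- a common cause of "zero results":
echo "file{1,3}.txt"   # -> file{1,3}.txt (literal braces)
```

If the quoted form is what reaches the command, the braces arrive literally and nothing matches.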
I need a Perl script which will create an output file after comparing two different files under a directory path:
/export/home/abc/file1
/export/home/abc/file2
File Format: <IP><TAB><DeviceName><TAB><DESCRIPTIONS>
file1:
10.1.2.1.3<tab>abc123def<tab>xyz.mm1.ppp.... (2 Replies)
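The request above is for Perl, but since the sample data is truncated, here is only a minimal shell sketch of the comparison; keying on the first tab-separated field (the IP) is an assumption, as are the file names:

```shell
# Print every line of file1 whose first tab-separated field (the IP)
# does not appear as the first field of any line in file2.
# First pass (NR==FNR) records file2's keys; second pass filters file1.
awk -F'\t' 'NR==FNR {seen[$1]=1; next} !($1 in seen)' file2 file1 > diff.out
```

Swapping the two file arguments gives the lines unique to file2 instead.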
Hi!
I want to find duplicate files (criteria: file size) in my download folder.
I am trying it like this:
find /Users/frodo/Downloads \! -type d -exec du {} \; | sort > /Users/frodo/Desktop/duplicates_1.txt;
cut -f 1 /Users/frodo/Desktop/duplicates_1.txt | uniq -d | grep -hif -... (9 Replies)
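One complication with the du-based pipeline is that du reports disk blocks, not exact byte sizes, so different files can collapse onto the same number. A sketch that keys on exact sizes instead, assuming GNU find (its -printf is not portable) and that the two-pass awk idiom is acceptable:

```shell
# List "<size><TAB><path>" for every regular file, then print only the
# entries whose size occurs more than once (candidate duplicates; still
# verify the contents with cmp or a checksum before deleting anything).
find /Users/frodo/Downloads -type f -printf '%s\t%p\n' | sort -n > sizes.txt
# Pass 1 counts each size; pass 2 prints lines whose size repeats.
awk -F'\t' 'NR==FNR {count[$1]++; next} count[$1] > 1' sizes.txt sizes.txt
```

Size is only a coarse filter: two files of equal size can still differ, so the cmp/checksum step matters.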
Hi !
I wonder if anyone can help on this : I have a directory: /xyz that has the following files:
chsLog.107.20130603.gz
chsLog.115.20130603
chsLog.111.20130603.gz
chsLog.107.20130603
chsLog.115.20130603.gz
As you can see there are two files that are the same but only with a minor... (10 Replies)
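Since the pattern in the listing is pairs like chsLog.107.20130603 / chsLog.107.20130603.gz, a minimal sketch that reports every .gz file whose uncompressed name also exists (what to do with the pair - delete one, recompress, etc. - is left open, since the question is truncated):

```shell
# For each .gz file, check whether the same name without .gz exists too.
for f in /xyz/*.gz; do
  [ -e "${f%.gz}" ] && echo "both present: ${f%.gz} and $f"
done
```

`${f%.gz}` is POSIX parameter expansion that strips the shortest trailing match of .gz, so this runs in any sh.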
Hi champs,
I have a requirement where I need to compare two files line by line and ignore duplicates. Note, I have the files in sorted order.
I have tried using the comm command, but it's not working for my scenario.
Input file1
srv1..development..employee..empname,empid,empdesg... (1 Reply)
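comm does handle sorted input, but it silently misbehaves when the files were sorted under a different collation than the one comm runs with; forcing LC_ALL=C for both the sort and the comparison usually fixes the "not working" symptom. A sketch (which lines are wanted - unique vs. common - is an assumption, since the sample input is truncated):

```shell
# Both files must be sorted with the SAME collation comm will use.
# -23 suppresses columns 2 and 3, leaving lines unique to file1;
# -12 leaves only the lines common to both files.
LC_ALL=C comm -23 file1 file2   # in file1 only
LC_ALL=C comm -12 file1 file2   # in both (the duplicates to ignore)
```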
I have tried the following code, and with it I couldn't achieve what I want.
#!/usr/bin/bash
find ./ -type f \( -iname "*.xml" \) | sort -n > fileList
sed -i '/\.\/fileList/d' fileList
NAMEOFTHISFILE=$(echo $0|sed -e 's/\/()$*.^|/\\&/g')
sed -i "/$NAMEOFTHISFILE/d"... (2 Replies)
I would like to find and delete old backup files on AIX. How would I go about doing this? For example:
server1_1-20-2020
server1_1-21-2020
server1_1-22-2020
server1_1-23-2020
server2_1-20-2020
server2_1-21-2020
server2_1-22-2020
server2_1-23-2020
How would I go about finding and... (3 Replies)
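For the file list above, a minimal sketch based on modification time rather than parsing the date out of the name (the directory and the 30-day cutoff are assumptions; AIX find supports -mtime and -exec, but not GNU's -delete):

```shell
# Dry run first: see what would be removed.
find /backups -type f -name 'server*_*' -mtime +30 -print
# Then actually delete, using -exec for portability to AIX:
find /backups -type f -name 'server*_*' -mtime +30 -exec rm -f {} \;
```

If deletion must key on the date embedded in the filename instead of the mtime, the names would need parsing first, since 1-20-2020 does not sort chronologically as a string.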
Discussion started by: cokedude
LEARN ABOUT CENTOS
hardlink
hardlink(1)                    General Commands Manual                    hardlink(1)
NAME
hardlink - Consolidate duplicate files via hardlinks
SYNOPSIS
hardlink [-c] [-n] [-v] [-vv] [-h] directory1 [ directory2 ... ]
DESCRIPTION
This manual page documents hardlink, a program which consolidates duplicate files in one or more directories using hardlinks.
hardlink traverses one or more directories searching for duplicate files. When it finds duplicate files, it uses one of them as the master. It then removes all other duplicates and places a hardlink for each one pointing to the master file. This allows for conservation of disk space where multiple directories on a single filesystem contain many duplicate files.
Since hard links can only span a single filesystem, hardlink is only useful when all directories specified are on the same filesystem.
OPTIONS
-c Compare only the contents of the files being considered for consolidation. Disregards permission, ownership and other differences.
-f Force hardlinking across file systems.
-n Do not perform the consolidation; only print what would be changed.
-v Print summary after hardlinking.
-vv Print every hardlinked file and bytes saved. Also print summary after hardlinking.
-h Show help.
AUTHOR
hardlink was written by Jakub Jelinek <jakub@redhat.com>.
Man page written by Brian Long.
Man page updated by Jindrich Novy <jnovy@redhat.com>.
BUGS
hardlink assumes that its target directory trees do not change from under it. If a directory tree does change, this may result in hardlink
accessing files and/or directories outside of the intended directory tree. Thus, you must avoid running hardlink on potentially changing
directory trees, and especially on directory trees under control of another user.
hardlink(1)
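hardlink itself may not be installed on every system, but the mechanism it relies on can be reproduced for a single pair of duplicates with plain ln; this sketch (made-up filenames, GNU stat for the inode display) shows the "one inode, two names" result described above:

```shell
# Two files with identical content...
printf 'same content\n' > master.txt
printf 'same content\n' > copy.txt
# ...consolidated the way hardlink does it: verify the contents match,
# then replace the duplicate with a hard link to the master.
cmp -s master.txt copy.txt && ln -f master.txt copy.txt
# Both names now share one inode, so the data is stored only once:
stat -c '%i %n' master.txt copy.txt
```

As the DESCRIPTION notes, this only works within one filesystem, which is exactly why hardlink's -f (cross-filesystem) option cannot use true hard links everywhere.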