Shell Programming and Scripting: Duplicate files and output list
Post 302720181 by rdrtx1, Tuesday 23rd of October 2012 02:16:33 PM
If you mean duplicate lines, then use:
Code:
uniq -d
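Note that uniq only compares adjacent lines (see the man page below), so if the duplicates may be scattered through the file, sort it first. A minimal example, where file.txt is a placeholder name:
Code:
sort file.txt | uniq -d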

 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Removing duplicate files from list with different path

I have a list which contains all the jar files shipped with the product I am involved with. In this list some jar files appear again and again, but under different folders. My input file looks like this:
/path/1/to a.jar
/path/2/to a.jar
/path/1/to... (10 Replies)
Discussion started by: vino
10 Replies
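A minimal sketch for this one, assuming the list lives in a file named jars.txt (a hypothetical name) with one full path per line: group by basename and print names that occur more than once.
Code:
# split on "/" so $NF is the jar's basename; count occurrences across all paths
awk -F/ '{ n[$NF]++ } END { for (j in n) if (n[j] > 1) print j }' jars.txt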

2. Shell Programming and Scripting

Dynamically redirect output to duplicate files ???

Hi, there are many posts in this forum regarding redirecting output, but mine is a different problem, please have a look. My shell script redirects output to a log file dynamically, that is, it uses exec > log1.txt 2>&1. Hence all the traces appear in log1.txt. I want... (3 Replies)
Discussion started by: nsinha
3 Replies
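One way to duplicate the stream into two logs is to route the exec redirection through tee. A sketch assuming bash (the second log name is an assumption):
Code:
# bash process substitution: stdout and stderr go to both logs
# (tee's own stdout also echoes everything to the terminal)
exec > >(tee log1.txt log2.txt) 2>&1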

3. Shell Programming and Scripting

Find duplicate value comparing 2 files and create an output

I need a perl script which will create an output file after comparing two different files in a directory path:
/export/home/abc/file1
/export/home/abc/file2
File format: <IP><TAB><DeviceName><TAB><DESCRIPTIONS>
file1: 10.1.2.1.3<tab>abc123def<tab>xyz.mm1.ppp.... (2 Replies)
Discussion started by: ricky007
2 Replies
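The post asks for perl, but a hedged awk sketch of the core comparison, keyed on the tab-separated IP field, could look like this (dup_report.txt is a hypothetical output name):
Code:
# first pass records IPs from file1; second pass prints file2 lines whose IP was seen
awk -F'\t' 'NR==FNR { seen[$1]; next } $1 in seen' \
    /export/home/abc/file1 /export/home/abc/file2 > dup_report.txt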

4. Shell Programming and Scripting

I need a script to find socials in files and output a list of those files

I am trying to find social security numbers in files in (and under) a specific directory and output a list of the files where they are found... the format would be with no dashes, just 9 numeric characters in a row. I have tried this:
find /DirToLookIn -exec grep '\{9\}' /dev/null {} \; >>... (1 Reply)
Discussion started by: NewSolarisAdmin
1 Replies
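The quoted pattern repeats nothing, so grep rejects it; the interval needs an atom such as [0-9] in front. A corrected sketch (directory name taken from the post, output file name an assumption; note this also matches any 9-digit run that is not an SSN):
Code:
# -l prints only the names of files containing a run of 9 digits
find /DirToLookIn -type f -exec grep -l '[0-9]\{9\}' {} + > ssn_files.txt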

5. Shell Programming and Scripting

list files command output

Hi All, below are two different outputs of the command "ls -lrt"; my question is what exactly "total 0" and "total 8" mean here?
$ ls -rtl
total 0
-rw-r--r-- 1 oracle dba 0 Feb 10 20:16 c
-rw-r--r-- 1 oracle dba 0 Feb 10 20:16 b
-rw-r--r-- 1... (1 Reply)
Discussion started by: kannan84
1 Replies

6. Shell Programming and Scripting

How to process select list of files and output to the same file?

Hi, I have a list of files: ac_info.tps, subscription_array.tps, ....... and many others. One of the files, bin_range_list.tps, has the following content:
CREATE OR REPLACE TYPE "BIN_RANGE_LIST" AS TABLE OF BIN_RANGE_ELEM;
/
grant execute on... (4 Replies)
Discussion started by: jediwannabe
4 Replies
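A hedged sketch of processing each file and writing the result back to the same file via a temporary copy (the sed expression is only a placeholder for whatever transformation is actually needed):
Code:
for f in *.tps; do
    # edit to a temp file first, then replace the original only on success
    sed 's/OLD/NEW/g' "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done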

7. Shell Programming and Scripting

List duplicate files based on Name and size

Hello, I have a huge directory (with millions of files) and need to find duplicates based on BOTH file name and file size. I know fdupes, but it calculates MD5, which is very time-consuming and takes forever with millions of files. Can anyone please suggest a script or... (7 Replies)
Discussion started by: prvnrk
7 Replies
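A checksum-free sketch: key each file on basename plus size and report keys that occur more than once. GNU find is assumed for -printf, and the "|" separator breaks on file names that contain it:
Code:
find . -type f -printf '%f|%s|%p\n' |
awk -F'|' '{ k = $1 FS $2; n[k]++; p[k] = p[k] "\n  " $3 }
           END { for (k in n) if (n[k] > 1) print "dup " k p[k] }'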

8. Shell Programming and Scripting

List files output only for the last line

Hi, "ls -tl directory1" will list files to be sorted in mtime, but I don't want to see all the files in each directory, I want to only see output the last line (the oldest mtime) for each directory. $ ls -tl test1 -rw-r--r-- 1 hce hce 1714397 May 30 2013 b.txt -rw-r--r-- 1 hce hce 4678 May... (2 Replies)
Discussion started by: hce
2 Replies
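Since ls -t puts the newest entry first, the oldest is simply the last line, so a per-directory sketch might be (test1 is from the post, test2 a placeholder):
Code:
for d in test1 test2; do
    # tail -1 keeps only the oldest entry in each listing
    printf '%s: %s\n' "$d" "$(ls -tl "$d" | tail -1)"
done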

9. Shell Programming and Scripting

How to list files names and sizes in a directory and output result to the file?

Hi, I'm trying to list files and write the output to a file, but when I execute the command, the output file itself gets listed. How do I exclude it?
/tmp
file1.txt
file2.txt
ls -ltr |grep -v '-' | awk print {$9, $5} > output.txt
cat output.txt
file1.txt
file2.txt
output.txt (8 Replies)
Discussion started by: etldeveloper
8 Replies
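Two issues there: the awk program must be quoted, and output.txt appears because the shell creates it before ls runs. A sketch that quotes the program and filters the output file by name (the NF guard also skips the "total" line):
Code:
ls -ltr | awk 'NF >= 9 && $9 != "output.txt" { print $9, $5 }' > output.txt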

10. UNIX for Beginners Questions & Answers

Using find to output list of files with specific strings

This is my problem: I am using the following code to extract the names of files containing the specific string 0.01:
find ./ -name "*.txt" -exec grep -H '0.01' {} +
It works wonders with a small sample. However, when I use it in a real scenario it produces an empty file, even though I am sure there are... (11 Replies)
Discussion started by: Xterra
11 Replies
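One thing worth noting: in the pattern '0.01' the dot matches any character, so it also hits strings like 0501. For a literal match, grep -F treats the string literally and -l keeps the output to file names only; a hedged variant:
Code:
find . -name '*.txt' -exec grep -Fl '0.01' {} +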
uniq(1)                          User Commands                         uniq(1)

NAME
       uniq - report or filter out repeated lines in a file

SYNOPSIS
       uniq [-c | -d | -u] [-f fields] [-s chars] [input_file [output_file]]

       uniq [-c | -d | -u] [-n] [+m] [input_file [output_file]]

DESCRIPTION
       The uniq utility will read an input file comparing adjacent lines, and
       write one copy of each input line on the output. The second and
       succeeding copies of repeated adjacent input lines will not be
       written. Repeated lines in the input will not be detected if they are
       not adjacent.

OPTIONS
       The following options are supported:

       -c          Precedes each output line with a count of the number of
                   times the line occurred in the input.

       -d          Suppresses the writing of lines that are not repeated in
                   the input.

       -f fields   Ignores the first fields fields on each input line when
                   doing comparisons, where fields is a positive decimal
                   integer. A field is the maximal string matched by the
                   basic regular expression: [[:blank:]]*[^[:blank:]]*
                   If fields specifies more fields than appear on an input
                   line, a null string will be used for comparison.

       -s chars    Ignores the first chars characters when doing comparisons,
                   where chars is a positive decimal integer. If specified in
                   conjunction with the -f option, the first chars characters
                   after the first fields fields will be ignored. If chars
                   specifies more characters than remain on an input line, a
                   null string will be used for comparison.

       -u          Suppresses the writing of lines that are repeated in the
                   input.

       -n          Equivalent to -f fields with fields set to n.

       +m          Equivalent to -s chars with chars set to m.

OPERANDS
       The following operands are supported:

       input_file    A path name of the input file. If input_file is not
                     specified, or if the input_file is -, the standard input
                     will be used.

       output_file   A path name of the output file. If output_file is not
                     specified, the standard output will be used. The results
                     are unspecified if the file named by output_file is the
                     file named by input_file.

EXAMPLES
       Example 1: Using the uniq command

       The following example lists the contents of the uniq.test file and
       outputs a copy of the repeated lines.

         example% cat uniq.test
         This is a test.
         This is a test.
         TEST.
         Computer.
         TEST.
         TEST.
         Software.

         example% uniq -d uniq.test
         This is a test.
         TEST.
         example%

       The next example outputs just those lines that are not repeated in the
       uniq.test file.

         example% uniq -u uniq.test
         TEST.
         Computer.
         Software.
         example%

       The last example outputs a report with each line preceded by a count
       of the number of times each line occurred in the file:

         example% uniq -c uniq.test
            2 This is a test.
            1 TEST.
            1 Computer.
            2 TEST.
            1 Software.
         example%

ENVIRONMENT VARIABLES
       See environ(5) for descriptions of the following environment variables
       that affect the execution of uniq: LANG, LC_ALL, LC_CTYPE,
       LC_MESSAGES, and NLSPATH.

EXIT STATUS
       The following exit values are returned:

       0     Successful completion.

       >0    An error occurred.

ATTRIBUTES
       See attributes(5) for descriptions of the following attributes:

       +-----------------------------+-----------------------------+
       |       ATTRIBUTE TYPE        |       ATTRIBUTE VALUE       |
       +-----------------------------+-----------------------------+
       |Availability                 |SUNWesu                      |
       +-----------------------------+-----------------------------+
       |CSI                          |Enabled                      |
       +-----------------------------+-----------------------------+
       |Interface Stability          |Standard                     |
       +-----------------------------+-----------------------------+

SEE ALSO
       comm(1), pack(1), pcat(1), sort(1), uncompress(1), attributes(5),
       environ(5), standards(5)

SunOS 5.10                        20 Dec 1996                          uniq(1)