How to: Find duplicate number from file? with bash
Post 302268042 by cfajohnson, 12-15-2008, 12:12 AM

What is the format of the file?

What do you want to output? The duplicates? Only one instance of every number? The non-duplicated numbers?
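Depending on those answers, a standard sort/uniq pipeline usually covers all three cases. A minimal sketch, assuming one number per line in a file called numbers.txt (the filename is only for illustration):

    # print each duplicated number once
    sort numbers.txt | uniq -d

    # print one instance of every number (deduplicate)
    sort -u numbers.txt

    # print only the numbers that appear exactly once
    sort numbers.txt | uniq -u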
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Counting The Number Of Duplicate Lines In a File

Hello. First time poster here. I have a huge file of IP numbers. I am trying to output only the class b of the IPs and rank them by most common and output the total # of duplicate class b's before the class b. An example is below: 12.107.1.1 12.107.9.54 12.108.3.89 12.109.109.4 12.109.6.3 ... (2 Replies)
Discussion started by: crunchtime
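One way to attack that request, sketched with cut/sort/uniq (ips.txt is a hypothetical filename, one IPv4 address per line):

    # keep the first two octets (the class B part), count, and rank by frequency;
    # uniq -c prints the count before each class B, as the poster asked
    cut -d. -f1-2 ips.txt | sort | uniq -c | sort -rn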

2. Shell Programming and Scripting

Bash Script duplicate file names

I am trying to write a housekeeping bash script. Part of it involves searching all of my attached storage media for photographs and moving them into a single directory. The problem occurs when files have duplicate names, obviously a file called 001.jpg will get overwritten with another file... (6 Replies)
Discussion started by: stumpyuk
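A common fix for the overwrite problem is to append a counter when the target name is already taken. A minimal sketch, with hypothetical source and destination paths:

    #!/bin/bash
    # gather jpegs into one directory, renaming on name collisions
    dest=$HOME/photos
    mkdir -p "$dest"
    find /media -type f -iname '*.jpg' | while IFS= read -r f; do
        base=$(basename "$f")
        target=$dest/$base
        n=1
        while [ -e "$target" ]; do      # 001.jpg exists: try 001-1.jpg, 001-2.jpg, ...
            target=$dest/${base%.*}-$n.${base##*.}
            n=$((n+1))
        done
        mv "$f" "$target"
    done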

3. Shell Programming and Scripting

find out duplicate records in file?

Dear All, I have one file which looks like: account1:passwd1 account2:passwd2 account3:passwd3 account1:passwd4 account5:passwd5 account6:passwd6 You can see there are two records for account1. Is there any shell command which can find out that account1 is the duplicate record in... (3 Replies)
Discussion started by: tiger2000
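awk can flag a first field it has seen before. A minimal sketch, assuming the colon-separated records live in a hypothetical accounts.txt:

    # report each account name on its second occurrence
    awk -F: 'seen[$1]++ == 1 { print $1 " is a duplicate record" }' accounts.txt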

4. Shell Programming and Scripting

Find Duplicate records in first Column in File

Hi, Need to find duplicate records on the first column, ANU4501710430989 0000000W20389390 ANU4501710430989 0000000W67065483 ANU4501130050520 0000000W80838713 ANU4501210170685 0000000W69246611... (3 Replies)
Discussion started by: Murugesh
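The same seen-array idea works for whitespace-separated columns; this one-liner prints every record whose first column has already appeared (file.txt is hypothetical):

    # print the second and later records for each repeated first column
    awk 'seen[$1]++' file.txt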

5. Shell Programming and Scripting

Find the number of non-duplicate names recursively.

Hi, here comes another newbie question: how do I find the number of non-duplicate names recursively? For example, my files are stored in folders like: If I do find . -depth -name "*.txt" | wc -l this gives a result of "4". One .txt file named "1.txt" in folder "1", and... (2 Replies)
Discussion started by: jiapei100
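Counting distinct names rather than files means stripping the directory part before deduplicating. A minimal sketch:

    # count distinct .txt basenames under the current directory
    find . -type f -name '*.txt' -exec basename {} \; | sort -u | wc -l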

6. UNIX for Dummies Questions & Answers

CSV file:Find duplicates, save original and duplicate records in a new file

Hi Unix gurus, Maybe it is too much to ask for but please take a moment and help me out. A very humble request to you gurus. I'm new to Unix and I have started learning Unix. I have this project which is way to advanced for me. File format: CSV file File has four columns with no header... (8 Replies)
Discussion started by: arvindosu
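Since both the originals and their duplicates are wanted, a two-pass awk works: count the keys on the first pass, then keep every row whose key repeats. A sketch assuming the duplicate key is the first column of a hypothetical data.csv:

    # pass 1 counts first-column keys; pass 2 prints all rows with a repeated key
    awk -F, 'NR==FNR { cnt[$1]++; next } cnt[$1] > 1' data.csv data.csv > dupes.csv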

7. Shell Programming and Scripting

Find duplicate based on 'n' fields and mark the duplicate as 'D'

Hi, In a file, I have to mark duplicate records as 'D' and the latest record alone as 'C'. In the below file, I have to identify if duplicate records are there or not based on Man_ID, Man_DT, Ship_ID and I have to mark the record with latest Ship_DT as "C" and other as "D" (I have to create... (7 Replies)
Discussion started by: machomaddy
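One approach: sort so the latest Ship_DT comes first within each key, then mark the first row of each key 'C' and the rest 'D'. A sketch assuming comma-separated fields Man_ID, Man_DT, Ship_ID, Ship_DT in columns 1-4 and dates in a lexically sortable format (both are assumptions, not from the post):

    sort -t, -k1,1 -k2,2 -k3,3 -k4,4r data.csv |
    awk -F, '
        { key = $1 FS $2 FS $3 }
        key == prev { print $0 ",D"; next }   # repeated key: older shipment
        { print $0 ",C"; prev = key }         # first row per key: latest Ship_DT
    '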

8. Shell Programming and Scripting

bash keep only duplicate lines in file

hello all, in my bash script I have a file and I only want to keep the lines that appear twice in the file. Is there a way to do this? thanks in advance! (4 Replies)
Discussion started by: vlm
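If "appear twice" means duplicated at all, uniq -d is enough; if it means exactly twice, count first. A sketch with a hypothetical file.txt:

    # one copy of every line that appears more than once
    sort file.txt | uniq -d

    # only the lines that appear exactly twice
    sort file.txt | uniq -c | awk '$1 == 2 { sub(/^ *[0-9]+ /, ""); print }'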

9. Shell Programming and Scripting

Bash script to find the number of files and identify which ones are 0 bytes.

I am writing a bash script to find out all the files in a directory which are empty. I am running into multiple issues. I will really appreciate if someone can please help me. #!/bin/bash DATE=$(date +%m%d%y) TIME=$(date +%H%M) DIR="/home/statsetl/input/civil/test" ... (1 Reply)
Discussion started by: monasharma13
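find can do both the counting and the zero-byte check; a minimal sketch reusing the directory from the post:

    #!/bin/bash
    dir=/home/statsetl/input/civil/test
    printf 'total files: %s\n' "$(find "$dir" -type f | wc -l)"
    echo 'empty files:'
    find "$dir" -type f -size 0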

10. Shell Programming and Scripting

Bash Script to find/sort/move images/duplicate images from USB drive

Ultimately, I'm looking to create a script that allows me to plug in a usb drive with lots of jpegs on it & copy them over to a folder on my hard drive. So in the process of copying I am looking to hash check them, record dupes to a file, copy only 1 of the identical files (if it doesn't exist... (1 Reply)
Discussion started by: JonaQuinn
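The hash-check part can be done with md5sum and a bash 4 associative array: copy a file only when its hash is new, otherwise log it as a duplicate. A sketch with hypothetical mount point and destination:

    #!/bin/bash
    declare -A seen                             # hash -> already copied?
    while IFS= read -r f; do
        h=$(md5sum "$f" | awk '{ print $1 }')
        if [ -z "${seen[$h]}" ]; then
            seen[$h]=1
            cp "$f" "$HOME/photos/"             # first file with this content
        else
            printf '%s\n' "$f" >> dupes.log     # identical content seen before
        fi
    done < <(find /media/usb -type f -iname '*.jpg')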
MESSAGES(3)                 libbash messages Library Manual                 MESSAGES(3)

NAME
     messages -- libbash library that implements a set of functions to print
     standard status messages

SYNOPSIS
     printOK [indent]
     printFAIL [indent]
     printNA [indent]
     printATTN [indent]
     printWAIT [indent]

DESCRIPTION
   General
     messages is a collection of functions that print standard status
     messages -- the [ OK ] and [FAIL] messages you see during the Linux boot
     process.

     The function list:

     printOK      Prints a standard [ OK ] message (green)
     printFAIL    Prints a standard [FAIL] message (red)
     printNA      Prints a standard [ N/A] message (yellow)
     printATTN    Prints a standard [ATTN] message (yellow)
     printWAIT    Prints a standard [WAIT] message (yellow)

     A detailed interface description follows.

     indent    Column to move to before printing. The default indent is
               calculated as TTY_WIDTH-10. If the current tty width cannot be
               determined (for example, on a serial console), it defaults to
               80, so the default indent is 80-10=70.

FUNCTIONS DESCRIPTIONS
     printOK [indent]
              Prints a standard [ OK ] message (green)
     printFAIL [indent]
              Prints a standard [FAIL] message (red)
     printNA [indent]
              Prints a standard [ N/A] message (yellow)
     printATTN [indent]
              Prints a standard [ATTN] message (yellow)
     printWAIT [indent]
              Prints a standard [WAIT] message (yellow)

EXAMPLES
     Run a program named MyProg and report its success or failure:

         echo -n 'Running MyProg...'
         printWAIT
         if MyProg ; then
             printOK
         else
             printFAIL
         fi

AUTHORS
     Hai Zaar <haizaar@haizaar.com>
     Gil Ran <gil@ran4.net>

SEE ALSO
     ldbash(1), libbash(1)