Shell Programming and Scripting · Post 302267982 by avklinux · Sunday 14th of December 2008, 09:01 PM
How to : Find duplicate number from file? with bash

Thanks
AVKlinux
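
Going by the thread title, the usual shell idiom is to sort the values and let uniq report the repeats. A minimal sketch, assuming one number per line in a hypothetical numbers.txt:

Code:
#!/bin/bash
# Print each number that occurs more than once in numbers.txt
# (assumes one number per line).
sort -n numbers.txt | uniq -d

# Or show how many times each duplicate occurs:
sort -n numbers.txt | uniq -cd

uniq only detects repeats on adjacent lines, which is why the sort comes first.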
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Counting The Number Of Duplicate Lines In a File

Hello. First time poster here. I have a huge file of IP numbers. I am trying to output only the class b of the IPs and rank them by most common and output the total # of duplicate class b's before the class b. An example is below: 12.107.1.1 12.107.9.54 12.108.3.89 12.109.109.4 12.109.6.3 ... (2 Replies)
Discussion started by: crunchtime
2 Replies
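
For a problem like this one, a short pipeline is usually enough: cut out the first two octets, then count and rank them. A sketch, assuming the addresses sit one per line in a hypothetical ips.txt:

Code:
#!/bin/bash
# Print "count  a.b" for each class-B prefix, most common first.
cut -d. -f1,2 ips.txt | sort | uniq -c | sort -rn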

2. Shell Programming and Scripting

Bash Script duplicate file names

I am trying to write a housekeeping bash script. Part of it involves searching all of my attached storage media for photographs and moving them into a single directory. The problem occurs when files have duplicate names, obviously a file called 001.jpg will get overwritten with another file... (6 Replies)
Discussion started by: stumpyuk
6 Replies
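
One way to avoid the overwrite is to test the target name and append a counter before moving. A sketch, with hypothetical source and destination paths:

Code:
#!/bin/bash
# Move every *.jpg under $SRC into $DEST, adding a numeric suffix
# whenever the target name is already taken (001.jpg -> 001_1.jpg, ...).
SRC=/media              # hypothetical source mount point
DEST="$HOME/photos"     # hypothetical destination directory

mkdir -p "$DEST"
find "$SRC" -type f -iname '*.jpg' -print0 |
while IFS= read -r -d '' f; do
    base=$(basename "$f")
    name=${base%.*}
    ext=${base##*.}
    target="$DEST/$base"
    n=1
    while [ -e "$target" ]; do
        target="$DEST/${name}_${n}.${ext}"
        n=$((n + 1))
    done
    mv -- "$f" "$target"
done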

3. Shell Programming and Scripting

find out duplicate records in file?

Dear All, I have one file which looks like : account1:passwd1 account2:passwd2 account3:passwd3 account1:passwd4 account5:passwd5 account6:passwd6 you can see there're two records for account1. and is there any shell command which can find out : account1 is the duplicate record in... (3 Replies)
Discussion started by: tiger2000
3 Replies
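
With colon-separated account:passwd records, the duplicate keys can be pulled out of field 1. A sketch, using a hypothetical accounts.txt:

Code:
#!/bin/bash
# Print each account name (field 1) that appears on more than one line.
cut -d: -f1 accounts.txt | sort | uniq -d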

4. Shell Programming and Scripting

Find Duplicate records in first Column in File

Hi, Need to find a duplicate records on the first column, ANU4501710430989 0000000W20389390 ANU4501710430989 0000000W67065483 ANU4501130050520 0000000W80838713 ANU4501210170685 0000000W69246611... (3 Replies)
Discussion started by: Murugesh
3 Replies
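
Printing the full records whose first column repeats can be done with a two-pass awk: count the keys on the first pass, print matching lines on the second. A sketch against a hypothetical file.txt:

Code:
#!/bin/bash
# First pass counts column-1 values; second pass prints every line
# whose column-1 value occurred more than once.
awk 'NR == FNR { count[$1]++; next } count[$1] > 1' file.txt file.txt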

5. Shell Programming and Scripting

Find the number of non-duplicate names recursively.

Hi, here comes another newbie question: How to find the number of non-duplicate names recursively? For example, my files are stored in the folders like: If I do find . -depth -name "*.txt" | wc -l This will gives out a result "4". One .txt file named "1.txt" in folder "1", and... (2 Replies)
Discussion started by: jiapei100
2 Replies
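
If the goal is to count file names rather than files, the base names can be deduplicated before counting. A sketch covering both readings of "non-duplicate" (distinct names vs. names that occur only once):

Code:
#!/bin/bash
# Distinct *.txt base names under the current tree
# (./1/1.txt and ./2/1.txt count as one name):
find . -type f -name '*.txt' -exec basename {} \; | sort -u | wc -l

# Base names that occur exactly once anywhere in the tree:
find . -type f -name '*.txt' -exec basename {} \; | sort | uniq -u | wc -l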

6. UNIX for Dummies Questions & Answers

CSV file:Find duplicates, save original and duplicate records in a new file

Hi Unix gurus, Maybe it is too much to ask for but please take a moment and help me out. A very humble request to you gurus. I'm new to Unix and I have started learning Unix. I have this project which is way to advanced for me. File format: CSV file File has four columns with no header... (8 Replies)
Discussion started by: arvindosu
8 Replies
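
Without the rest of the spec, one reasonable reading is "write every record that occurs more than once, original included, to a new file". A sketch that compares whole lines in a hypothetical input.csv (to key on a single column instead, add -F',' and replace $0 with, say, $1):

Code:
#!/bin/bash
# Pass 1 counts each record; pass 2 writes originals plus their duplicates.
awk 'NR == FNR { count[$0]++; next } count[$0] > 1' input.csv input.csv > dups.csv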

7. Shell Programming and Scripting

Find duplicate based on 'n' fields and mark the duplicate as 'D'

Hi, In a file, I have to mark duplicate records as 'D' and the latest record alone as 'C'. In the below file, I have to identify if duplicate records are there or not based on Man_ID, Man_DT, Ship_ID and I have to mark the record with latest Ship_DT as "C" and other as "D" (I have to create... (7 Replies)
Discussion started by: machomaddy
7 Replies
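
The exact layout isn't shown in the preview, so the following is only a sketch with assumed pipe-separated fields $1=Man_ID, $2=Man_DT, $3=Ship_ID, $4=Ship_DT: sort so the newest Ship_DT per key comes last, then flag the last record of each group as C and the earlier ones as D.

Code:
#!/bin/bash
# Sort by key fields and shipment date, then mark the final record of
# each Man_ID/Man_DT/Ship_ID group as "C" and earlier ones as "D".
sort -t'|' -k1,1 -k2,2 -k3,3 -k4,4 records.txt |
awk -F'|' '
    { lines[NR] = $0; key[NR] = $1 FS $2 FS $3 }
    END {
        for (i = 1; i <= NR; i++) {
            flag = (i == NR || key[i] != key[i + 1]) ? "C" : "D"
            print lines[i] FS flag
        }
    }' > records_flagged.txt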

8. Shell Programming and Scripting

bash keep only duplicate lines in file

hello all in my bash script I have a file and I only want to keep the lines that appear twice in the file.Is there a way to do this? thanks in advance! (4 Replies)
Discussion started by: vlm
4 Replies
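
If order doesn't matter, sort file | uniq -d lists one copy of each repeated line; keeping every occurrence in the original order takes a two-pass awk. A sketch, with a hypothetical file.txt:

Code:
#!/bin/bash
# Keep every line that appears more than once, in its original order.
awk 'NR == FNR { count[$0]++; next } count[$0] > 1' file.txt file.txt

# If "twice" means exactly two occurrences, test for equality instead:
awk 'NR == FNR { count[$0]++; next } count[$0] == 2' file.txt file.txt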

9. Shell Programming and Scripting

Bash script to find the number of files and identify which ones are 0 bytes.

I am writing a bash script to find out all the files in a directory which are empty. I am running into multiple issues. I will really appreciate if someone can please help me. #!/bin/bash DATE=$(date +%m%d%y) TIME=$(date +%H%M) DIR="/home/statsetl/input/civil/test" ... (1 Reply)
Discussion started by: monasharma13
1 Replies
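
find can handle both jobs: count the entries and pick out the empty ones with -size 0. A sketch using the directory named in the post:

Code:
#!/bin/bash
DIR="/home/statsetl/input/civil/test"

# Total number of regular files directly inside $DIR
total=$(find "$DIR" -maxdepth 1 -type f | wc -l)
echo "Total files in $DIR: $total"

# List the zero-byte files
echo "Empty (0-byte) files:"
find "$DIR" -maxdepth 1 -type f -size 0 -print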

10. Shell Programming and Scripting

Bash Script to find/sort/move images/duplicate images from USB drive

Ultimately, I'm looking to create a script that allows me to plug in a usb drive with lots of jpegs on it & copy them over to a folder on my hard drive. So in the process of copying I am looking to hash check them, record dupes to a file, copy only 1 of the identical files (if it doesn't exsist... (1 Reply)
Discussion started by: JonaQuinn
1 Replies
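
A hash-keyed associative array (bash 4+) is one way to copy only the first file with a given content and log the rest. A sketch, with hypothetical mount point and destination:

Code:
#!/bin/bash
# Copy *.jpg from the USB mount into $DEST, skipping files whose md5
# hash has already been seen; duplicates are recorded in dupes.log.
SRC=/media/usb          # hypothetical mount point
DEST="$HOME/pictures"   # hypothetical destination
LOG="$DEST/dupes.log"

mkdir -p "$DEST"
declare -A seen         # md5 hash -> first file copied with that content

while IFS= read -r -d '' f; do
    hash=$(md5sum "$f" | awk '{print $1}')
    if [[ -n ${seen[$hash]} ]]; then
        echo "DUPLICATE: $f (same content as ${seen[$hash]})" >> "$LOG"
    else
        seen[$hash]=$f
        cp -- "$f" "$DEST/"
    fi
done < <(find "$SRC" -type f -iname '*.jpg' -print0)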
RBASH(1)						      General Commands Manual							  RBASH(1)

NAME
       rbash - restricted bash, see bash(1)

RESTRICTED SHELL
       If bash is started with the name rbash, or the -r option is supplied at
       invocation, the shell becomes restricted. A restricted shell is used to
       set up an environment more controlled than the standard shell. It
       behaves identically to bash with the exception that the following are
       disallowed or not performed:

       o  changing directories with cd
       o  setting or unsetting the values of SHELL, PATH, ENV, or BASH_ENV
       o  specifying command names containing /
       o  specifying a file name containing a / as an argument to the . builtin command
       o  specifying a filename containing a slash as an argument to the -p option to the hash builtin command
       o  importing function definitions from the shell environment at startup
       o  parsing the value of SHELLOPTS from the shell environment at startup
       o  redirecting output using the >, >|, <>, >&, &>, and >> redirection operators
       o  using the exec builtin command to replace the shell with another command
       o  adding or deleting builtin commands with the -f and -d options to the enable builtin command
       o  using the enable builtin command to enable disabled shell builtins
       o  specifying the -p option to the command builtin command
       o  turning off restricted mode with set +r or set +o restricted

       These restrictions are enforced after any startup files are read. When a
       command that is found to be a shell script is executed, rbash turns off
       any restrictions in the shell spawned to execute the script.

SEE ALSO
       bash(1)

GNU Bash-4.0                        2004 Apr 20                        RBASH(1)