Shell Programming and Scripting: fastest way to remove duplicates
Post 76001 by vino, Friday 24 June 2005, 01:27 AM
Just a thought.

Why not use a divide-and-conquer approach? Split the file into pieces, remove the duplicates from each piece, then merge the results (see the sketch below).

Vino

Last edited by vino; 06-24-2005 at 04:46 AM.
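
For illustration, a minimal sketch of one divide-and-conquer dedup (not from the original post; it assumes GNU split/sort and a hypothetical input file bigfile.txt):

    #!/bin/sh
    # Split the input into 1M-line chunks: chunk.aa, chunk.ab, ...
    split -l 1000000 bigfile.txt chunk.
    # Sort and dedup each chunk independently (these could run in parallel).
    for f in chunk.*; do
        sort -u "$f" -o "$f"
    done
    # Merge the already-sorted chunks, dropping duplicates across chunks.
    sort -m -u chunk.* > deduped.txt
    rm -f chunk.*

Note that modern GNU sort already performs an external merge sort internally, so a plain sort -u bigfile.txt > deduped.txt is often just as fast; splitting mainly pays off when the chunks can be processed in parallel or on separate machines.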
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

how to delete/remove directory in fastest way

Hello, I need help removing a directory. The directory is not empty; it contains several subdirectories and files. The total number of files in one directory is 12,24,446. rm -rf doesn't work; it prompts for every file. I want to delete without prompting and... (6 Replies)
Discussion started by: getdpg
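When rm -rf prompts for every file, rm is usually aliased to rm -i. A hedged sketch of two common workarounds (the directory name is hypothetical; -delete assumes GNU find):

    \rm -rf /path/to/bigdir          # the backslash bypasses a shell alias such as rm='rm -i'
    find /path/to/bigdir -delete     # GNU find: depth-first delete, no prompting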

2. UNIX for Dummies Questions & Answers

How to remove duplicates without sorting

Hello, I can remove duplicate entries in a file with sort File1 | uniq > File2, but how can I remove duplicates without sorting the file? I tried cat File1 | uniq > File2 but it doesn't work. Thanks. (4 Replies)
Discussion started by: orahi001
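The classic answer here is awk, which keeps the first occurrence of every line without reordering the file (a sketch using the poster's File1/File2 names):

    awk '!seen[$0]++' File1 > File2   # print a line only the first time it appears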

3. Shell Programming and Scripting

Remove duplicates

Hello experts, I have two files named old and new; below are my example files. I need to compare them and print the records that exist only in my new file. I tried the awk script below; it works perfectly well when the records match exactly, but the issue I have is that my old file has got extra... (4 Replies)
Discussion started by: forumthreads
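A sketch of the usual two-file awk pattern; since the excerpt is truncated, it assumes the first field serves as the comparison key:

    # Remember the keys in old, then print new records whose key was never seen.
    awk 'NR==FNR { old[$1]; next } !($1 in old)' old new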

4. Shell Programming and Scripting

Remove duplicates from a file

Hi, I need to remove duplicates from a file. The file will be like this:
0003 10101 20100120 abcdefghi
0003 10101 20100121 abcdefghi
0003 10101 20100122 abcdefghi
0003 10102 20100120 abcdefghi
0003 10103 20100120 abcdefghi
0003 10103 20100121 abcdefghi
Here, if the first column and... (6 Replies)
Discussion started by: gpaulose
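Assuming the truncated sentence means records are duplicates when the first and second columns match, a one-line awk sketch:

    awk '!seen[$1,$2]++' infile   # keep the first record for each (column1, column2) pair

On the sample above this keeps the 20100120 record for 10101, plus the first records for 10102 and 10103.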

5. Shell Programming and Scripting

Script to remove duplicates

Hi, I need a script that removes the duplicate records and writes the result to a new file. For example, I have a file named test.txt that looks like abcd.23 abcd.24 abcd.25 qwer.25 qwer.26 qwer.98. I want to pick only $1, compare it with the next record, and the output should be abcd.23... (6 Replies)
Discussion started by: antointoronto
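Reading "$1" as the part before the dot, a sketch that keeps the first record for each prefix:

    awk -F. '!seen[$1]++' test.txt > new.txt   # with the sample data this keeps abcd.23 and qwer.25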

6. Shell Programming and Scripting

remove duplicates and sort

Hi, I'm using the command below to sort and remove duplicates in a file, but I need to apply the result to the same file instead of redirecting it to another. Thanks. (6 Replies)
Discussion started by: dvah
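sort can safely write back to its own input via -o, because it reads all input before opening the output file (the file name is hypothetical):

    sort -u -o data.txt data.txt   # sort and dedup data.txt in place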

7. Shell Programming and Scripting

Fastest way to delete duplicates from a large filelist.....

OK, I have two file lists. The first is formatted like this: /path/to/the/actual/file/location/filename.jpg and has up to a million records. The second list shows filename.jpg where there is more than one instance, and has maybe up to 65,000 records. I want to copy files... (4 Replies)
Discussion started by: Bashingaway
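Since the request is truncated, this sketch only shows the matching step: select the full-path lines whose basename appears in the duplicates list (file names are hypothetical):

    # dupes.txt holds bare names like filename.jpg; paths.txt holds full paths.
    awk 'NR==FNR { dup[$0]; next }
         { n = split($0, part, "/"); if (part[n] in dup) print }' dupes.txt paths.txt > matched.txt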

8. Shell Programming and Scripting

bash - remove duplicates

I need to use a bash script to remove duplicate files from a download list, but I cannot use uniq because the URLs are different. I need to go from this:
http://***/fae78fe/file1.wmv
http://***/39du7si/file1.wmv
http://***/d8el2hd/file2.wmv
http://***/h893js3/file2.wmv
to this: ... (2 Replies)
Discussion started by: locoroco
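Keying on the last /-separated field makes uniq unnecessary; a sketch assuming one URL per line in a hypothetical urls.txt:

    awk -F/ '!seen[$NF]++' urls.txt   # keep the first URL for each trailing filename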

9. Shell Programming and Scripting

Remove duplicates

I have a file with the following format, fields separated by "|":
title1|something class|long...content1|keys
title2|somhing class|log...content1|kes
title1|sothing class|lon...content1|kes
title3|shing cls|log...content1|ks
I want to remove all duplicates with the same "title field" (the... (3 Replies)
Discussion started by: dtdt
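With | as the field separator and the title as field 1, a one-line sketch:

    awk -F'|' '!seen[$1]++' file.txt   # keep the first record for each title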

10. Shell Programming and Scripting

Remove duplicates

Hi, I have the below file structure:
200,1245,E1,1,E1,,7611068,KWH,30, ,,,,,,,,
200,1245,E1,1,E1,,7611070,KWH,30, ,,,,,,,,
300,20140223,0.001,0.001,0.001,0.001,0.001
300,20140224,0.001,0.001,0.001,0.001,0.001
300,20140225,0.001,0.001,0.001,0.001,0.001
300,20140226,0.001,0.001,0.001,0.001,0.001... (1 Reply)
Discussion started by: tejashavele
cp(1) - General Commands Manual

Name
       cp - copy file data

Syntax
       cp [ -f ] [ -i ] [ -p ] file1 file2

       cp [ -f ] [ -i ] [ -p ] [ -r ] file... directory

       cp [ -f ] [ -i ] [ -p ] [ -r ] directory... directory

Description
       The cp command copies file1 onto file2.  The mode and owner of file2 are preserved if it already exists; otherwise the mode of file1 is used.
       Note that cp will not copy a file onto itself.

       In the second form, one or more files are copied into the directory with their original file names.

       In the third form, one or more source directories are copied into the destination directory with their original file names.

Options
       -f   Forces existing destination pathnames to be removed before copying, without prompting for confirmation.  The -i option is ignored if the -f option is specified.

       -i   Prompts the user with the name of the file whenever the copy would overwrite an existing file.  A yes answer causes cp to continue; any other answer prevents it from overwriting the file.

       -p   Preserves (duplicates) in the copies the modification time, access time, file mode, user ID, and group ID, as allowed by the permissions of the source files, ignoring the present umask.

       -r   Copies directories.  Entire directory trees, including their subtrees and the individual files they contain, are copied to the specified destination directory.  The directory, its subtrees, and the individual files retain their original names.  For example, to copy the directory reports, including all of its subtrees and files, into the directory news, enter the following command:
	    cp -r reports news
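
Examples
       The following commands are illustrative additions, not part of the original manual page; the file and directory names are hypothetical.

       cp notes.txt notes.bak
	    Copies one file onto another.

       cp -p report.txt /backup/report.txt
	    Copies a file, preserving its mode, ownership, and timestamps.

       cp -r projects /mnt/archive
	    Copies the projects directory tree into /mnt/archive.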

See Also
       cat(1), pr(1), mv(1)
