06-24-2005
Definitely, it's worth a try.
Precautions you can take:
1. Make sure all the columns used in the DISTINCT are indexed.
2. If it is one table, you need not worry about joins; otherwise, make sure the joins are written so that you get maximum throughput rather than least response time.
3. Run the query at a time when no other big activity is going on in the same table, because if the query runs long it can hit the "snapshot too old" rollback segment error (ORA-01555).
All the best.
10 More Discussions You Might Find Interesting
1. Shell Programming and Scripting
Hello,
I need help removing a directory. The directory is not empty; it contains
several subdirectories and files.
The total number of files in one directory is 12,24,446.
rm -rf doesn't work; it prompts for every file.
I want to delete without prompting and... (6 Replies)
Discussion started by: getdpg
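When rm -rf prompts despite -f, the usual culprit is a shell alias such as rm='rm -i'; bypassing the alias, or handing the deletion to find, avoids the per-file prompt. A minimal sketch on a small throwaway tree (the directory name bigdir is hypothetical, standing in for the real one):

```shell
# Build a small stand-in for the real directory tree
mkdir -p bigdir/sub1 bigdir/sub2
touch bigdir/sub1/a.txt bigdir/sub2/b.txt

# A leading backslash bypasses any rm alias (e.g. rm='rm -i'),
# so -f can suppress all prompting as intended
\rm -rf bigdir

# Alternative for very large trees: delete depth-first with find
# find bigdir -depth -delete
```

An absolute path like /bin/rm -rf also sidesteps aliases; either way, double-check the target directory before running it.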
2. UNIX for Dummies Questions & Answers
Hello,
I can remove duplicate entries in a file by:
sort File1 | uniq > File2
but how can I remove duplicates without sorting the file?
I tried cat File1 | uniq > File2 but it doesn't work
thanks (4 Replies)
Discussion started by: orahi001
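uniq only collapses adjacent duplicates, which is why it needs sorted input. A common awk one-liner keeps the first occurrence of every line while preserving the original order (a sketch reusing the File1/File2 names from the question):

```shell
# Sample input with non-adjacent duplicates
printf 'b\na\nb\nc\na\n' > File1

# seen[$0]++ is 0 (false) the first time a line appears,
# so only first occurrences are printed; order is preserved
awk '!seen[$0]++' File1 > File2

cat File2    # b, a, c -- duplicates dropped, order kept
```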
3. Shell Programming and Scripting
Hello Experts,
I have two files named old and new. Below are my example files. I need to compare them and print the records that exist only in my new file. I tried the awk script below; it works perfectly well when the records match exactly, but the issue I have is that my old file has got extra... (4 Replies)
Discussion started by: forumthreads
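For exact-match records, awk can load the old file into an array and print only the lines of the new file that never appear in it (a sketch with made-up contents for old and new; if the records only partially match, the comparison would need to be keyed on specific fields instead of the whole line):

```shell
# Hypothetical stand-ins for the poster's files
printf 'alpha\nbeta\n' > old
printf 'alpha\nbeta\ngamma\n' > new

# NR==FNR is true only while reading the first file: store its lines.
# For the second file, print the lines not present in the array.
awk 'NR==FNR {a[$0]; next} !($0 in a)' old new    # gamma
```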
4. Shell Programming and Scripting
Hi,
I need to remove duplicates from a file. The file looks like this:
0003 10101 20100120 abcdefghi
0003 10101 20100121 abcdefghi
0003 10101 20100122 abcdefghi
0003 10102 20100120 abcdefghi
0003 10103 20100120 abcdefghi
0003 10103 20100121 abcdefghi
Here, if the first column and... (6 Replies)
Discussion started by: gpaulose
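Assuming, as the truncated question suggests, that two records count as duplicates when their first two columns match, awk can keep just the first record per ($1,$2) pair:

```shell
# The sample records from the question
cat > infile <<'EOF'
0003 10101 20100120 abcdefghi
0003 10101 20100121 abcdefghi
0003 10101 20100122 abcdefghi
0003 10102 20100120 abcdefghi
0003 10103 20100120 abcdefghi
0003 10103 20100121 abcdefghi
EOF

# Key on the first two whitespace-separated columns;
# only the first record for each key is printed
awk '!seen[$1,$2]++' infile
```

On this sample it leaves three records, one for each of 10101, 10102, and 10103.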
5. Shell Programming and Scripting
Hi
I need a script that removes the duplicate records and writes them to a new file.
For example, I have a file named test.txt, and it looks like:
abcd.23
abcd.24
abcd.25
qwer.25
qwer.26
qwer.98
I want to pick only $1 and compare it with the next record; the output should be
abcd.23... (6 Replies)
Discussion started by: antointoronto
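If "pick only $1" means the part before the dot, and the first record for each prefix should be kept (an assumption, since the question is truncated), splitting on the dot with -F. does it:

```shell
# The sample file from the question
cat > test.txt <<'EOF'
abcd.23
abcd.24
abcd.25
qwer.25
qwer.26
qwer.98
EOF

# -F. splits each record on ".", so $1 is the prefix;
# the first record per prefix is written to the new file
awk -F. '!seen[$1]++' test.txt > result.txt
cat result.txt    # abcd.23 and qwer.25
```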
6. Shell Programming and Scripting
Hi,
I'm using the command below to sort and remove duplicates in a file. But I need to apply it to the same file instead of redirecting the output to another one.
Thanks (6 Replies)
Discussion started by: dvah
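sort has a flag for exactly this: -o names the output file, and it is safe to name the input file itself, because sort reads all its input before opening the output. (Redirecting with > onto the input would truncate it before sort could read it.) A sketch:

```shell
# Sample file with duplicates
printf 'b\na\nb\n' > data.txt

# Sort, drop duplicates, and write the result back onto the same file
sort -u data.txt -o data.txt

cat data.txt    # a, b
```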
7. Shell Programming and Scripting
OK,
I have two file lists.
The first is formatted like this:
/path/to/the/actual/file/location/filename.jpg
and has up to a million records.
The second list shows filename.jpg where there is more than one instance,
and has maybe up to 65,000 records.
I want to copy files... (4 Replies)
Discussion started by: Bashingaway
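One way to sketch this (the list names, file names, and the dest/ directory below are all hypothetical): read each full path and copy it only when its basename appears in the duplicates list. For lists near a million records an awk join on basenames would be much faster than one grep per line, but the loop shows the idea:

```shell
# Hypothetical stand-ins for the two lists and a destination
mkdir -p tree/a tree/b dest
touch tree/a/file1.jpg tree/b/file2.jpg
printf 'tree/a/file1.jpg\ntree/b/file2.jpg\n' > fullpaths.txt
printf 'file1.jpg\n' > dupes.txt

# Copy a file only if its basename is an exact line in dupes.txt
# (-x: whole-line match, -F: fixed string rather than a regex)
while IFS= read -r path; do
    if grep -qxF "$(basename "$path")" dupes.txt; then
        cp "$path" dest/
    fi
done < fullpaths.txt
```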
8. Shell Programming and Scripting
I need to use a bash script to remove duplicate files from a download list, but I cannot use uniq because the URLs are different.
I need to go from this:
http://***/fae78fe/file1.wmv
http://***/39du7si/file1.wmv
http://***/d8el2hd/file2.wmv
http://***/h893js3/file2.wmv
to this:
... (2 Replies)
Discussion started by: locoroco
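Since only the directory part varies and the filename repeats, splitting each URL on "/" and keeping the first URL per trailing filename works. The posted hosts are masked (***), so this sketch substitutes a hypothetical host.example placeholder:

```shell
# Hypothetical URLs mirroring the question's pattern
cat > urls.txt <<'EOF'
http://host.example/fae78fe/file1.wmv
http://host.example/39du7si/file1.wmv
http://host.example/d8el2hd/file2.wmv
http://host.example/h893js3/file2.wmv
EOF

# $NF is the last "/"-separated field, i.e. the filename;
# only the first URL for each filename survives
awk -F/ '!seen[$NF]++' urls.txt
```

That leaves one URL per file (the first seen for file1.wmv and for file2.wmv).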
9. Shell Programming and Scripting
I have a file with the following format:
fields separated by "|"
title1|something class|long...content1|keys
title2|somhing class|log...content1|kes
title1|sothing class|lon...content1|kes
title3|shing cls|log...content1|ks
I want to remove all duplicates with the same "title" field (the... (3 Replies)
Discussion started by: dtdt
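With "|" as the field separator, the title is $1, and the same keep-first-occurrence idiom applies (assuming the first record per title is the one to keep):

```shell
# The sample records from the question
cat > posts.txt <<'EOF'
title1|something class|long...content1|keys
title2|somhing class|log...content1|kes
title1|sothing class|lon...content1|kes
title3|shing cls|log...content1|ks
EOF

# Keep only the first record for each "|"-separated title field
awk -F'|' '!seen[$1]++' posts.txt
```

The second title1 record is dropped, leaving one record each for title1, title2, and title3.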
10. Shell Programming and Scripting
Hi, I have the file structure below.
200,1245,E1,1,E1,,7611068,KWH,30, ,,,,,,,,
200,1245,E1,1,E1,,7611070,KWH,30, ,,,,,,,,
300,20140223,0.001,0.001,0.001,0.001,0.001
300,20140224,0.001,0.001,0.001,0.001,0.001
300,20140225,0.001,0.001,0.001,0.001,0.001
300,20140226,0.001,0.001,0.001,0.001,0.001... (1 Reply)
Discussion started by: tejashavele