UNIX for Dummies Questions & Answers: Need tune my command occupying 90% CPU
Post 302350160 by thegeek on Thursday 3rd of September 2009, 02:23:03 AM
One thing struck me while looking at this: once you have found the text you are looking for, why do you need to continue checking?

That is, if OOM is already enabled at the 1st line itself, why continue the check for the remaining 999 lines?

So add an if condition: do the pattern match only when OOM is not enabled; otherwise skip it. Something like the sketch below.
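A minimal sketch of that idea (the log file app.log, the pattern 'OOM.*enabled' and the patterns.txt list are placeholders here, since the original script was not posted in this reply):

Code:
#!/bin/sh
# Hypothetical early-exit check: stop scanning as soon as the "OOM enabled"
# line is found instead of testing every remaining line of the log.
if grep -q 'OOM.*enabled' app.log; then
    # -q stops grep at the first match, so the rest of the file is not read
    echo "OOM already enabled - no further pattern matching needed"
else
    # only fall back to the full pattern scan when OOM is not enabled
    grep -f patterns.txt app.log > matches.log
fi

With -q, grep can return as soon as it finds a match (GNU grep exits immediately), so the remaining lines are never scanned at all.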

This is just a simple thing that struck me. Expecting better solutions from others here...
 

8 More Discussions You Might Find Interesting

1. Programming

Application occupying CPU resources

Dear all, I have a Pro*C application running in the UNIX environment. This Pro*C program is actually triggered by a Java application from a Sun workstation. Recently, we released a new Pro*C application and noticed that the application occupies CPU resources even though we check that the... (1 Reply)
Discussion started by: ghho

2. UNIX for Dummies Questions & Answers

Tune my logic of script

I have a big log file which contains the netstat output from my application server to a particular DB server. My aim is to plot a daily graph from this. Please find the sample log file below. @ - ........................................................... @ - Total number of connections to the ... (3 Replies)
Discussion started by: senthilkumar_ak

3. Shell Programming and Scripting

Tune my query

I have a requirement to separate only some numbers from the input file and produce them in a particular format. The input is (I have taken a sample; the actual file contains more than 50000 rows, around 840 MB in size): $ cat temp.txt 001 08 002 08 003 06 004 11 005 11 006 08 007 08 008 92* 009 92 010... (1 Reply)
Discussion started by: senthil.ak

4. Solaris

Please tune my script for Solaris

I have a very big log file, around 2-3 GB, which contains 24 hours of log data. My job is to extract only the 05:00-to-17:00 data and count the patterns in it. I wrote a script on Linux and we're using that. sed -n "/2009 05:/,/2009 17:/p" trace.log | grep -f patterns.txt > temp.log while read string ;do... (5 Replies)
Discussion started by: senthil.ak

5. Shell Programming and Scripting

Command to find the Memory and CPU utilization using 'top' command

Hi all, I found that the top command can be used to find memory and CPU utilization. But I want to know how to find the memory and CPU utilization for a particular user using the top command. Thanks in advance. Thanks, Ananthi.U (2 Replies)
Discussion started by: ananthi_ku

6. Programming

SQL : Fine tune Insert by query

I would like to know how I can fine-tune the following query, since its cost is too high: insert into temp temp_1 select a,b,c,d from xxxx .. The database used is IDS. (1 Reply)
Discussion started by: expert

7. Shell Programming and Scripting

Tune my script

Hi! My script reads data out of 144 files per day - a new file with data every ten minutes. data-file WR030B 306.71 0 WR050B 315.13 0 WR120B 308.34 0 WV030B 3.52 0 WV050B 5.06 0 WV120B 6.65 0 TLUFT02B 8.60... (3 Replies)
Discussion started by: IMPe

8. Red Hat

Files not getting deleted with rm & occupying space in filesystem

Hello, the OS version is Red Hat Enterprise Linux Server release 6.5 (Santiago). In one of the filesystems, some old files left over after a clone are not getting removed even with 'rm'. # ls -ltr | grep meagpd_62.dbf -rw-rw---- 1 oracle oinstall 34358697984 Sep 1 08:46 meagpd_62.dbf # rm... (7 Replies)
Discussion started by: saharookiedba
bup-margin(1)                         General Commands Manual                         bup-margin(1)

NAME
       bup-margin - figure out your deduplication safety margin

SYNOPSIS
       bup margin [options...]

DESCRIPTION
       bup margin iterates through all objects in your bup repository, calculating the largest
       number of prefix bits shared between any two entries. This number, n, identifies the
       longest subset of SHA-1 you could use and still encounter a collision between your
       object ids.

       For example, one system that was tested had a collection of 11 million objects (70 GB),
       and bup margin returned 45. That means a 46-bit hash would be sufficient to avoid all
       collisions among that set of objects; each object in that repository could be uniquely
       identified by its first 46 bits.

       The number of bits needed seems to increase by about 1 or 2 for every doubling of the
       number of objects. Since SHA-1 hashes have 160 bits, that leaves 115 bits of margin.
       Of course, because SHA-1 hashes are essentially random, it's theoretically possible to
       use many more bits with far fewer objects.

       If you're paranoid about the possibility of SHA-1 collisions, you can monitor your
       repository by running bup margin occasionally to see if you're getting dangerously
       close to 160 bits.

OPTIONS
       --predict
              Guess the offset into each index file where a particular object will appear, and
              report the maximum deviation of the correct answer from the guess. This is
              potentially useful for tuning an interpolation search algorithm.

       --ignore-midx
              Don't use .midx files, use only .idx files. This is only really useful when used
              with --predict.

EXAMPLE
       $ bup margin
       Reading indexes: 100.00% (1612581/1612581), done.
       40
       40 matching prefix bits
       1.94 bits per doubling
       120 bits (61.86 doublings) remaining
       4.19338e+18 times larger is possible

       Everyone on earth could have 625878182 data sets like yours, all in one repository, and
       we would expect 1 object collision.

       $ bup margin --predict
       PackIdxList: using 1 index.
       Reading indexes: 100.00% (1612581/1612581), done.
       915 of 1612581 (0.057%)

SEE ALSO
       bup-midx(1), bup-save(1)

BUP
       Part of the bup(1) suite.

AUTHORS
       Avery Pennarun <apenwarr@gmail.com>.
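For anyone following the numbers in the EXAMPLE section, here is a rough sketch of how the margin figures relate to each other (the formulas are reconstructed from the reported output, not taken from bup's source, so treat them as an approximation):

Code:
# Reproduce (approximately) the margin figures printed for 1612581 objects
# sharing at most 40 prefix bits, assuming bits-per-doubling = bits / log2(objects).
awk 'BEGIN {
    objects    = 1612581                               # objects in the example repo
    matching   = 40                                    # longest shared prefix bits
    total_bits = 160                                   # SHA-1 hash length
    per_double = matching / (log(objects) / log(2))    # ~1.94 bits per doubling
    remaining  = total_bits - matching                 # 120 bits of margin left
    doublings  = remaining / per_double                # ~61.86 doublings
    printf "%.2f bits per doubling\n", per_double
    printf "%d bits (%.2f doublings) remaining\n", remaining, doublings
    printf "%.5e times larger is possible\n", 2 ^ doublings
}'

The 115 bits of margin mentioned in the DESCRIPTION comes from the other tested repository, where bup margin returned 45 matching bits out of SHA-1's 160.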