Hi All,
I need a Unix script/command to delete the milliseconds from the timestamps so that they become compatible with Excel when finally displayed.
I have the following data in two columns, which was obtained with a Unix (awk-based) script on some log files. Finally I want to... (1 Reply)
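The exact column layout isn't shown, but assuming timestamps carry a trailing fractional part like `10:23:45.123`, a minimal sketch with sed (the sample line below is invented; adjust the pattern to the real format):

```shell
# Strip a trailing ".NNN" fraction from the timestamp on each line
# (sample input is made up for illustration):
echo "2019-04-01 10:23:45.123  OK" | sed 's/\.[0-9]\{1,\}//'
# prints: 2019-04-01 10:23:45  OK
```

The same substitution works on a whole file with `sed 's/\.[0-9]\{1,\}//' input.txt > output.txt` (file names hypothetical).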
Hello All,
I'm generating timestamps (file creation timestamps) for all the files in a directory. I need to compare all the timestamps.
For example, if I have 4 files and their timestamps are 20091125114556,
20091125114556, 20091125114556, 20091125114556 respectively,
I need to differentiate... (1 Reply)
Hello All,
I'm generating timestamps (file creation timestamps) for all the files in a directory. I need to compare all the timestamps.
For example, if I have 4 files and their timestamps are 20091125114556,
20091125114556, 20091125114556, 20091125114556 respectively,
I need to differentiate... (9 Replies)
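Since 14-digit YYYYMMDDhhmmss stamps order the same way numerically as chronologically, plain integer comparison is enough. A sketch using the values from the post:

```shell
# YYYYMMDDhhmmss fits in a 64-bit integer, so -eq/-lt/-gt compare
# timestamps chronologically:
t1=20091125114556
t2=20091125114556
if [ "$t1" -eq "$t2" ]; then
    echo "same timestamp"
elif [ "$t1" -lt "$t2" ]; then
    echo "t1 is older"
else
    echo "t1 is newer"
fi
```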
I know there have been a million questions about calculating timestamps, and with enough googling I think I'm almost there (I'm going to use the convert-the-times-to-seconds-and-subtract solution). My problem is that I'm not sure how to format my log file to get the info I need. Below... (0 Replies)
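The convert-to-seconds-and-subtract approach looks like this with GNU date (the timestamps below are invented for illustration):

```shell
# GNU date -d parses a timestamp string; +%s prints it as epoch seconds.
start=$(date -d "2011-01-05 08:00:00" +%s)
end=$(date -d "2011-01-05 09:30:00" +%s)
echo "$(( end - start )) seconds elapsed"   # 5400 seconds elapsed
```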
Hello All,
I have a problem calculating the time difference between start and end timings!
The timings are given in 24-hour format.
Start Date : 08/05/10 12:55
End Date : 08/09/10 06:50
The above values are in mm/dd/yy hh:mm format.
Now the thing is, the 7th (08/07/10) and... (16 Replies)
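With GNU date, the posted values parse directly; a sketch that breaks the difference into days, hours, and minutes (whatever the truncated post asks about the 7th and 8th — presumably excluding certain days — is not handled here):

```shell
start=$(date -d "08/05/10 12:55" +%s)   # mm/dd/yy hh:mm, per the post
end=$(date -d "08/09/10 06:50" +%s)
diff=$(( end - start ))
printf '%d days %d hours %d minutes\n' \
    $(( diff / 86400 )) $(( diff % 86400 / 3600 )) $(( diff % 3600 / 60 ))
# 3 days 17 hours 55 minutes
```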
I'm looking for a way to have the "date" command output the date in a specific format.
I'm not familiar with the different ways to use the date command at all. I read up on it, but I don't get how to manipulate it.
I know that I can get the date command to give me a format like:
2012-10-13... (6 Replies)
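date's `+` argument takes strftime-style format sequences, so the output layout is whatever you compose (the sample outputs in the comments are illustrative):

```shell
date '+%Y-%m-%d'            # e.g. 2012-10-13
date '+%Y-%m-%d %H:%M:%S'   # e.g. 2012-10-13 14:05:09
date '+%d/%m/%Y %H:%M'      # e.g. 13/10/2012 14:05
```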
Hi Gents.
Please can you help me solve a problem?
I have a long list of files whose timestamps I need to change.
-r--r--r-- 1 geo2 geovect 47096216 Feb 8 10:40 00000009.segd
-r--r--r-- 1 geo2 geovect 47096216 Feb 8 10:40 00000010.segd
-r--r--r-- 1 geo2 geovect 47096216 Feb ... (11 Replies)
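touch can rewrite modification times directly; a sketch using the listed file names (the year is a guess, since ls doesn't show it):

```shell
# touch -t [[CC]YY]MMDDhhmm[.SS] sets an explicit timestamp:
touch -t 201402081040 00000009.segd
# touch -r copies another file's timestamp -- handy for a long list:
touch -r 00000009.segd 00000010.segd
```

For a whole directory, a loop such as `for f in *.segd; do touch -t 201402081040 "$f"; done` applies the same stamp to every file.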
Hi,
I have been working on an error-log script, where errors are pulled from a server.
I need to pull the error-log data between two dates and times, for example:
22/12/2014 20:00:00
22/12/2014 22:00:00
i.e., whatever errors came during this window.
Now the question is the record... (6 Replies)
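dd/mm/yyyy doesn't compare correctly as text, so one approach is to rebuild each line's date into yyyymmddhhmmss and compare that numerically. The layout below (date then time as the first two fields, file name `error.log`) is an assumption:

```shell
awk -v from=20141222200000 -v to=20141222220000 '
{
    split($1, d, "/")          # d[1]=dd  d[2]=mm  d[3]=yyyy
    split($2, t, ":")          # t[1]=hh  t[2]=mm  t[3]=ss
    key = d[3] d[2] d[1] t[1] t[2] t[3]
    # key+0 forces a numeric comparison against from/to:
    if (key+0 >= from && key+0 <= to) print
}' error.log
```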
Hi, please help me collect the entire log between two timestamps.
for example,
I am looking for a script to collect the entire log between "2015-03-27 15:59" and "2015-03-27 16:15" in the sample log file below.
OS : RHEL 6.3
Date/Time : 24-hour format; the time is printed in each log... (12 Replies)
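"YYYY-MM-DD HH:MM" already sorts chronologically as plain text, so awk can compare the first two fields directly. The log layout and file name `app.log` are assumptions:

```shell
awk -v from="2015-03-27 15:59" -v to="2015-03-27 16:15" '
{ ts = $1 " " $2 }        # timestamp assumed in the first two fields
ts >= from && ts <= to    # string comparison; prints lines in the window
' app.log
```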
I've installed cygwin_openssh on Windows 2012 R2 and it's working great. My issue is that when a file is uploaded from a different timezone, it doesn't pick up the SFTP server's time. Is there a way to fix that?
i.e. when someone in PST uploads a file to this server in EST,... (0 Replies)
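One thing worth checking first: file mtimes are stored as timezone-independent epoch seconds, and only their *display* depends on the TZ of the viewing process. A quick demonstration (GNU date assumed; the mktemp file stands in for an uploaded file):

```shell
f=$(mktemp)
# The stored mtime (epoch seconds) is identical in any timezone:
TZ=America/New_York    date -r "$f" +%s
TZ=America/Los_Angeles date -r "$f" +%s
# Only the rendered wall-clock time differs:
TZ=America/New_York    date -r "$f" '+%H:%M %Z'
TZ=America/Los_Angeles date -r "$f" '+%H:%M %Z'
rm -f "$f"
```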
Discussion started by: MikeAdkins
LEARN ABOUT DEBIAN
bup-margin
bup-margin(1) General Commands Manual bup-margin(1)
NAME
bup-margin - figure out your deduplication safety margin
SYNOPSIS
bup margin [options...]
DESCRIPTION
bup margin iterates through all objects in your bup repository, calculating the largest number of prefix bits shared between any two
entries. This number, n, identifies the longest subset of SHA-1 you could use and still encounter a collision between your object ids.
For example, one system that was tested had a collection of 11 million objects (70 GB), and bup margin returned 45. That means a 46-bit
hash would be sufficient to avoid all collisions among that set of objects; each object in that repository could be uniquely identified by
its first 46 bits.
The number of bits needed seems to increase by about 1 or 2 for every doubling of the number of objects. Since SHA-1 hashes have 160 bits,
that leaves 115 bits of margin. Of course, because SHA-1 hashes are essentially random, it's theoretically possible to use many more bits
with far fewer objects.
If you're paranoid about the possibility of SHA-1 collisions, you can monitor your repository by running bup margin occasionally to see if
you're getting dangerously close to 160 bits.
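The growth rate follows from the birthday bound: among n objects there are about n²/2 pairs, so the closest pair is expected to share roughly 2·log₂(n) prefix bits, which grows by about 2 bits per doubling. A rough check with awk, approximating (not reproducing) bup's exact calculation:

```shell
# 11 million objects, as in the example above:
awk -v n=11000000 'BEGIN { printf "%.0f shared prefix bits expected\n", 2 * log(n) / log(2) }'
# prints "47 shared prefix bits expected" -- close to the measured 45
```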
OPTIONS
--predict
Guess the offset into each index file where a particular object will appear, and report the maximum deviation of the correct answer
from the guess. This is potentially useful for tuning an interpolation search algorithm.
--ignore-midx
don't use .midx files, use only .idx files. This is only really useful when used with --predict.
EXAMPLE
$ bup margin
Reading indexes: 100.00% (1612581/1612581), done.
40
40 matching prefix bits
1.94 bits per doubling
120 bits (61.86 doublings) remaining
4.19338e+18 times larger is possible
Everyone on earth could have 625878182 data sets
like yours, all in one repository, and we would
expect 1 object collision.
$ bup margin --predict
PackIdxList: using 1 index.
Reading indexes: 100.00% (1612581/1612581), done.
915 of 1612581 (0.057%)
SEE ALSO
bup-midx(1), bup-save(1)
Part of the bup(1) suite.
AUTHORS
Avery Pennarun <apenwarr@gmail.com>.