Full Discussion: Need help with gawk
Operating Systems > SCO > Need help with gawk
Post 302916987 by Don Cragun on Friday 12th of September 2014 10:21:12 PM
This thread duplicates the discussion going on in the thread: Gawk Question

This thread is closed.
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

gawk HELP

I have to compare records in two files. It can be done using gawk/awk, but I am unable to do it. Please help me. File1: ABAAAAAB BC asa sa ABAAABAA BC bsa sm ABBBBAAA BC bxz sa ABAAABAB BC csa sa ABAAAAAA BC dsa sm ABBBBAAB BC dxz sa File 2: ABAAAAAB BC aas ba ABAAAAAB BC asa sa... (6 Replies)
Discussion started by: sandeep_hi
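For a two-file comparison like the one described in this entry, the usual awk idiom is to load the keys of the first file into an array, then test each record of the second file against it. A minimal sketch, assuming the first whitespace-separated field is the record key (the original post is truncated, so the exact comparison wanted is an assumption):

    gawk 'NR == FNR { seen[$1] = 1; next }   # first file: remember each key (field 1)
          $1 in seen                         # second file: print lines whose key appeared in File1
    ' File1 File2

Swapping $1 in seen for !($1 in seen) would instead print the File2 records that have no counterpart in File1.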

2. Shell Programming and Scripting

unable to use GAWK

The following message appears when using gawk in my ksh: "gawk: not found". Any idea how to fix this? (1 Reply)
Discussion started by: sinpeak
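The message in this entry usually just means the gawk binary is not installed or not on PATH. A hedged fallback sketch for a script, assuming a POSIX-ish shell (on older ksh88, whence -v can replace command -v; the variable name AWK and input.txt are illustrative assumptions):

    # prefer gawk when installed, otherwise fall back to the system awk (or nawk)
    if command -v gawk >/dev/null 2>&1; then
        AWK=gawk
    elif command -v nawk >/dev/null 2>&1; then
        AWK=nawk
    else
        AWK=awk
    fi
    "$AWK" '{ print NR, $0 }' input.txt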

3. Shell Programming and Scripting

gawk and bash

Hi. I'm having trouble using gawk within a bash script and I can't figure out why. I have a command that takes in a data file with two columns, the first one numbers and the second words. My code takes each line and prints the word the corresponding number of times. The code works from the... (2 Replies)
Discussion started by: cdislater
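For the repeat-a-word-N-times task in this entry, a minimal gawk sketch (data.txt is a placeholder name). A common reason such a one-liner works interactively but fails inside a bash script is that the script wraps the awk program in double quotes, letting the shell expand $1 and $2 before awk sees them:

    # column 1 = count, column 2 = word: print the word that many times
    gawk '{ for (i = 1; i <= $1; i++) print $2 }' data.txt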

4. Shell Programming and Scripting

gawk to perl

Hi all. I'm looking for a Perl equivalent of this command string; I need to embed it in an existing Perl script: cat file1 | gawk -F"|" '{print $1","$2,",",$3,",",$11 >> "new-file"}' Thank you (4 Replies)
Discussion started by: Ex-Capsa
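A rough shell-level equivalent of the pipe-delimited extraction in this entry. It assumes the intended output really is the four fields joined by commas (the original mixes commas and spaces) and that appending through shell redirection is acceptable; note also that gawk or perl can read file1 directly, so the cat is unnecessary:

    # print fields 1, 2, 3 and 11 of the pipe-delimited file, comma-separated, appended to new-file
    perl -F'\|' -lane 'print join(",", @F[0, 1, 2, 10])' file1 >> new-file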

5. Shell Programming and Scripting

Gawk Help

Hi, I am using a script to print the portion of the file containing a particular string, but it gives the error "... for reading (No such file or directory)". I am using Cygwin as a Unix emulator. cat TT35*.log | gawk -v search="12345678" ' /mSOriginating /,/disconnectingParty/ { ... (1 Reply)
Discussion started by: vanand420
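The "... for reading (No such file or directory)" message in this entry is typically gawk or cat being handed a filename it cannot open, for example a glob that matched nothing. For printing only the mSOriginating .. disconnectingParty blocks that contain the search string, a hedged sketch (the exact layout of the logs is an assumption):

    gawk -v search="12345678" '
    /mSOriginating/, /disconnectingParty/ {
        buf = buf $0 ORS                   # collect the current block
        if (index($0, search)) hit = 1     # note whether the search string occurs in it
        if (/disconnectingParty/) {        # end of block: print it only if it contained the string
            if (hit) printf "%s", buf
            buf = ""; hit = 0
        }
    }' TT35*.log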

6. Shell Programming and Scripting

Help with gawk command

Hi, I have a situation. In a particular file, I have to match a particular pattern in the 9th column, and I want a second file made by excluding those lines. I wrote code like this: gawk '$9~/^(SPI|OTC|SAX)$/' /home/ceh1/ceh_prod/plx_"$mydate"_old.tsv >>... (1 Reply)
Discussion started by: pranabrana
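The command quoted in this entry selects the matching lines; to exclude them, negate the match with !~. A minimal sketch (the output filename is a placeholder, and -F'\t' would only be needed if the .tsv is tab-delimited rather than whitespace-delimited):

    # keep every line whose 9th column is NOT SPI, OTC or SAX
    gawk '$9 !~ /^(SPI|OTC|SAX)$/' /home/ceh1/ceh_prod/plx_"$mydate"_old.tsv > /home/ceh1/ceh_prod/plx_"$mydate"_filtered.tsv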

7. Shell Programming and Scripting

Gawk help (windows)

Someone help please. I tried to do it with findstr but I couldn't, so now I'm trying to output the following numbers from this text file with gawk (what I need is in bold down below): Analyzing pool.ntp.org (1 of 1)... delay  offset from local clock Stratum: 2 Warning: Reverse name... (18 Replies)
Discussion started by: harris_t

8. Shell Programming and Scripting

Doubt with gawk

Hi All, I have a doubt with gawk. I have a shell script "cleanup" which calls a gawk script "cleanawk" in it. We have two Unix servers, epsun532 and wpsun712, so I tested the script in both environments. On epsun532, while calling the gawk script, I just mentioned something like this ... (1 Reply)
Discussion started by: Diddy

9. SCO

Gawk Question

I am trying to use gawk to search a file and put the second field of the matching line into a string. gawk -F: '$1~/CXFR/ {print $2}' go.dat go.dat: ==================== HOME :/ CTMP :/tmp CUTL :/u/rdiiulio/bin CWRK :/u/work CXFR :/u/xfer ... (4 Replies)
Discussion started by: trolley
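This is the thread the closed post above points to. The quoted gawk command already prints the wanted field; to capture it "into a string", the usual approach is shell command substitution. A minimal sketch (the variable name xfer_dir is an illustrative assumption; on very old Bourne shells, backticks replace $(...)):

    # capture the second colon-delimited field of the CXFR line into a shell variable
    xfer_dir=$(gawk -F: '$1 ~ /CXFR/ {print $2}' go.dat)
    echo "$xfer_dir"    # expected: /u/xfer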

10. Shell Programming and Scripting

Gawk and regexp

Hello, this is a problem I've worked on for a while and can't figure out. There is a file.txt: ..some stuff.. ] ] ..some stuff.. The awk program is trying to extract the year portion of the birth and death dates ("98: and "2nd C.") using the technique below: #!/bin/awk @include... (5 Replies)
Discussion started by: Mid Ocean
bup-margin(1)                    General Commands Manual                    bup-margin(1)

NAME
       bup-margin - figure out your deduplication safety margin

SYNOPSIS
       bup margin [options...]

DESCRIPTION
       bup margin iterates through all objects in your bup repository, calculating the
       largest number of prefix bits shared between any two entries. This number, n,
       identifies the longest subset of SHA-1 you could use and still encounter a
       collision between your object ids.

       For example, one system that was tested had a collection of 11 million objects
       (70 GB), and bup margin returned 45. That means a 46-bit hash would be
       sufficient to avoid all collisions among that set of objects; each object in
       that repository could be uniquely identified by its first 46 bits.

       The number of bits needed seems to increase by about 1 or 2 for every doubling
       of the number of objects. Since SHA-1 hashes have 160 bits, that leaves 115
       bits of margin. Of course, because SHA-1 hashes are essentially random, it's
       theoretically possible to use many more bits with far fewer objects.

       If you're paranoid about the possibility of SHA-1 collisions, you can monitor
       your repository by running bup margin occasionally to see if you're getting
       dangerously close to 160 bits.

OPTIONS
       --predict
              Guess the offset into each index file where a particular object will
              appear, and report the maximum deviation of the correct answer from the
              guess. This is potentially useful for tuning an interpolation search
              algorithm.

       --ignore-midx
              Don't use .midx files, use only .idx files. This is only really useful
              when used with --predict.

EXAMPLE
       $ bup margin
       Reading indexes: 100.00% (1612581/1612581), done.
       40
       40 matching prefix bits
       1.94 bits per doubling
       120 bits (61.86 doublings) remaining
       4.19338e+18 times larger is possible

       Everyone on earth could have 625878182 data sets like yours, all in one
       repository, and we would expect 1 object collision.

       $ bup margin --predict
       PackIdxList: using 1 index.
       Reading indexes: 100.00% (1612581/1612581), done.
       915 of 1612581 (0.057%)

SEE ALSO
       bup-midx(1), bup-save(1)

BUP
       Part of the bup(1) suite.

AUTHORS
       Avery Pennarun <apenwarr@gmail.com>.