UNIX for Dummies Questions & Answers: Getting the line number of the nth occurrence
Post 302378853 by dennis.jacob on Wednesday 9th of December 2009 03:21:19 AM
Try:

Code:
awk '/xxx/ && ++x == 2 { print NR }' file
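For an arbitrary nth match rather than the second, the count can be passed in with -v. A minimal sketch along the same lines, assuming /xxx/ is still a placeholder pattern and that reading should stop once the line number is printed:

Code:
# n and the pattern /xxx/ are placeholders; exit stops reading after the nth match
awk -v n=5 '/xxx/ && ++x == n { print NR; exit }' file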

 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Grepping nth line number

How do you grep every nth line from a file? (2 Replies) See the sketch after this list.
Discussion started by: shabs1985

2. Shell Programming and Scripting

Print lines with specific character at nth position in a file

I need to print lines with the character S at the nth position in a file... can someone please help me with an appropriate awk command for this? (2 Replies) See the sketch after this list (it covers discussion 3 below as well).
Discussion started by: manaswinig

3. Shell Programming and Scripting

Print lines with specific character at nth position in a file

I need to print lines with the character S at the nth position in a file... can someone please help me with an appropriate awk command for this? (1 Reply)
Discussion started by: manaswinig

4. Shell Programming and Scripting

find string nth occurrence in file and print line number

Hi, I have a requirement to find the nth occurrence in a file and capture data from within lines (between lines). Data in file: <QUOTE> <SESSION> <ATTRIBUTE NAME='Parameter Filename' VALUE='file1.parm'/> <ATTRIBUTE NAME='Service Name' VALUE='None'/> </SESSION> <SESSION> <ATTRIBUTE... (6 Replies)
Discussion started by: tmalik79

5. Shell Programming and Scripting

How to output all lines following Nth occurrence of string

Greetings, experts. I searched the forums (perhaps not hard enough?) and am looking for a method to capture all output from a log file following the nth occurrence of a known string. Background: using bash, I want to monitor my Oracle DB alert log file. The script will count the total # of... (2 Replies) See the sketch after this list.
Discussion started by: cjtravis

6. Shell Programming and Scripting

Extracting lines after nth LINE from an output

Hi all, here is the problem I have been breaking my head over for the past three days. I have parted command output as follows: Model: ATA WDC WD5000AAKS-0 (scsi) Disk /dev/sdb: 500GB Sector size (logical/physical): 512B/512B Partition Table: msdos Number Start End Size Type ... (3 Replies) See the sketch after this list.
Discussion started by: selvarajvs

7. Emergency UNIX and Linux Support

Replace nth position character of all the lines in file

I want to replace the 150th character of every line in a file using sed or awk... I searched the forums but didn't find an exact answer. (9 Replies) See the sketch after this list.
Discussion started by: greenworld123

8. Shell Programming and Scripting

Using paste command every nth number of file

Hi, I want to use the paste command in a loop that runs on every 6 files. My sample files are like the ones below: 20010101.txt 20010106.txt 20010111.txt 20010116.txt 20010121.txt 20010126.txt 20010131.txt 20010205.txt 20010210.txt 20010215.txt 20010220.txt 20010225.txt 20010302.txt... (4 Replies) See the sketch after this list.
Discussion started by: ida1215

9. Shell Programming and Scripting

Pairing the nth elements on multiple lines iteratively

Hello, I'm trying to create a word translation section of a book. Each entry in the word list will come from a set of linguistically analyzed texts. Each sentence in the text has the following format. The first element in each line is the "name" of the line (i.e. "A","B","C","D"). The... (12 Replies)
Discussion started by: John Lyon

10. UNIX for Dummies Questions & Answers

Getting the lines with nth column non-null

Hi, I have a huge list of archives (.gz). Each archive is about 40MB. A file is generated every minute, so if I want to analyze the data for 1 hour I already get 60 files, for example. These are text files, ';' separated, each line having about 300 fields (columns). What I need to do is to... (11 Replies) See the sketch after this list.
Discussion started by: Nenad
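A minimal sketch for discussion 1 above (grepping every nth line); n=3 is only an example value:

Code:
# print every 3rd line (lines 3, 6, 9, ...); use NR % n == 1 to start from line 1 instead
awk -v n=3 'NR % n == 0' file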
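A sketch for discussions 2 and 3 (print lines with the character S at the nth position), using awk's 1-indexed substr; n=5 is a placeholder:

Code:
# print lines whose 5th character is S
awk -v n=5 'substr($0, n, 1) == "S"' file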
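A sketch for discussion 5 (everything after the nth occurrence of a known string); the pattern /ORA-/, the count n=2 and the filename alert.log are assumptions standing in for the real alert-log details:

Code:
# once the nth matching line is seen, set a flag and print every later line
# (drop "next" if the matching line itself should also be printed)
awk -v n=2 '/ORA-/ && ++c == n { f = 1; next } f' alert.log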
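A sketch for discussion 6 (everything after the nth line of some command output); the cut-off of 7 lines is a placeholder:

Code:
# skip the first 7 lines and print the rest; tail -n +8 file is equivalent
awk -v n=7 'NR > n' file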
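A sketch for discussion 7 (replace the 150th character of every line); the replacement character X is an assumption. With a sed that accepts large numeric flags, sed 's/./X/150' file does the same job:

Code:
# rebuild each line with character n swapped for c; lines shorter than n are printed unchanged
awk -v n=150 -v c="X" 'length($0) >= n { $0 = substr($0, 1, n - 1) c substr($0, n + 1) } { print }' file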
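A plain sh sketch for discussion 8 (pasting every 6 files side by side); the glob 2001*.txt and the group_N.txt output names are placeholders:

Code:
# take the (lexically sorted) matches six at a time; leftover files are simply skipped
set -- 2001*.txt
i=1
while [ "$#" -ge 6 ]; do
    paste "$1" "$2" "$3" "$4" "$5" "$6" > "group_$i.txt"
    shift 6
    i=$((i + 1))
done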
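A sketch for discussion 10 (keep only lines whose nth ';'-separated field is non-empty, reading straight from the .gz archives); the field number 42 and the glob are placeholders:

Code:
# decompress on the fly and keep lines where field 42 is not empty
zcat *.gz | awk -F';' -v n=42 '$n != ""'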
bup-margin(1)                    General Commands Manual                    bup-margin(1)

NAME
       bup-margin - figure out your deduplication safety margin

SYNOPSIS
       bup margin [options...]

DESCRIPTION
       bup margin iterates through all objects in your bup repository, calculating the
       largest number of prefix bits shared between any two entries. This number, n,
       identifies the longest subset of SHA-1 you could use and still encounter a
       collision between your object ids.

       For example, one system that was tested had a collection of 11 million objects
       (70 GB), and bup margin returned 45. That means a 46-bit hash would be sufficient
       to avoid all collisions among that set of objects; each object in that repository
       could be uniquely identified by its first 46 bits.

       The number of bits needed seems to increase by about 1 or 2 for every doubling of
       the number of objects. Since SHA-1 hashes have 160 bits, that leaves 115 bits of
       margin. Of course, because SHA-1 hashes are essentially random, it's
       theoretically possible to use many more bits with far fewer objects.

       If you're paranoid about the possibility of SHA-1 collisions, you can monitor
       your repository by running bup margin occasionally to see if you're getting
       dangerously close to 160 bits.

OPTIONS
       --predict
              Guess the offset into each index file where a particular object will
              appear, and report the maximum deviation of the correct answer from the
              guess. This is potentially useful for tuning an interpolation search
              algorithm.

       --ignore-midx
              Don't use .midx files, use only .idx files. This is only really useful
              when used with --predict.

EXAMPLE
       $ bup margin
       Reading indexes: 100.00% (1612581/1612581), done.
       40
       40 matching prefix bits
       1.94 bits per doubling
       120 bits (61.86 doublings) remaining
       4.19338e+18 times larger is possible

       Everyone on earth could have 625878182 data sets like yours, all in one
       repository, and we would expect 1 object collision.

       $ bup margin --predict
       PackIdxList: using 1 index.
       Reading indexes: 100.00% (1612581/1612581), done.
       915 of 1612581 (0.057%)

SEE ALSO
       bup-midx(1), bup-save(1)

BUP
       Part of the bup(1) suite.

AUTHORS
       Avery Pennarun <apenwarr@gmail.com>.

Bup unknown-                                                                bup-margin(1)
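A rough cross-check of the numbers above (a back-of-envelope estimate, not part of bup): by the birthday bound, a collision in the first b prefix bits becomes likely once a repository holds about 2^(b/2) objects, so the longest shared prefix among N random hashes is roughly 2*log2(N) bits, which is also why the margin shrinks by about 2 bits per doubling.

Code:
# estimate the shared prefix bits for N random hashes: about 2 * log2(N)
# for the 11-million-object example this gives roughly 47, close to the reported 45
awk 'BEGIN { n = 11000000; print 2 * log(n) / log(2) }'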