Shell Programming and Scripting: extract unique pattern from large text file
Post 302338120 by johnbach, Monday 27 July 2009, 06:07 AM
Code:
egrep '[0-9]{5}-' file | egrep -v '[0-9]{6}-'

 

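For reference, the pipeline above keeps lines containing a five-digit run followed by a hyphen, then discards any line that also contains a six-digit (or longer) run before a hyphen, leaving only the exact five-digit matches. A minimal sketch against a made-up sample file (the name "file" and its contents are assumptions, not from the thread):

Code:
# Hypothetical sample data, for illustration only.
printf '%s\n' '12345-ok' '123456-skip' 'x 99999-y' > file

# Keep 5-digit runs before "-", drop lines with a 6-digit run before "-".
egrep '[0-9]{5}-' file | egrep -v '[0-9]{6}-'
# Prints: 12345-ok  and  x 99999-y

Note that newer GNU grep releases prefer grep -E over the deprecated egrep spelling.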
10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Extract pattern from text line

Hi, the text line looks like this: "test1" " " "test2" "test3" "test4" "10" "test 10 12" "00:05:58" "filename.bin" "3.3MB" "/dir/name" "18459" What's the best way to select any part of it, so I can for example get only the time or the size? I was trying awk -F'"' '{print $N}' but... (3 Replies)
Discussion started by: TehOne
3 Replies
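A possible approach (a sketch, not the thread's accepted answer): with the field separator set to a double quote, the quoted values land in the even-numbered awk fields.

Code:
line='"test1" " " "test2" "test3" "test4" "10" "test 10 12" "00:05:58" "filename.bin" "3.3MB" "/dir/name" "18459"'

# With FS set to '"', $16 is the time and $20 is the size in this layout.
echo "$line" | awk -F'"' '{print $16, $20}'
# Prints: 00:05:58 3.3MB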

2. Shell Programming and Scripting

Need to extract 7 characters immediately after text '19' from a large file.

Hi All!! I have a large file containing millions of records. My purpose is to extract the 7 characters immediately after the text '19' from this file (including the '19' itself) and save the result in a new file. So my OUTPUT would be as under : 191234561 194567894 192789005 198839408 and so on..... ... (7 Replies)
Discussion started by: parshant_bvcoe
7 Replies
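One way to do this (a sketch; the file name bigfile is an assumption): grep's -o flag prints only the matched text, here '19' plus the next seven characters.

Code:
# -o prints each match on its own line (supported by GNU and BSD grep).
grep -oE '19.{7}' bigfile > result.txt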

3. Shell Programming and Scripting

sed: Find start of pattern and extract text to end of line, including the pattern

This is my first post, please be nice. I have tried to google and read different tutorials. The task at hand is this: the input file input.txt contains, for example, abc123defhij-E-1234jslo and 456ujs-W-abXjklp. From this file the task is to grep the -E- and -W- strings that are unique and write a new file... (5 Replies)
Discussion started by: TestTomas
5 Replies
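A sed sketch for the stated task: capture everything from -E- or -W- to the end of the line, print only matching lines, and de-duplicate afterwards (input.txt is the example name from the post).

Code:
sed -n 's/.*\(-[EW]-.*\)/\1/p' input.txt | sort -u
# abc123defhij-E-1234jslo  ->  -E-1234jslo
# 456ujs-W-abXjklp         ->  -W-abXjklp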

4. UNIX for Dummies Questions & Answers

Extract Unique Values from file

Hello all, I have a file with following sample data 2009-08-26 05:32:01.65 spid5 Process ID 86:214 owns resources that are blocking processes on Scheduler 0. 2009-08-26 05:32:01.65 spid5 Process ID 86:214 owns resources that are blocking processes on Scheduler 0. 2009-08-26... (5 Replies)
Discussion started by: simonsimon
5 Replies
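For exact duplicate lines like these, the usual idioms apply (a sketch; logfile and unique.txt are placeholder names):

Code:
# Keep the first occurrence of each line, preserving input order.
awk '!seen[$0]++' logfile > unique.txt

# Or, if output order does not matter:
sort -u logfile > unique.txt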

5. UNIX for Dummies Questions & Answers

Extract unique combination of rows from text files

Hi Gurus, I have 100 tab-delimited text files, each with 21 columns. I want to extract only the 2nd and 5th columns from each text file. The values in both the 2nd and 5th columns contain duplicates, but the combination of these values in a row is not duplicated. I want to extract only those... (3 Replies)
Discussion started by: Unilearn
3 Replies
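A sketch for the column-pair case, assuming the 100 files match *.txt: key the seen[] array on the combination of columns 2 and 5 so only the first row with each pair is printed.

Code:
awk 'BEGIN { FS = OFS = "\t" }
     !seen[$2 FS $5]++ { print $2, $5 }' *.txt > pairs.txt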

6. Shell Programming and Scripting

Extract Unique Records from File

Hi, I have a 20GB pipe-delimited file with too many duplicate records. I need an awk script to extract the unique records from the file and put them into another file. Kindly help. Thanks, Arun (1 Reply)
Discussion started by: Arun Mishra
1 Replies
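The classic awk idiom works here too, with one caveat worth noting at 20GB: the seen[] array holds every distinct record in memory.

Code:
# In-memory de-duplication, input order preserved (big.dat is a placeholder).
awk '!seen[$0]++' big.dat > unique.dat

# sort -u spills to temporary files instead, if ordering is not required.
sort -u big.dat > unique.dat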

7. Shell Programming and Scripting

Extract specific line in an html file starting and ending with specific pattern to a text file

Hi This is my first post and I'm just a beginner. So please be nice to me. I have a couple of html files where a pattern beginning with "http://www.site.com" and ending with "/resource.dat" is present on every 241st line. How do I extract this to a new text file? I have tried sed -n 241,241p... (13 Replies)
Discussion started by: dejavo
13 Replies
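Two sketches for this one. GNU sed's first~step address selects every 241st line, and grep -o can pull just the URL regardless of which line it sits on (the pattern below uses the placeholders from the post, not real values):

Code:
# Print lines 241, 482, 723, ... (GNU sed extension).
sed -n '241~241p' page.html

# Or extract the pattern itself, wherever it occurs.
grep -o 'http://www\.site\.com.*/resource\.dat' page.html > urls.txt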

8. Shell Programming and Scripting

Extract all the sentences from a text file that matches a pattern list

Hi I have a big text file. I want to extract all the sentences that match at least 70% (seventy percent) of the words from each sentence based on a word list called A. Say the format of the text file is as given below: This is the first sentence which consists of fifteen words... (4 Replies)
Discussion started by: my_Perl
4 Replies
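A rough sketch of the 70% test, assuming one sentence per line in text.txt and one word per line in the list file A (punctuation and case handling are left out for brevity):

Code:
awk 'NR == FNR { words[$1]; next }          # first pass: load word list A
     {
         hits = 0
         for (i = 1; i <= NF; i++) if ($i in words) hits++
         if (NF && hits / NF >= 0.7) print   # sentence passes the 70% bar
     }' A text.txt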

9. Shell Programming and Scripting

Extract pattern from text

Hi all, I got a txt here and I need to extract all D 8888 44 and D 8888 43 + next field =",g("en")];f._sn&&(f._sn= "og."+f._sn);for(var n in f)l.push("&"),l.push(g(n)),l.push("="),l.push(g(f));l.push("&emsg=");l.push(g(d.name+":"+d.message));var m=l.join("");Ea(m)&&(m=m.substr(0,2E3));c=m;var... (5 Replies)
Discussion started by: stinkefisch
5 Replies
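If the tokens really are space-separated as described, grep -o can pull each "D 8888 43" or "D 8888 44" occurrence plus the field that follows it (dump.txt is a placeholder name):

Code:
grep -oE 'D 8888 4[34] [^ ]+' dump.txt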

10. UNIX for Beginners Questions & Answers

sed awk: split a large file to unique file names

Dear Users, I would appreciate your help with splitting a large file (> 1 million lines) with sed or awk. Below is the text in the input file file.txt: scaffold1 928 929 C/T + scaffold1 942 943 G/C + scaffold1 959 960 C/T +... (6 Replies)
Discussion started by: kapr0001
6 Replies
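A sketch for the split, assuming the file is grouped by scaffold as in the sample: name each output file after column 1 and close the previous file when the scaffold changes, so a million-line input does not exhaust file descriptors.

Code:
awk '$1 != prev { if (out != "") close(out); out = $1 ".txt"; prev = $1 }
     { print > out }' file.txt
# Produces scaffold1.txt, scaffold2.txt, ... one file per scaffold.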
NNGRAB(1)                   General Commands Manual                  NNGRAB(1)

NAME
       nngrab - news retrieval by keyword (nn)

SYNOPSIS
       nngrab [ -c ] keyword

DESCRIPTION
       nngrab invokes nn on all USENET articles whose subject (or keyword)
       field(s) contain an instance of keyword.  nngrab is a fast equivalent
       for:

              nn -mxX -s/keyword all

       For example, nngrab tesla will retrieve items concerning Nikola Tesla.
       Keyword case is ignored unless -c is specified, and the keyword can be
       a regular expression (escaped to avoid conflicts with the shell).  For
       example, nngrab "n.*tesla".  The range of the search includes all
       newsgroups on the system, including unsubscribed ones.

FILES
       $db/subjects   subject database

SEE ALSO
       nn(1), nnspew(8), egrep(1)

NOTES
       nngrab can be much faster than the equivalent command shown above if
       the tertiary news subject database generated by the nnspew(8) daemon
       exists.  To enable the faster operation, nnspew must be executed
       regularly by cron.

       nngrab uses egrep(1) to scan the subject database, so if you are not
       running a fast (GNU-style) egrep, this is all for naught.

       nngrab will use a subject database generated by nnspew regardless of
       its age.  Thus, if you stop running nnspew, remember to remove the
       subjects file as well.

BUGS
       From version 6.4 onwards, searching the "Keywords:" field is not
       supported.  Search by author name is not possible either.

AUTHOR
       James A. Woods, NASA Ames Research Center
       E-mail: jaw@ames.arc.nasa.gov

4th Berkeley Distribution        Release 6.6                         NNGRAB(1)