Grep matched records from huge file

111111111100000000001111111111
123232323200000010001114545454
232435424200000000001232131212
342354234301000000002323423443
232435424200000000001232131212
2390898994200000000001238908092
This is the record format.
In every record, positions 11 to 20 normally contain only 0's, but in some records there is a value somewhere within positions 11 to 20.

Now I want to grep only the 4th and 6th records from the .dat file, i.e. the records that have a numeric value within positions 11 to 20, for example 0000100000 (only a single 1 will occur, and it can be at any place from position 11 to 20).

Please help me out.

---------- Post updated at 02:02 PM ---------- Previous update was at 01:42 PM ----------

To clarify: I need only the 4th and 6th records, the ones where positions 11 to 20 are all 0's except for a single 1 anywhere among them.

Record Format

111111111100000000001111111111
123232323200000010001114545454
232435424200000000001232131212
342354234301000000002323423443
232435424200000000001232131212
2390898994200000000001238908092
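
A minimal awk sketch of one way to pull those records, assuming fixed-width records in a data file called file.dat (the filename is just a placeholder). It prints only the lines whose ten characters at positions 11 to 20 are all 0's except for a single 1:

awk 'substr($0,11,10) ~ /^0*10*$/' file.dat

This is a single pass over the file with no shell loop, so it should stay reasonably fast even on a huge file.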
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

sed, grep, awk, regex -- extracting a matched substring from a file/string

Ok, I'm stumped and can't seem to find relevant info. (I'm not even sure, I might have asked something similar before.): I'm trying to use shell scripting/UNIX commands to extract URLs from a fairly large web page, with a view to ultimately wrapping this in PHP with exec() and including the... (2 Replies)
Discussion started by: ropers
2 Replies
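
A hedged sketch of that kind of URL extraction with GNU grep, assuming http/https links in a saved page called page.html (the filename and the excluded characters are illustrative only):

grep -Eo 'https?://[^"<> ]+' page.html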

2. Shell Programming and Scripting

deleting multiple records from a huge file at one time

I have a very big file of 5 GB and there are about 50 million records in it. I have to delete records based on record numbers that I know from outside, without opening the file. The record numbers are quite random, like 5000678, 7890005 etc. Can somebody let me know how I can... (5 Replies)
Discussion started by: dsravan
5 Replies
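
A hedged awk sketch for that kind of deletion, assuming the unwanted record numbers are listed one per line in a hypothetical file del.txt:

awk 'NR==FNR {skip[$1]; next} !(FNR in skip)' del.txt bigfile > bigfile.new

It reads the big file once, writes every line whose line number is not listed, and the original is only replaced when bigfile.new is moved over it.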

3. Shell Programming and Scripting

grep all records in a file and get a word count -perl

Hi, I have a file, file.txt, and I need to get the total record count of the file into a $variable. I'm using a perl script. Thanks. (4 Replies)
Discussion started by: meghana
4 Replies
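
In plain shell the usual sketch is wc -l with command substitution (file.txt as in the question):

count=$(wc -l < file.txt)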

4. UNIX for Dummies Questions & Answers

Parsing out records from one huge record

Hi, I have one huge record and know that each record in the file is 550 bytes long. How do I parse out the individual records from this single huge record? Thanks, (4 Replies)
Discussion started by: bwrynz1
4 Replies
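
If the data really is one continuous stream of fixed 550-byte records, a minimal sketch with fold (the filenames are placeholders) splits it into one record per line:

fold -b -w 550 bigrecord.dat > records.txt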

5. UNIX for Dummies Questions & Answers

Grep specific records from a file of records that are separated by an empty line

Hi everyone. I am a newbie to Linux and I have a problem I couldn't solve on my own. I have a text file with records separated by empty lines, like this: ID: 20 Name: X Age: 19 ID: 21 Name: Z ID: 22 Email: xxx@yahoo.com Name: Y Age: 19 I want to grep records that... (4 Replies)
Discussion started by: Atrisa
4 Replies
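
For blank-line-separated records, awk's paragraph mode is the usual approach; a sketch that prints every record containing "Age: 19" (the pattern is only an example taken from the post):

awk -v RS= -v ORS='\n\n' '/Age: 19/' file.txt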

6. HP-UX

Performance issue with 'grep' command for huge file size

I have 2 files; one file (say, details.txt) contains the details of employees and another file (say, emp.txt) has some selected employee names. I am extracting employee details from details.txt by using emp.txt and the corresponding code is: while read line do emp_name=`echo $line` grep -e... (7 Replies)
Discussion started by: arb_1984
7 Replies
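
The usual fix for that pattern is to drop the shell loop and let grep read all the names at once; a hedged sketch using the filenames from the post:

grep -F -f emp.txt details.txt

That is one pass over details.txt instead of one grep per employee, which is where the time goes.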

7. Shell Programming and Scripting

Grep records out of file

Hi, I have a file with tab-separated values. I need to identify duplicate entries based on columns 1 & 6 only. For e.g.: I tried using uniq, but the output has only one of the duplicate entries instead of both. I need both of the above entries. uniq -f5... (2 Replies)
Discussion started by: appu2176
2 Replies
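
A hedged awk sketch that keeps every line whose column-1/column-6 pair occurs more than once in a tab-separated file (the same file is read twice):

awk -F'\t' 'NR==FNR {cnt[$1 FS $6]++; next} cnt[$1 FS $6] > 1' file file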

8. UNIX for Dummies Questions & Answers

What is the faster way to grep from huge file?

Hi All, I am new to this forum and this is my first post. My requirement is to optimize the time taken to grep a file with 40000 lines. There are two files, FILEA (40000 lines) and FILEB (40000 lines). The requirement is like this: both files will be in the format below... (11 Replies)
Discussion started by: mad man
11 Replies
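
Without seeing the exact format, the generic speed-up is the same idea: feed one whole file to grep instead of looping; a sketch assuming you want the lines of FILEB that also appear verbatim in FILEA:

grep -F -x -f FILEA FILEB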

9. Shell Programming and Scripting

Extract Matched Records from XML

Hi All, I have a requirement to extract records from an XML file on the basis of another list file containing specific parameters. I will extract these records from the XML and import them into one scheduler tool. file2 <FOLDER DATACENTER="ControlMserver" VERSION="800" PLATFORM="UNIX" FOLDER_NAME="SH_AP_INT_B01"... (3 Replies)
Discussion started by: looney
3 Replies
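
Purely as an illustration (the full layout is not shown): assuming each <FOLDER ...> tag sits on one line in file2 and a hypothetical list.txt holds the wanted FOLDER_NAME values one per line, an awk sketch could look like:

awk 'NR==FNR {want["FOLDER_NAME=\"" $1 "\""]; next} {for (k in want) if (index($0, k)) {print; next}}' list.txt file2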

10. Shell Programming and Scripting

How to fetch matched records from files between two different directory?

awk 'NR==FNR{arr[$0];next} $0 in arr' /tmp/Data_mismatch.sh /prd/HK/ACCTCARD_20160115.txt edit by bakunin: seems that one CODE-tag got lost somewhere. I corrected that, but please check your posts more carefully. Thank you. (5 Replies)
Discussion started by: suresh_target
5 Replies
HUGETLBFS_FIND_PATH(3)                       Library Functions Manual                       HUGETLBFS_FIND_PATH(3)

NAME
       hugetlbfs_find_path, hugetlbfs_find_path_for_size - Locate an appropriate hugetlbfs mount point

SYNOPSIS
       #include <hugetlbfs.h>

       const char *hugetlbfs_find_path(void);
       const char *hugetlbfs_find_path_for_size(long page_size);

DESCRIPTION
       These functions return a pathname for a mounted hugetlbfs filesystem for the appropriate huge page size.
       For hugetlbfs_find_path, the default huge page size is used (see gethugepagesize(3)). For
       hugetlbfs_find_path_for_size, a valid huge page size must be specified (see gethugepagesizes(3)).

RETURN VALUE
       On success, a non-NULL value is returned. On failure, NULL is returned.

SEE ALSO
       libhugetlbfs(7), gethugepagesize(3), gethugepagesizes(3)

AUTHORS
       libhugetlbfs was written by various people on the libhugetlbfs-devel mailing list.

March 7, 2012                                                                               HUGETLBFS_FIND_PATH(3)