Top Forums Shell Programming and Scripting Grep matched records from huge file Post 302338873 by lathavim on Wednesday 29th of July 2009 06:35:36 AM
Code:
while IFS= read -r record
do
    # Columns 11-20 of the current record
    a=$(printf '%s\n' "$record" | cut -c11-20)
    # grep -q sets the exit status without printing, so no /dev/null redirect is needed
    if printf '%s\n' "$a" | grep -q 1
    then
        printf '%s\n' "$record" >> output
    fi
done < input



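For a really huge file, forking `cut` and `grep` once per record is the slow part. The same filter can be done in a single `grep` pass with a position-anchored regex (a sketch; it assumes fixed-width records, one per line, and that you want lines with a `1` anywhere in columns 11-20 -- the sample data below is made up):

```shell
# Hypothetical fixed-width sample input, one record per line
printf '%s\n' 'aaaaaaaaaa1bbbbbbbbb' 'aaaaaaaaaaxxxxxxxxxx' > input

# Keep lines containing the digit 1 anywhere in columns 11-20:
#   ^.\{10\}   skips the first 10 characters (columns 1-10)
#   .\{0,9\}1  lets the 1 fall on any of columns 11-20
grep '^.\{10\}.\{0,9\}1' input > output
```

Because this is one process scanning the file once, it should be far faster than the per-line loop on multi-gigabyte input.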
---------- Post updated at 05:35 AM ---------- Previous update was at 05:30 AM ----------

Or


Code:
 
awk 'substr($0, 11, 10) ~ /1/' inputfile

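To see what the substr test selects, here is a quick run on made-up data (the file name and records are just for illustration):

```shell
# Two hypothetical 24-character records; only the first has a 1 in columns 11-20
printf '%s\n' '0123456789A1CDEFGHIJrest' '0123456789ABCDEFGHIJrest' > inputfile

# substr($0, 11, 10) extracts columns 11-20; the line is printed
# only when that slice contains the digit 1
awk 'substr($0, 11, 10) ~ /1/' inputfile
```

Only the first record is printed, since `A1CDEFGHIJ` contains a `1` while `ABCDEFGHIJ` does not.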
10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

sed, grep, awk, regex -- extracting a matched substring from a file/string

Ok, I'm stumped and can't seem to find relevant info. (I'm not even sure, I might have asked something similar before.): I'm trying to use shell scripting/UNIX commands to extract URLs from a fairly large web page, with a view to ultimately wrapping this in PHP with exec() and including the... (2 Replies)
Discussion started by: ropers

2. Shell Programming and Scripting

deleting multiple records from a huge file at one time

I have a very big file of 5gb size and there are about 50 million records in there. I have to delete the records based on record number that I know from outside without opening the file. The record numbers are very random like 5000678, 7890005 etc. Can somebody let me know how I can... (5 Replies)
Discussion started by: dsravan

3. Shell Programming and Scripting

grep all records in a file and get a word count -perl

Hi, I have a file .. file.txt .. I need to get a total record count of the file into a $variable.. I'm using a perl script. Thanks (4 Replies)
Discussion started by: meghana

4. UNIX for Dummies Questions & Answers

Parsing out records from one huge record

Hi, I have one huge record and know that each record in the file is 550 bytes long. How do I parse out individual records from the single huge record. Thanks, (4 Replies)
Discussion started by: bwrynz1

5. UNIX for Dummies Questions & Answers

Grep specific records from a file of records that are separated by an empty line

Hi everyone. I am a newbie to Linux stuff. I have this kind of problem which couldn't solve alone. I have a text file with records separated by empty lines like this: ID: 20 Name: X Age: 19 ID: 21 Name: Z ID: 22 Email: xxx@yahoo.com Name: Y Age: 19 I want to grep records that... (4 Replies)
Discussion started by: Atrisa

6. HP-UX

Performance issue with 'grep' command for huge file size

I have 2 files; one file (say, details.txt) contains the details of employees and another file (say, emp.txt) has some selected employee names. I am extracting employee details from details.txt by using emp.txt and the corresponding code is: while read line do emp_name=`echo $line` grep -e... (7 Replies)
Discussion started by: arb_1984

7. Shell Programming and Scripting

Grep records out of file

Hi, I have a file with Tab-separated values. I need to identify duplicate entries based on columns 1 & 6 only. For e.g : I tried using uniq ..but the output is only having one duplicate entry, instead of both the entries. I need both the above entries. uniq -f5... (2 Replies)
Discussion started by: appu2176

8. UNIX for Dummies Questions & Answers

What is the faster way to grep from huge file?

Hi All, I am new to this forum and this is my first post. My requirement is like to optimize the time taken to grep the file with 40000 lines. There are two files FILEA(40000 lines) FILEB(40000 lines). The requirement is like this, both the file will be in the format below... (11 Replies)
Discussion started by: mad man

9. Shell Programming and Scripting

Extract Matched Records from XML

Hi All, I have a requirement to extract para in XML file on the basis of another list file having specific parameters. I will extract these para from XML and import in one scheduler tool. file2 <FOLDER DATACENTER="ControlMserver" VERSION="800" PLATFORM="UNIX" FOLDER_NAME="SH_AP_INT_B01"... (3 Replies)
Discussion started by: looney

10. Shell Programming and Scripting

How to fetch matched records from files between two different directory?

awk 'NR==FNR{arr;next} $0 in arr' /tmp/Data_mismatch.sh /prd/HK/ACCTCARD_20160115.txt edit by bakunin: seems that one CODE-tag got lost somewhere. i corrected that, but please check your posts more carefully. Thank you. (5 Replies)
Discussion started by: suresh_target