Parsing out records from one huge record
Post 302173719 by bwrynz1, Friday 7th of March 2008, 03:50:49 PM

Hi,

I have a file that arrives as one huge record, and I know that each logical record within it is 550 bytes long. How do I parse the individual records out of the single huge record?

Thanks,
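
A minimal sketch of two common approaches, assuming the data contains no newline bytes of its own (bigfile and records.txt are placeholder names):

    # GNU fold: insert a line break after every 550 bytes
    fold -b -w 550 bigfile > records.txt

    # Perl alternative: a numeric reference in $/ makes each read return
    # a fixed 550-byte chunk, so the file is never slurped whole
    perl -ne 'BEGIN { $/ = \550 } print $_, "\n"' bigfile > records.txt

Either way the file is processed as a stream, so this works no matter how large the single input record is.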
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

deleting multiple records from a huge file at one time

I have a very big file of 5 GB size and there are about 50 million records in there. I have to delete records based on record numbers that I know from outside, without opening the file. The record numbers are very random, like 5000678, 7890005, etc. Can somebody let me know how I can...
Discussion started by: dsravan (5 Replies)
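
A minimal sketch of one streaming approach: keep the record numbers to delete in a file, one per line (delete_list and bigfile are placeholder names), and let awk drop those line numbers in a single pass:

    # Pass 1 (NR == FNR) loads the record numbers to drop;
    # pass 2 prints every line whose number is not in the list
    awk 'NR == FNR { del[$1]; next } !(FNR in del)' delete_list bigfile > bigfile.new

This never holds the 5 GB file in memory, since awk processes it one record at a time.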

2. UNIX for Advanced & Expert Users

Parsing records from one record

Hi, I have a file which is one huge record. I know each record should be 550 bytes long. How do I parse out the individual records from the one huge record?
Discussion started by: bwrynz1 (1 Reply)
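
A dd-based variant of the same fixed-width split; note that conv=unblock also strips trailing spaces from each 550-byte record, so it only suits space-padded text data (bigfile and records.txt are placeholder names):

    # cbs sets the conversion block size to the record length;
    # unblock appends a newline to each block
    dd if=bigfile of=records.txt cbs=550 conv=unblock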

3. Shell Programming and Scripting

Grep matched records from huge file

111111111100000000001111111111
123232323200000010001114545454
232435424200000000001232131212
342354234301000000002323423443
232435424200000000001232131212
2390898994200000000001238908092
This is the record format. From the 11th position to the 20th position in a record there are 0's occurring, and... 
Discussion started by: mjkreddy (6 Replies)
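
Assuming the goal is to print the records whose 11th through 20th characters are all zeros (the excerpt cuts off before saying), a one-pass sketch:

    # substr pulls positions 11-20; the line prints only when they are all 0
    awk 'substr($0, 11, 10) == "0000000000"' datafile

    # equivalent grep: any 10 leading characters, then 10 zeros
    grep -E '^.{10}0{10}' datafile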

4. Shell Programming and Scripting

Parsing record into multiple records in Shell Script

Hi, I am trying to parse a very long record in a text file into multiple records by checking the ADD, DELETE, or MODIFY field value in a shell script.
Input (file name xyz.txt):
ADD|N000|8015662|DELETE|N001|9915662|MODIFY|N999|85678
Output:
ADD|N000|8015662|
DELETE|N001|9915662|...
Discussion started by: naveed (8 Replies)
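
A sketch matching the sample above: break the line before each keyword that follows a field separator.

    # Each gsub inserts a newline in front of one keyword; the leading ADD
    # has no "|" before it, so no blank first line is produced
    awk '{
        gsub(/\|ADD\|/,    "|\nADD|")
        gsub(/\|DELETE\|/, "|\nDELETE|")
        gsub(/\|MODIFY\|/, "|\nMODIFY|")
        print
    }' xyz.txt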

5. Shell Programming and Scripting

Multiple records based on ';' in the record

Hi All, I have *.csv files in a dir /pro/lif/dow (pipe-delimited files); these files have 8 columns, and the 6th column (CDR_LOGIC) records are populated as below. I need to incorporate the below logic in all the *.csv files.
11||:ColumnA||:ColumnB
123||:ColumnA IIF(:ColumnA = :ColumnC then...
Discussion started by: shruthidwh (6 Replies)

6. Shell Programming and Scripting

Need help splitting huge single record file

I was given a data file that I need to split into multiple lines/records based on a keyword. The problem is that it is 2.5 GB or bigger, and everything I try in perl or sed causes a segmentation fault. Can someone give me some other ideas? The data is of the form:...
Discussion started by: leolson (5 Replies)
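
The usual cause of the segfault is slurping the whole file at once. gawk can stream it instead by using the keyword as the record separator; "REC" below is a hypothetical marker standing in for the cut-off sample data:

    # Multi-character RS is a gawk extension; each chunk between markers
    # becomes one record, so memory use stays bounded by one record.
    # NR > 1 skips the empty record before the first marker.
    gawk 'BEGIN { RS = "REC" } NR > 1 { print "REC" $0 }' hugefile > split.txt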

7. Shell Programming and Scripting

reformat one record from two records

I have not gotten much of an answer/solution for the posting, so here I break the question down and hope to get some help. 1. How can I use AWK to read in two records at the same time and keep looping to the next two when the condition is met? Position 1-10 --> unique, to identify whether...
Discussion started by: menglm (4 Replies)
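
The exact pairing rule is cut off in the excerpt, but a generic read-two-at-a-time sketch, assuming records pair up when their first ten characters match:

    awk '{
        key = substr($0, 1, 10)          # positions 1-10 as the pairing key
        if (key == prev_key) {
            print prev_rec $0            # matching pair: emit the two as one
            prev_key = ""
        } else {
            prev_rec = $0                # remember this record for the next one
            prev_key = key
        }
    }' input_file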

8. Shell Programming and Scripting

Multiple Records from 1 Record

I need to turn one record into multiple records based on an occurrence column in the record, and change the date. For example, the first record below has 5, so I need to create 5 records from that one and change the date over 5 months. The occurrence can be any number. I am unable to come up with a script. Can someone help...
Discussion started by: traininfa (5 Replies)
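
The sample rows are cut off, so the layout below is purely hypothetical: field 1 an id, field 2 the occurrence count, field 3 a date as YYYY-MM. The sketch emits one copy per occurrence, advancing the month each time:

    awk '{
        split($3, d, "-")                 # d[1] = year, d[2] = month
        for (i = 0; i < $2; i++) {
            m = d[2] + i                  # advance the month by i
            y = d[1] + int((m - 1) / 12)  # carry overflow months into the year
            m = ((m - 1) % 12) + 1
            printf "%s %s %04d-%02d\n", $1, $2, y, m
        }
    }' input_file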

9. Shell Programming and Scripting

Fetching record based on Uniq Key from huge file.

Hi, I want to fetch 100k records from a file which looks like the below.
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX ...
Discussion started by: lathigara (17 Replies)
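
With 100k keys, a two-pass awk lookup avoids running grep once per key. keys.txt is a placeholder for a file holding one key per line, and $1 assumes the key is the first whitespace-delimited field; for fixed-width data like the masked sample, substitute substr($0, start, len):

    # Pass 1 (NR == FNR) loads the wanted keys; pass 2 prints matching records
    awk 'NR == FNR { want[$1]; next } $1 in want' keys.txt hugefile > matched.txt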

10. UNIX for Beginners Questions & Answers

Help in printing records where there is a 'header' in the first record ???

Hi, I have a backup report that unfortunately has some kind of hanging-indent thing, where the first line contains one column more than the others. I managed to get the output that I wanted using awk, but I am wondering whether there is a shorter way of doing it with the same awk. Below is what...
Discussion started by: newbie_01 (2 Replies)
GETHUGEPAGESIZES(3)                         Library Functions Manual                         GETHUGEPAGESIZES(3)

NAME
       gethugepagesizes - get the system-supported huge page sizes

SYNOPSIS
       #include <hugetlbfs.h>

       int gethugepagesizes(long pagesizes[], int n_elem);

DESCRIPTION
       The gethugepagesizes() function returns either the number of system-supported huge page sizes or the
       sizes themselves. If pagesizes is NULL and n_elem is 0, then the number of huge page sizes the system
       supports is returned. Otherwise, pagesizes is filled with at most n_elem page sizes.

RETURN VALUE
       On success, either the number of huge page sizes supported by the system or the number of huge page
       sizes stored in pagesizes is returned. On failure, -1 is returned and errno is set appropriately.

ERRORS
       EINVAL n_elem is less than zero, or n_elem is greater than zero and pagesizes is NULL. Also see
       opendir(3) for other possible values of errno; that error occurs when the sysfs directory exists but
       cannot be opened.

NOTES
       This call will return all huge page sizes as reported by the kernel. Not all of these sizes may be
       usable by the programmer, since mount points may not be available for all sizes. To test whether a size
       will be usable by libhugetlbfs, hugetlbfs_find_path_for_size() can be called on a specific size to see
       if a mount point is configured.

SEE ALSO
       oprofile(1), opendir(3), hugetlbfs_find_path_for_size(3), libhugetlbfs(7)

AUTHORS
       libhugetlbfs was written by various people on the libhugetlbfs-devel mailing list.

October 10, 2008                                                                             GETHUGEPAGESIZES(3)