Parsing out records from one huge record
# 1  
Old 03-07-2008

Hi,

I have a file that is one huge record, and I know that each record within it is 550 bytes long. How do I parse the individual records out of that single huge record?

Thanks,
# 2  
Old 03-07-2008
Quote:
Originally Posted by bwrynz1
Hi,

I have a file that is one huge record, and I know that each record within it is 550 bytes long. How do I parse the individual records out of that single huge record?

Thanks,
Show what the input (the one huge record) looks like and what the output (the individual records) should look like.
# 3  
Old 03-07-2008
The one huge record file is 9 MB long... I'm sorry I can't post it here.

But assume there is a record as follows:
1234567890abcdefghij1234567890abcdefghij1234567890
and I want to put a line break after every 10 characters to get the following:
1234567890
abcdefghij
1234567890
abcdefghij
1234567890

Thanks for offering to help
# 4
Old 03-07-2008
Most standard UNIX text tools expect newline-terminated lines and will struggle with a single 9 MB record, but dd works on fixed-size blocks. If you look at the man page for dd, you can specify cbs=10 (the conversion block size) together with conv=unblock:

Code:
dd cbs=10 conv=unblock < oldfile > newfile

This appends a newline to the end of every 10-character record (conv=unblock also strips any trailing spaces from each block).
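
For the 550-byte records in the original question the same approach applies; only the block size changes. A minimal sketch, assuming the data is plain text (the file names input.dat and records.txt are hypothetical):

Code:
# Read the file in 550-byte conversion blocks; conv=unblock strips
# trailing spaces from each block and appends a newline.
dd cbs=550 conv=unblock < input.dat > records.txt

If the records may end in spaces that must be preserved, fold -b -w 550 input.dat > records.txt splits on byte count alone and leaves the data untouched.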
 