File splitting according to the length of the fields
Post 302946554 by Akshay Hegde, Wednesday 10 June 2015, 07:45 AM
This may help you:

Code:
[akshay@localhost tmp]$ cat description
Field Name	Length	Start Position
ADJ-TYPE-CODE	1	1
ALLOW-AMT	11	2
CAP-SRVC-NO	1	13
BILL-AMT	11	14

Code:
[akshay@localhost tmp]$ cat datafile 
R-0000017611N-00000350001095ANZU01
A00000017611N000000350001095ANZU02
R-0000019427N-00000265001202BGYI03
R-0000005977N-00000092001202BGYI03
R-0000017195N-00000995001353B1IZ03
A00000099500N000000995001353B1IZ04
R-0000258547N-00002266002019AXAJ01
A00000258547N000002266002019AXAJ02
R-0000012277N-00000216002026BLCF03
A00000012277N000000216002026BLCF04

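Each row of the description file is just a (start position, length) pair into these fixed-width records. As a quick worked illustration (not part of the original post), pulling the four slices out of the first data line by hand with substr gives exactly the values that show up in the output further down:

Code:
$ awk 'NR==1 { print substr($0,1,1), substr($0,2,11), substr($0,13,1), substr($0,14,11) }' datafile
R -0000017611 N -0000035000

The script below does the same thing generically, reading the field names, lengths and start positions from the description file instead of hard-coding them.
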
Code:
[akshay@localhost tmp]$ cat extract.awk
# Build one output record by cutting fixed-width slices out of the current line.
# str, field and i are listed as extra parameters only to make them local.
function extract(str, field, i)
{
	for (i = 1; i <= c; i++) {
		# A[i,3] is the start position, A[i,2] is the length
		field = substr($0, A[i,3], A[i,2])
		str   = str ? str OFS field : field
	}
	return str
}

# First file (the description): remember each field's name, length and start position.
FNR == NR {
	if (NR == 1) next	# skip the "Field Name  Length  Start Position" header
	hdr = hdr ? hdr OFS $1 : $1
	c++
	for (i = 2; i <= NF; i++) A[c,i] = $i
	next
}

# Second file (the data): print the header once, then the extracted fields for each record.
{
	print FNR == 1 ? hdr RS extract() : extract()
}

Code:
[akshay@localhost tmp]$ awk -vOFS="," -f extract.awk description datafile 
ADJ-TYPE-CODE,ALLOW-AMT,CAP-SRVC-NO,BILL-AMT
R,-0000017611,N,-0000035000
A,00000017611,N,00000035000
R,-0000019427,N,-0000026500
R,-0000005977,N,-0000009200
R,-0000017195,N,-0000099500
A,00000099500,N,00000099500
R,-0000258547,N,-0000226600
A,00000258547,N,00000226600
R,-0000012277,N,-0000021600
A,00000012277,N,00000021600

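If the layout never changes, the same result (minus the header line) can be produced with a hard-coded one-liner; this is only a minimal sketch with the widths from the description file typed in by hand:

Code:
awk -v OFS="," '{ print substr($0,1,1), substr($0,2,11), substr($0,13,1), substr($0,14,11) }' datafile

GNU awk users could also look at the FIELDWIDTHS variable (e.g. FIELDWIDTHS="1 11 1 11") for fixed-width splitting, but that is gawk-specific, while the description-file approach above works with any awk and keeps the record layout out of the code.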