Full Discussion: Need to split record
Post 302845103 by drl on Tuesday 20th of August 2013 10:06:09 PM
Hi.
Code:
$ echo 123456789 | fold -w 3
123
456
789
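
If fold is not available, a plain awk loop gives the same fixed-width split (a sketch; the width 3 matches the example above):
Code:
$ echo 123456789 | awk '{ for (n = 1; n <= length($0); n += 3) print substr($0, n, 3) }'
123
456
789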

Best wishes ... cheers, drl

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Split a record

UNIX Scripting: Hi, I am trying to read a record and split it into multiple records. My record looks like this: 1001A0010@B0010*&^0)C0012hgdj&6sD0020fhfri93kivmepi9 where the UniqueID is 1001, the segments are A, B, C, D, and the length of each segment is the 4 characters after the segment letter: 0010 for A, 0010 for B, 0012... (5 Replies)
Discussion started by: pukars4u
5 Replies
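
A hedged sketch for the record layout above, assuming a 4-character UniqueID followed by repeating blocks of one upper-case segment letter plus a 4-digit code (the payload semantics are unclear from the truncated excerpt, so only the ID, segment letters, and codes are extracted):
Code:
echo '1001A0010@B0010*&^0)C0012hgdj&6sD0020fhfri93kivmepi9' |
awk '{
    printf "UniqueID=%s\n", substr($0, 1, 4)
    rec = substr($0, 5)
    # peel off each "letter + 4 digits" marker in turn
    while (match(rec, /[A-D][0-9][0-9][0-9][0-9]/)) {
        printf "segment=%s code=%s\n", substr(rec, RSTART, 1), substr(rec, RSTART + 1, 4)
        rec = substr(rec, RSTART + RLENGTH)
    }
}'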

2. Shell Programming and Scripting

How to split a file record

Hi, I have a problem with parsing/splitting a file record into two parts and assigning the split parts to two variables. The record is as follows: ftrn facc ttrd feed xref fsdb fcp ruldb csdb omom fordr ftxn fodb fsdc texc oxox reng ttrn ttxn fqdb ... (5 Replies)
Discussion started by: aoussenko
5 Replies
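
For splitting such a record into two variables, POSIX parameter expansion is enough (a sketch; splitting at the first space is an assumption, since the excerpt does not say where the record divides):
Code:
record="ftrn facc ttrd feed xref fsdb fcp ruldb csdb omom fordr"
first=${record%% *}    # everything before the first space
rest=${record#* }      # everything after it
echo "first=$first"
echo "rest=$rest"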

3. Shell Programming and Scripting

Split a record based on particular match

Hi, I have a requirement to split a record based on a particular match using UNIX. Case 1: Input record: 10.44.48.63;"Personals/Dating;sports";1441 Output records: 10.44.48.63;Personals/Dating;1441;Original 10.44.48.63;sports;1441;Dummy Case 2: Input record: ... (5 Replies)
Discussion started by: mksuneel
5 Replies
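
One way to produce the Original/Dummy records shown in Case 1 (a sketch in awk, assuming the multi-valued field is always the double-quoted one):
Code:
echo '10.44.48.63;"Personals/Dating;sports";1441' |
awk -F'"' '{
    pre = $1;  sub(/;$/, "", pre)      # fields before the quoted list
    post = $3; sub(/^;/, "", post)     # fields after it
    n = split($2, cat, ";")
    for (i = 1; i <= n; i++)
        printf "%s;%s;%s;%s\n", pre, cat[i], post, (i == 1 ? "Original" : "Dummy")
}'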

4. Shell Programming and Scripting

Record split.

I want to keep only records whose length is 10; other records should be removed from my original file without redirecting to another output file. Source: 1234567890 123456789011234 abcdefghil Expected result: 1234567890 abcdefghil (9 Replies)
Discussion started by: Jairaj
9 Replies
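
A sketch for keeping only the length-10 records ("file" is a placeholder name); since standard tools cannot truly edit a file in place, a temporary file is swapped back over the original:
Code:
awk 'length($0) == 10' file > file.tmp && mv file.tmp file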

5. Shell Programming and Scripting

split record based on delimiter

Hi, my input file's field separator is ^. 12^inms^ 13^fakdks^ssk^s3 23^avsd^ 13^fakdks^ssk^a4 I want to print only the records with exactly 2 delimiter occurrences, i.e. 12^inms^ 23^avsd^ (4 Replies)
Discussion started by: Jairaj
4 Replies
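
Counting the delimiters directly selects the wanted lines (a sketch; "inputfile" is a placeholder, and gsub returns the number of ^ characters it touches without altering the line):
Code:
awk 'gsub(/\^/, "&") == 2' inputfile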

6. UNIX for Dummies Questions & Answers

split record without pattern

Hi, I have a file with all records in one line, which needs to be split into fixed-length pieces. I am trying to execute the below script for the same: FILENAME="$1" while line LINE do echo $LINE | awk 'BEGIN{n=1}{while(substr($0,n,10)){print substr($0,n,10);n+=10}}' done < $FILENAME it... (4 Replies)
Discussion started by: nishantrk
4 Replies
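
A corrected sketch of the script above: "while line LINE" looks like a typo for "while read LINE", and the whole per-line loop can collapse into a single awk pass (or into fold, as in the answer at the top of this page):
Code:
FILENAME="$1"
# one pass over the file, printing 10-character chunks of each line
awk '{ for (n = 1; n <= length($0); n += 10) print substr($0, n, 10) }' "$FILENAME"
# or simply:
fold -w 10 "$FILENAME"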

7. Shell Programming and Scripting

split content and write to new record

Hi, help is required to split a record's value and write it to new rows. Input: a~b~c~value in ('3','4','5')~test Output: a~b~c~3~test a~b~c~4~test a~b~c~5~test Input: a~b~c~value in ('3','4')~test Output: a~b~c~3~test a~b~c~4~test (8 Replies)
Discussion started by: Jairaj
8 Replies
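
A sketch for expanding the quoted value list (assuming the list always sits in the 4th ~-delimited field, inside parentheses):
Code:
echo "a~b~c~value in ('3','4','5')~test" |
awk -F'~' -v OFS='~' '{
    s = $4
    sub(/^[^(]*\(/, "", s)              # keep only what is inside (...)
    sub(/\).*$/, "", s)
    gsub(sprintf("%c", 39), "", s)      # strip the single quotes (ASCII 39)
    n = split(s, v, ",")
    for (i = 1; i <= n; i++) { $4 = v[i]; print }
}'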

8. UNIX for Dummies Questions & Answers

Split single record to multiple records

Hi friends, source: col1,col2,col3 a,b,1;2;3 Here the column delimiter is a comma (,). We don't know the maximum length of col3: now we have 1;2;3, but next time I may receive 1;2;3;4;5; etc... Required output: col1,col2,col3 a,b,1 a,b,2 a,b,3 please give me... (5 Replies)
Discussion started by: bab.galary
5 Replies
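
Since the count of ;-separated values in col3 is unknown, splitting that field in awk handles any length (a sketch):
Code:
echo 'a,b,1;2;3' |
awk -F',' -v OFS=',' '{
    n = split($3, v, ";")
    for (i = 1; i <= n; i++) print $1, $2, v[i]
}'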

9. Shell Programming and Scripting

How to split one record to multiple records?

Hi, I have one tab-delimited file which has multiple store_ids in the first column, separated by pipes. I want to split the file on the basis of store_id (separating the 1st record into 2 records). I tried some options like the below using split, awk, etc., but was not able to get proper output. can... (1 Reply)
Discussion started by: jaggy
1 Reply
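
The excerpt is truncated, so the exact layout is a guess; assuming a tab-delimited file whose first column holds pipe-separated store_ids (e.g. 101|102 followed by the rest of the record, "inputfile" being a placeholder), a sketch:
Code:
awk -F'\t' -v OFS='\t' '{
    n = split($1, id, "|")
    for (i = 1; i <= n; i++) { $1 = id[i]; print }
}' inputfile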

10. UNIX for Advanced & Expert Users

How to split large file with different record delimiter?

Hi, I have received a file which is 20 GB. We would like to split the file into 4 equal parts and process them to avoid memory issues. If the record delimiter were a Unix newline, I could use the split command with either option -l or -b. The problem is that the line terminator is |##| How to use... (5 Replies)
Discussion started by: Ravi.K
5 Replies
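
A hedged two-step sketch: first rewrite the |##| terminators as newlines (gawk accepts a multi-character RS and streams the file, so memory stays bounded; this assumes individual records contain no embedded newlines), then split into 4 parts without breaking lines (split -n l/4 is a GNU coreutils option):
Code:
# one record per output line; skip empty trailing records
gawk 'BEGIN { RS = "\\|##\\|" } NF { print }' bigfile > records.txt
split -n l/4 records.txt part_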
FPUTCSV(3)

NAME
       fputcsv - Format line as CSV and write to file pointer

SYNOPSIS
       int fputcsv (resource $handle, array $fields, [string $delimiter = ","], [string $enclosure = '"'], [string $escape_char = "\\"])

DESCRIPTION
       fputcsv(3) formats a line (passed as a $fields array) as CSV and writes it (terminated by a newline) to the specified file $handle.

PARAMETERS
       o $handle - The file pointer must be valid, and must point to a file successfully opened by fopen(3) or fsockopen(3) (and not yet closed by fclose(3)).
       o $fields - An array of values.
       o $delimiter - The optional $delimiter parameter sets the field delimiter (one character only).
       o $enclosure - The optional $enclosure parameter sets the field enclosure (one character only).
       o $escape_char - The optional $escape_char parameter sets the escape character (one character only).

RETURN VALUES
       Returns the length of the written string or FALSE on failure.

CHANGELOG
       +--------+--------------------------------------+
       |Version | Description                          |
       +--------+--------------------------------------+
       | 5.5.4  | The $escape_char parameter was added |
       +--------+--------------------------------------+

EXAMPLES
       Example #1 fputcsv(3) example

       <?php
       $list = array (
           array('aaa', 'bbb', 'ccc', 'dddd'),
           array('123', '456', '789'),
           array('"aaa"', '"bbb"')
       );

       $fp = fopen('file.csv', 'w');

       foreach ($list as $fields) {
           fputcsv($fp, $fields);
       }

       fclose($fp);
       ?>

       The above example will write the following to file.csv:

       aaa,bbb,ccc,dddd
       123,456,789
       """aaa""","""bbb"""

NOTES
       Note: If PHP is not properly recognizing the line endings when reading files either on or created by a Macintosh computer, enabling the auto_detect_line_endings run-time configuration option may help resolve the problem.

SEE ALSO
       fgetcsv(3).