07-14-2008
script to split a large file into a number of small files
Dear All,
Could you please help me split a file containing around 240,000,000 lines into 4 files of roughly equal size? Note that we need to make sure each output file starts at a start flag (MSISDN) and ends at an end flag (End); also, the number of lines between the start flag (MSISDN) and the end flag (End) is variable.
Kindly find below a sample of the lines in the file:
//////////////////////////////////////////////////////
MSISDN
line1
line2
line3
line4
line5
.
.
Line36
End
MSISDN
line1
line2
line3
line4
line5
.
.
line37
End
MSISDN
.
.
//////////////////////////////////////////
Thank you in advance,
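One way to approach this (a minimal awk sketch, assuming every record runs from an MSISDN line down to the next End line; big_file and the part1..part4 names are placeholders to adapt):
# Split big_file into 4 parts of roughly equal line count, but only start a
# new part at an MSISDN line so no MSISDN...End record is cut in half.
total=$(wc -l < big_file)          # about 240,000,000 lines here
per_file=$(( (total + 3) / 4 ))    # target lines per output part
awk -v per="$per_file" '
    /^MSISDN$/ && written >= per { part++; written = 0 }   # roll over only at a record boundary
    { out = "part" (part + 1); print > out; written++ }    # writes part1, part2, part3, part4
' big_file
Because the rollover waits for the next MSISDN line, each part can overshoot the target by at most one record, which keeps the four files close to equal.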
10 More Discussions You Might Find Interesting
1. Shell Programming and Scripting
Hi,
I need to split a large file into small files based on a string.
At different places in the large file I have the string ^Job.
I need to split the file into different files starting from ^Job to the last character before the next ^Job.
Also all the small files should be automatically named.... (4 Replies)
Discussion started by: dncs
4 Replies
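For that kind of marker-based split, a minimal awk sketch (the job_ prefix is an assumption; the output files are numbered automatically):
# Start a new output file each time a line beginning with "Job" is seen.
awk '
    /^Job/ { if (out) close(out); out = "job_" ++n }
    out    { print > out }
' bigfile
With GNU csplit, csplit bigfile '/^Job/' '{*}' does much the same with automatically numbered xx00, xx01, ... pieces.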
2. Shell Programming and Scripting
I have a task to move more than 35,000 files every two hours, from the same directory to another directory, based on a file that has the list of filenames.
I tried the following approaches:
(1)
find . -name \*.dat > list
for i in `cat list`; do mv "$i" test/; done
(2)
cat list|xargs -i mv "{}"... (7 Replies)
Discussion started by: bryan
7 Replies
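A small sketch of a list-driven move that is safe for odd filenames (the list and test/ names follow the snippet above; the GNU variant in the comment is an assumption about the tools available):
# Portable: one mv per file, but names with spaces survive intact.
while IFS= read -r f; do
    mv -- "$f" test/ || echo "failed: $f" >&2
done < list
# With GNU mv/xargs (not standard everywhere), many files per mv call:
# xargs -d '\n' mv -t test/ < list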
3. Shell Programming and Scripting
I have one large file; after every 200 lines I have to split the file and then add a header and footer to each small file.
Is it possible to add a different header and footer to each file? (7 Replies)
Discussion started by: ashish4422
7 Replies
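A rough sketch of one way to do the 200-line split with per-file headers and footers; the chunk_ prefix and the header/footer text are placeholders:
# Cut the big file into 200-line pieces named chunk_aa, chunk_ab, ...
split -l 200 bigfile chunk_
# Wrap each piece in its own header and footer (here just the piece name,
# standing in for whatever text each file really needs).
for f in chunk_*; do
    { echo "HEADER for $f"; cat "$f"; echo "FOOTER for $f"; } > "$f.out"
done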
4. Shell Programming and Scripting
I have a large filesystem on an AIX server and another one on a Red Hat box. I have synced the two filesystems using rsync.
What I'm looking for is a script that would compare the two filesystems to make sure the bits match up and the number of files matches up.
It's around 2.8 million... (5 Replies)
Discussion started by: zippdawg2001
5 Replies
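A sketch of the usual two checks, assuming the trees are reachable as /aix_tree locally and /rhel_tree on the Red Hat box (names made up for the example):
# File counts on each side.
find /aix_tree -type f | wc -l
ssh user@rhelbox "find /rhel_tree -type f | wc -l"
# Checksum comparison without copying anything: -c compares content
# checksums, -n makes it a dry run that only lists differing files.
rsync -avcn /aix_tree/ user@rhelbox:/rhel_tree/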
5. Shell Programming and Scripting
Hello, I have a large number of files that I want to concatenate into one. These files start with the word 'VOICE_', for example:
VOICE_0000000000
VOICE_1223o23u0
VOICE_934934927349
I use the following code:
cat /ODS/prepaid/CDR_FLOW/MEDIATION/VOICE_* >> /ODS/prepaid/CDR_FLOW/WORK/VOICE
... (10 Replies)
Discussion started by: chriss_58
10 Replies
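If the trouble is that the VOICE_* wildcard expands to more arguments than the shell allows (a common failure with that many files, though the snippet is cut off before the actual error), one workaround is to let find hand the names to cat in batches:
# Concatenate the VOICE_ files without putting the whole glob on one command line.
find /ODS/prepaid/CDR_FLOW/MEDIATION -name 'VOICE_*' -type f -exec cat {} + \
    >> /ODS/prepaid/CDR_FLOW/WORK/VOICE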
6. UNIX for Dummies Questions & Answers
Hi. I need to delete a large number of files listed in a txt file. There are over 90000 files in the list. Some of the directory names and some of the file names do have spaces in them.
In the file, each line is a full path to a file:
/path/to/the files/file1
/path/to/some other/files/file 2... (4 Replies)
Discussion started by: inakajin
4 Replies
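A minimal sketch, assuming the list is in a file called filelist.txt with exactly one full path per line (the spaces are the only complication):
# IFS= and read -r keep leading blanks, embedded spaces and backslashes intact.
while IFS= read -r path; do
    rm -- "$path"
done < filelist.txt
# With GNU xargs, a newline-delimited batch delete is also possible:
# xargs -d '\n' rm -- < filelist.txt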
7. Shell Programming and Scripting
I want to sftp a large number of files ... approx. 150 files will come to the server every minute (AIX box).
Also need to make sure each file has been sftped successfully...
Please let me know :
1. What is the best / fastest way to transfer the files?
2. Should I use the batch option -b so that connectivity will be... (3 Replies)
Discussion started by: vegasluxor
3 Replies
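On the -b question, a small sketch of how a batch run is typically done; the host, directory and key-based login are assumptions:
# Batch mode needs non-interactive (key-based) authentication.
# One "put" line per .dat file, all sent over a single sftp session.
for f in *.dat; do echo "put $f"; done > batch.txt
# sftp -b aborts and exits non-zero if any command in the batch fails,
# so the exit status is a basic transfer-success check.
if sftp -b batch.txt user@remotehost:/upload_dir; then
    echo "transfer OK"
else
    echo "transfer failed" >&2
fi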
8. Shell Programming and Scripting
Dear all,
I have a huge txt file with the input files for some setup_code. However, for running my setup_code, I require txt files with a maximum of 1000 input files each.
Please suggest a way to break down this big txt file into small txt files of 1000 entries each.
Thanks and greetings,
Emily (12 Replies)
Discussion started by: emily
12 Replies
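A one-line sketch, assuming the entries sit one per line in big_list.txt and a part_ prefix is acceptable:
# 1000 lines per output file: part_aa, part_ab, ...
split -l 1000 big_list.txt part_
# (GNU split also takes -d for numeric suffixes: part_00, part_01, ...)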
9. Shell Programming and Scripting
Hi All,
I have a situation now where I need to delete a huge number of temp files created during run time, approx. 16,700+ files. We never imagined that we would get such a big list of files during run time. It worked fine for a smaller number of files in the list, but when the list is huge we are... (7 Replies)
Discussion started by: mad man
7 Replies
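The classic failure at that scale is the argument-list limit when every name is handed to a single rm; a sketch of two ways around it (the directory, pattern and list name are assumptions):
# Let find do the deleting itself: one directory walk, no huge argument list.
find /tmp/myapp -name 'tmp_*' -type f -exec rm -- {} +
# Or feed an existing list of names (no spaces in them) to rm in batches.
xargs rm -- < temp_file_list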
10. UNIX for Beginners Questions & Answers
I have a large file with 24 hours of log in the below format. I need to split the large file into 24 small files on a one-hour basis, i.e. for example from 09:55 to 10:55, 10:55 to 11:55.
Can anyone help me with this?
... (20 Replies)
Discussion started by: Raghuram717
20 Replies
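The log format is cut off above, so this is only a sketch under the assumption that each line starts with an HH:MM timestamp in its first field and the file is in time order; the field number and output names would need adapting:
# Bucket lines into one file per clock hour of the timestamp in field 1.
awk '{
    split($1, t, ":")                       # t[1] = hour, t[2] = minute
    out = "log_hour_" t[1] ".txt"
    if (out != prev) { if (prev) close(prev); prev = out }
    print >> out                            # >> so reopening never truncates
}' big_log
The 09:55-to-10:55 style of boundary would need the minutes shifted before bucketing, but the idea is the same.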
LEARN ABOUT PHP
stream_wrapper_register
STREAM_WRAPPER_REGISTER(3) 1 STREAM_WRAPPER_REGISTER(3)
stream_wrapper_register - Register a URL wrapper implemented as a PHP class
SYNOPSIS
bool stream_wrapper_register (string $protocol, string $classname, [int $flags])
DESCRIPTION
Allows you to implement your own protocol handlers and streams for use with all the other filesystem functions (such as fopen(3), fread(3)
etc.).
PARAMETERS
o $protocol
- The wrapper name to be registered.
o $classname
- The classname which implements the $protocol.
o $flags
- Should be set to STREAM_IS_URL if $protocol is a URL protocol. Default is 0, local stream.
RETURN VALUES
Returns TRUE on success or FALSE on failure.
stream_wrapper_register(3) will return FALSE if the $protocol already has a handler.
CHANGELOG
       +---------+-----------------------------+
       | Version | Description                 |
       +---------+-----------------------------+
       | 5.2.4   | Added the $flags parameter. |
       +---------+-----------------------------+
EXAMPLES
Example #1
How to register a stream wrapper
<?php
// This example assumes a VariableStream class implementing the
// "streamWrapper" prototype (see SEE ALSO) has already been defined;
// its definition is not shown here.
$existed = in_array("var", stream_get_wrappers());
if ($existed) {
    stream_wrapper_unregister("var");
}
stream_wrapper_register("var", "VariableStream");
$myvar = "";

$fp = fopen("var://myvar", "r+");

fwrite($fp, "line1\n");
fwrite($fp, "line2\n");
fwrite($fp, "line3\n");

rewind($fp);
while (!feof($fp)) {
    echo fgets($fp);
}
fclose($fp);
var_dump($myvar);

if ($existed) {
    stream_wrapper_restore("var");
}
?>
The above example will output:
line1
line2
line3
string(18) "line1
line2
line3
"
SEE ALSO
The "streamWrapper" prototype class, "Example class registered as stream wrapper", stream_wrapper_unregister(3), stream_wrapper_restore(3),
stream_get_wrappers(3).
PHP Documentation Group STREAM_WRAPPER_REGISTER(3)