Unfortunately, an error in my Perl script causes me to get this output file (output.txt) content:
>seq_1 5
>seq_1 5
>seq_1 5
>seq_1 5
...
My goal is for the Perl script to split the input data header by header, run the program on each record, and collect all the output in one file (output.txt).
Unfortunately, an error in my Perl script causes it to consider only the first header all the way through to the end.
Thus my output file only contains the details of the first header.
Thanks a lot for any advice; please point out what mistakes I made.
Besides a Perl script, I would also appreciate any alternative solution such as awk or sed ^^
Thanks again
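For the awk route: one way is to split the input into one temporary file per ">" header and run the program on each piece, appending everything to output.txt. This is only a sketch: the input file name (input.fa) and the stand-in ./count_length script are assumptions, since the real binary is not named here.

```shell
#!/bin/sh
# Sketch: split input at each ">" header, run the program once per record,
# and append every result to output.txt.
printf '>seq_1\nACGTA\n>seq_2\nACG\n' > input.fa   # sample data in the post's shape

# Stand-in for the real binary: prints "<header> <sequence length>"
cat > count_length <<'EOF'
#!/bin/sh
awk '/^>/ { h = $1; next } { print h, length($0) }' "$1"
EOF
chmod +x count_length

: > output.txt                                            # start with an empty result file
awk '/^>/ { n++ } { print > ("rec_" n ".tmp") }' input.fa # one temp file per record
for rec in rec_*.tmp; do
    ./count_length "$rec" >> output.txt                   # run the program record by record
    rm -f "$rec"
done
cat output.txt   # -> ">seq_1 5" then ">seq_2 3"
```

With many records this creates many temporary files; the loop removes each one after its output has been collected.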
Thanks a lot, skmdu.
Your Perl script worked perfectly for my input file.
Thanks a lot.
Besides that, may I ask how I can edit my Perl script so that it only runs my "program_name_count_length" program to achieve the same goal?
Thanks again, skmdu ^^
Hi skmdu,
Sorry, because the program is actually a binary.
I can't edit it; I can only run it.
Thus I am trying to write a script that runs the same program on each header's data, one by one, and writes the output to one output file.
I found out that a "system call" is able to achieve this goal, but I still need a script that feeds the program the data one record at a time.
Really, thanks for your help, skmdu.
I did try one way to achieve the same goal, but it is not a good solution and not efficient.
First of all, I split the whole content into many files based on their headers. After that, I ran each of the files through the desired program and concatenated each output result into a combined file at the end.
It works as well, just not a good solution.
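An alternative that avoids all the intermediate files: let awk stream each header's record straight into the program through a pipe, closing and reopening the pipe at every new ">" header. A minimal sketch under the same assumptions (sample data, a stand-in ./count_length that reads stdin):

```shell
#!/bin/sh
# Sketch: run the program once per ">" record via a pipe, no temp files.
printf '>seq_1\nACGTA\n>seq_2\nACG\n' > input.fa   # sample data

# Stand-in for the real binary: reads one record on stdin
cat > count_length <<'EOF'
#!/bin/sh
awk '/^>/ { h = $1; next } { print h, length($0) }'
EOF
chmod +x count_length

: > output.txt
awk -v prog="./count_length >> output.txt" '
    /^>/ && NR > 1 { close(prog) }   # finish the previous record
    { print | prog }                 # stream the current line to the program
    END { close(prog) }
' input.fa
cat output.txt   # -> ">seq_1 5" then ">seq_2 3"
```

close() at each header ends one run of the program, and the next print starts a fresh one, so the binary still sees exactly one record per run.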
Gents,
I have a huge NAS file system mounted as /sys with a size of 10 TB, and I want to split it into separate 1 TB file systems to be mounted on the server.
How can I do that without changing anything in the source?
Please lend your support. (1 Reply)
Optimization of a shell/awk script to aggregate (sum) all the columns of a huge data file.
The file delimiter is "|".
I need the sum of all columns, reported with the column number: an aggregation (summation) for each column.
The file has no header.
Like below -
Column 1 : Total
Column 2 : Total
...
... (2 Replies)
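A one-pass awk sketch of the aggregation described above, assuming purely numeric "|"-delimited fields and a hypothetical file name data.txt:

```shell
#!/bin/sh
# Sketch: sum every column of a "|"-delimited, headerless file and print
# "Column <n> : <total>" for each column.
printf '1|2|3\n4|5|6\n' > data.txt   # tiny sample stand-in
awk -F'|' '
    { for (i = 1; i <= NF; i++) sum[i] += $i; if (NF > n) n = NF }
    END { for (i = 1; i <= n; i++) printf "Column %d : %s\n", i, sum[i] }
' data.txt
# -> Column 1 : 5
#    Column 2 : 7
#    Column 3 : 9
```

Tracking the maximum NF keeps the report correct even if some rows are short.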
We have a folder XYZ with a large number of files (>350,000). How can I split the folder and create, say, 10 folders XYZ1 to XYZ10 with 35,000 files each? (It doesn't matter which files go where.) (12 Replies)
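One hedged sketch: deal the files out round-robin so each target folder gets an equal share. Only the names XYZ and XYZ1..XYZ10 come from the post; the sample files are stand-ins. Because the glob is expanded by the shell itself and each mv moves one file, 350,000 names never hit the ARG_MAX limit of a single command line.

```shell
#!/bin/sh
# Sketch: distribute the files in XYZ across XYZ1..XYZ10 round-robin.
mkdir -p XYZ
touch XYZ/a XYZ/b XYZ/c XYZ/d            # tiny stand-in for the 350,000 files
i=0
for f in XYZ/*; do
    i=$(( i % 10 + 1 ))                  # cycle 1, 2, ..., 10, 1, 2, ...
    mkdir -p "XYZ$i"
    mv "$f" "XYZ$i/"
done
```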
Hi,
I have a huge 7 GB file which has around 1 million records, and I want to split this file into 4 files containing around 250k messages each.
Please help me, as the split command cannot work here: it might cut a record in the middle and miss tags.
The format of the file is as below:
<!--###### ###### START-->... (6 Replies)
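Since plain split(1) can cut a record in half, one approach is to count records in awk and start a new output file only at a record boundary. A sketch, using a simplified START marker line and a chunk size of 2 as stand-ins for the real `<!--###### ###### START-->` marker and the ~250k target:

```shell
#!/bin/sh
# Sketch: split a file into chunks of "per" records, where every record
# begins at a START marker line, so no record is ever cut in half.
printf 'START\na\nSTART\nb\nSTART\nc\nSTART\nd\n' > big.xml   # sample stand-in
awk -v per=2 '
    /^START/ { n++; if ((n - 1) % per == 0) { part++; out = "part_" part } }
    { print > out }                  # every line goes to the current chunk
' big.xml
# -> part_1 holds records 1-2, part_2 holds records 3-4
```

For the real file, change `/^START/` to match the actual marker line and set `per` to the records-per-chunk you want.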
into small files. I need to add a head.txt and a tail.txt to each small file at the beginning and end, and name them q1.xml, q2.xml, q3.xml, ...
Thank you very much. (2 Replies)
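A sketch of the head/tail wrapping step, assuming the head.txt, tail.txt, and q1.xml naming from the post, with tiny stand-in pieces in place of the real split output:

```shell
#!/bin/sh
# Sketch: wrap each split piece with head.txt and tail.txt and name the
# results q1.xml, q2.xml, ...
printf '<h/>\n' > head.txt
printf '<t/>\n' > tail.txt
printf '<a/>\n' > piece_1                 # stand-ins for the split output
printf '<b/>\n' > piece_2
i=0
for p in piece_*; do
    i=$(( i + 1 ))
    cat head.txt "$p" tail.txt > "q$i.xml"   # head + body + tail in one file
done
```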
Hi,
When I copy 300 GB of data from one filesystem to another filesystem on AIX, I get the error:
tar: 0511-825 The file 'SAPBRD.dat' is too large.
The command I used is:
# tar -cf - . | (cd /sapbackup ; tar -xf - )
I'm copying as root.
Below is my ulimit -a output:
... (3 Replies)
I'm new to Linux scripting and not sure how to filter out bad records from huge flat files (over 1.3 GB each). The delimiter is a semicolon ";".
Here is a sample of 5 lines from the file:
Name1;phone1;address1;city1;state1;zipcode1
Name2;phone2;address2;city2;state2;zipcode2;comment... (7 Replies)
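A sketch of one common filter: keep only records with exactly the expected number of ";"-separated fields and divert the rest to a reject file. The 6-field layout is inferred from the first sample line; the file names are illustrative.

```shell
#!/bin/sh
# Sketch: pass 6-field records through, send everything else to bad.dat.
printf 'Name1;p1;a1;c1;s1;z1\nName2;p2;a2;c2;s2;z2;comment\n' > input.dat
awk -F';' 'NF == 6 { print > "good.dat"; next } { print > "bad.dat" }' input.dat
```

awk streams the file, so a 1.3 GB input is fine; adjust `NF == 6` if the real layout has a different field count.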
Hello Everyone,
I have a Perl script that reads two types of data files (txt and XML). These data files are huge, and there are many of them. I am using something like this:
foreach my $t (@text)
{
    open my $fh, '<', $t or die "Cannot open $t for reading: $!\n";
    while (my $line = <$fh>) {
... (4 Replies)
Friends,
I have to write a shell script; the description is:
I have to check the uniqueness of the numbers in a file.
A file contains 200 thousand tickets, and a ticket has 15 numbers in ascending order. And there is a strip that has 6 tickets, which means 90 numbers. I... (7 Replies)
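A sketch of the duplicate check itself, assuming whitespace-separated numbers (the actual ticket layout is not shown in the post): put one number per line, sort, and let uniq -d report anything that occurs more than once.

```shell
#!/bin/sh
# Sketch: print every number that appears more than once in the file.
printf '3 7 9\n7 12 15\n' > tickets.txt                # sample stand-in data
tr -s ' \t' '\n\n' < tickets.txt | sort -n | uniq -d   # -> 7
```

An empty result means all numbers are unique; sort(1) handles files of this size without loading them into memory all at once.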
A file is containing 200thousand tickets and a ticket have 15 numbers in asecending order.And there is a strip that is having 6 tickets that means 90 numbers.I... (7 Replies)