Execution problems with loading huge data content and converting it
Hi,
I have a long list of content in a referred file:
Input file content:
Desired output:
My main purpose is to convert any content in the input file that matches the content in the referred file into "XXXXXXXXXXXXXXX" and write the result to an output file.
Thanks for any advice or suggestion.
But what is the problem? Do you want us to give you a solution, or do you have a solution with issues?
I would suggest solving this with perl, if you already know perl.
Hi thegeek,
I tried using the sed command,
but it seems that sed does not work well when dealing with huge data.
Do you have a better suggestion, perhaps a perl script, to achieve my goal?
Thanks for any advice.
Thanks for your suggestion. It worked.
But it seems it does not work if my input sequence content is like:
It seems I need to remove the newlines (\n) in the content before starting the conversion, am I right?
The sed command that I tried previously worked as well.
It just takes a long time when loading and converting huge data.
Thanks again for the advice.
Quote:
Originally Posted by rdcwayx
So what's the problem with sed?
---------- Post updated at 06:43 AM ---------- Previous update was at 05:47 AM ----------
Hi rdcwayx,
The command that you suggested will output the result into the same file (the input file), am I right?
Besides that, if my referred file contains around 124464 reads (each read around 22 bases long) and the input file contains around 115478631 bases, do you have any other suggestion to speed up the process?
Thanks
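The masking task described above can be sketched in awk. This is only a sketch: the file names reads.txt, input.txt and output.txt are placeholders, and it assumes the referred file holds one read per line containing only plain sequence characters (no regex metacharacters).

```shell
# Placeholder sample data -- replace with the real referred file and input file.
printf 'ACGT\n' > reads.txt
printf 'AAACGTTT\nTTACGTAA\n' > input.txt

# First pass (NR==FNR) loads the reads into a set; the second pass
# masks every occurrence of each read with "XXXXXXXXXXXXXXX".
awk 'NR==FNR { reads[$0]; next }
     { for (r in reads) gsub(r, "XXXXXXXXXXXXXXX"); print }' reads.txt input.txt > output.txt
```

With 124464 reads, this inner loop runs once per read for every input line, so it will still be slow on 115 million bases; combining the reads into a few large alternation patterns, or using a tool built for fixed-string matching, would cut the runtime considerably.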
Hi Friends,
I have a file with sample amount data as follows:
-89990.3456
8788798.990000128
55109787.20
-12455558989.90876
I need to exclude the '-' symbol in order to treat all values as absolute ones, and then I need to sum them up. The record count is around 1 million.
How... (8 Replies)
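Summing the absolute values can be sketched in a single awk pass (amounts.txt is a placeholder name; one value per line is assumed):

```shell
# Placeholder sample data matching the values quoted above.
printf -- '-89990.3456\n8788798.990000128\n55109787.20\n-12455558989.90876\n' > amounts.txt

# Negate negative values numerically instead of stripping '-' as text,
# then accumulate and print the total.
awk '{ sum += ($1 < 0 ? -$1 : $1) } END { printf "%.4f\n", sum }' amounts.txt
```

awk streams the file, so a million records is not a problem; note that awk uses doubles, which carry about 15 significant digits, so very long decimals will be rounded.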
I have a huge list of files (about 300,000) which have a pattern like this.
.I 1
.U
87049087
.S
Am J Emerg
.M
Allied Health Personnel/*; Electric Countershock/*;
.T
Refibrillation managed by EMT-Ds:
.P
ARTICLE.
.W
Some patients converted from ventricular fibrillation to organized... (1 Reply)
I have a UNIX variable with content as below:
WHOLE_REC_TXT="$record"
where $record contains the contents of a file
Sample contents of file:
topic_id|1624|AIDS-HIV||
topic_id|1625|Allergies||
topic_id|1626|Alzheimer s||
topic_id|1627|Knee Pain||
topic_id|1628|Autism||... (2 Replies)
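Loading a file into a variable the way described above can be sketched like this (topics.txt is a placeholder file name standing in for the real record source):

```shell
# Placeholder file standing in for the real record source.
printf 'topic_id|1624|AIDS-HIV||\ntopic_id|1625|Allergies||\n' > topics.txt

# $(cat file) drops the trailing newline; double quotes keep inner newlines.
record=$(cat topics.txt)
WHOLE_REC_TXT="$record"

# Example use: pull the topic name (third '|'-delimited field) from each line.
printf '%s\n' "$WHOLE_REC_TXT" | awk -F'|' '{ print $3 }'
```

For large files a variable is fragile (memory and argument-length limits apply); piping the file straight into awk or sed usually works better.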
I have a huge XML file on a server and I want to convert it to .csv with specific columns.
I have searched blogs but I didn't find any useful command.
Thanks in advance (1 Reply)
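XML is really a job for a proper parser, but when each record sits on its own line with a fixed tag layout, a sed sketch is possible. Everything below is made up for illustration: the tag names (row, id, name) and file names are hypothetical, not taken from the poster's file.

```shell
# Hypothetical input: one <row> per line with a fixed, known shape.
cat > data.xml <<'EOF'
<row><id>1</id><name>alpha</name><price>9.99</price></row>
<row><id>2</id><name>beta</name><price>4.50</price></row>
EOF

# Capture the two wanted columns (id, name) and emit them as CSV.
sed -n 's|.*<id>\(.*\)</id><name>\(.*\)</name>.*|\1,\2|p' data.xml > data.csv
cat data.csv
```

For real-world XML (records spanning lines, attributes, entity escaping), a dedicated tool such as xmlstarlet, or an XML parser in perl or python, is far safer than sed.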
I'll try to keep this short, but basically I need to figure out a way to load data into shared memory (this file will be called load.c). I will later access the data with a print.c program.
The "data" is in the form of a student database that looks like this
John Blakeman
111223333
560... (7 Replies)
I am studying a script which is used for data loading.
It has functions which delete all the existing data before loading and then load the fresh data,
but I am stuck at the function Replace into table ( col 1,col 2....)
Does this signify all inserts? (1 Reply)
I am having problems with MySQL authentication using courier-authlib (authdaemond). This is getting really frustrating. The error I am getting is:
Aug 28 17:48:48 www authdaemond: modules="authmysql", daemons=5
Aug 28 17:48:48 www authdaemond: Installing libauthmysql
Aug 28 17:48:48 www... (0 Replies)