Hello,
I have one file with more than 120 million records (35 GB in size). I have to extract some relevant data from the file based on some parameters and generate another output file.
What will be the best and fastest way to extract the new file?
sample file format :--... (2 Replies)
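Since the sample format and the filter parameters are cut off above, here is only a minimal sketch, assuming comma-separated input and a filter of the form "keep rows whose third field equals some value" — both are assumptions. The point of the sketch is that awk streams the file once, so memory use stays flat even at 35 GB.

```shell
# Hedged sketch: field separator and the $3 == "TARGET" condition are
# assumptions standing in for the real format and parameters.
awk -F',' '$3 == "TARGET"' huge.csv > extract.csv
```

A single streaming pass like this is usually faster than loading the file into any tool that buffers it whole.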
I have a file containing date/time sorted data of the form
...
2009/06/10,20:59:59.950,XAG/USD,Q,1,1115, 14.3025,100,1,1
2009/06/10,20:59:59.950,XAG/USD,Q,1,1116, 14.3026,125,1,1
2009/06/10,20:59:59.950,XAG/USD,R,0,0, , 0,0,0
2009/06/10,20:59:59.950,XAG/USD,R,1,0, 14.1910,100,1,1... (6 Replies)
Hi All,
I am trying to extract data from a large text file. I want to extract lines which contain a five-digit number followed by a hyphen, like
12345- . I tried with egrep, e.g.: egrep "+" text.txt
but that returns all the lines which contain any number of digits followed by a hyphen, ... (19 Replies)
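One way to pin the match to exactly five digits is to anchor the run on its left side. A sketch (the filename `text.txt` is from the post; the pattern is an assumption about what "five digits followed by a hyphen" should mean):

```shell
# (^|[^0-9]) requires start-of-line or a non-digit before the five digits,
# so longer runs such as 123456- no longer match.
grep -E '(^|[^0-9])[0-9]{5}-' text.txt
```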
I have a script with this statement:
/usr/xpg4/bin/awk -F"" 'NR==FNR{s=$2;next}{printf "%s\"%s\"\n", $0, s}' LOOKUP.TXT finallistnew.txt >test.txt
I want to include logic or an additional step that says: if there is no data in field 3, move the whole line out of test.txt into an additional... (9 Replies)
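The routing step can be done in a separate awk pass. A sketch, assuming comma-separated fields (the delimiter in the script above is unclear) and hypothetical output names `rejects.txt` and `test_clean.txt`:

```shell
# Lines with an empty third field go to rejects.txt; all others are kept.
awk -F',' '$3 == "" { print > "rejects.txt"; next } { print > "test_clean.txt" }' test.txt
```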
I have test data with 10 columns separated by commas, and each column has more than 1,000,000 rows. Can anyone help me find the empty fields in all the columns, delete each empty field, and lift that column up by one row?
Data with empty field:
A74203XYZ,A21718XYZ,A72011XYZ,A41095XYZ,... (7 Replies)
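The "lift the column up" part means each column must be compacted independently, which needs the whole table in memory. A sketch, assuming comma-separated data (fine for 10 columns, but not for arbitrarily wide files):

```shell
# Collect only non-empty cells per column, then reprint rows top-aligned,
# so every empty cell is closed up by the values below it.
awk -F',' -v OFS=',' '
{
    if (NF > nc) nc = NF
    for (i = 1; i <= NF; i++)
        if ($i != "") col[i, ++cnt[i]] = $i   # keep non-empty cells only
}
END {
    max = 0
    for (i = 1; i <= nc; i++) if (cnt[i] > max) max = cnt[i]
    for (r = 1; r <= max; r++) {              # rebuild rows, columns shifted up
        line = col[1, r]
        for (i = 2; i <= nc; i++) line = line OFS col[i, r]
        print line
    }
}' data.csv
```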
So I want to put a line at the end of my script (which greps for keywords from syslog.log) that outputs the following after it is done:
"This file was last modified on (thisdate)"
I know I can use the following to get the date:
rtidsvb(izivanov):/home/izivanov> ll /var/adm/syslog/syslog.log ... (4 Replies)
I am trying to update an older program on a small cluster. It uses individual files to send jobs to each node. However, the newer database comes as one large file containing over 10,000 records. I therefore need to split this file. It looks like this:
HMMER3/b
NAME 1-cysPrx_C
ACC ... (2 Replies)
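A sketch of the split, assuming every record starts with a `HMMER3/b` header line as in the excerpt (HMMER profile databases also terminate each record with `//`, which this relies on implicitly): each header opens a new numbered output file.

```shell
# Start a fresh output file at every HMMER3 header; close the previous one
# so we never run out of file descriptors on 10,000+ records.
awk '/^HMMER3/ { if (out) close(out); out = sprintf("record_%05d.hmm", ++n) } { print > out }' database.hmm
```

`database.hmm` and the `record_NNNNN.hmm` naming are placeholders; the NAME line could be used for file names instead if the node jobs need them.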
Hi all,
I want to remove the empty fields in a text file. I tried to use sed, but it failed.
Input:
LG10_PM_map_19_LEnd 1000560 G AG AG
LG10_PM_map_19_LEnd 1005621 G AG
LG10_PM_map_19_LEnd 1011214 A AG AG
LG10_PM_map_19_LEnd 1011673 T CT CT ... (3 Replies)
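If the columns are tab-separated (an assumption — the pasted input renders the delimiter as spaces), an empty field is just two adjacent tabs, and squeezing them removes it, pulling the later fields left:

```shell
# tr -s collapses runs of tabs into a single tab, deleting empty fields
tr -s '\t' < input.txt > output.txt
```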
Dear all,
I want to extract around 300 columns from a very large file with almost 2 million columns. There are no headers, but I can find out which column numbers I want. I know I can extract, for example, just the second column with 'cut -f2', but how do I do this for such a large... (1 Reply)
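cut takes a comma-separated field list, so the ~300 column numbers do not have to be typed by hand. A sketch, assuming tab-delimited data and a hypothetical `cols.txt` that lists the wanted column numbers one per line:

```shell
# paste -sd, joins the numbers into the "2,17,305,..." list cut expects
cut -f"$(paste -sd, cols.txt)" bigfile.txt > subset.txt
```

cut streams the file, so the 2-million-column width only costs time per line, not memory.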
Hi All!!
I have a large file containing millions of records. My purpose is to extract 8 characters immediately from the given file.
222222222|ZRF|2008.pdf|2008|01/29/2009|001|B|C|C
222222222|ZRF|2009.pdf|2009|01/29/2010|001|B|C|C
222222222|ZRF|2010.pdf|2010|01/29/2011|001|B|C|C... (5 Replies)
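If "8 characters" means the first eight of every line (an assumption — the post does not say which eight), a character-range cut handles millions of records in one pass:

```shell
# -c1-8 emits characters 1 through 8 of each line
cut -c1-8 input.txt
```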
Discussion started by: pavand
LEARN ABOUT DEBIAN
otfdump
OTFDUMP(1) User Commands OTFDUMP(1)
NAME
otfdump - otfdump
DESCRIPTION
otfdump - convert OTF traces, or parts of them, into a human-readable, long version
Options:
-h, --help
show this help message
-V show OTF version
-f <n> set max number of filehandles available (default: 50)
-o <file>
output file; if the output file is unspecified, stdout will be used
--num <a> <b>
output only records no. [a,b]
--time <a> <b>
output only records with time stamp in [a,b]
--nodef
omit definition records
--noevent
omit event records
--nostat
omit statistic records
--nosnap
omit snapshot records
--nomarker
omit marker records
--nokeyvalue
omit key-value pairs
--fullkeyvalue
show key-value pairs including the contents of byte-arrays
--procs <a>
show only processes <a>; <a> is a space-separated list of process-tokens
--records <a>
show only records <a>; <a> is a space-separated list of record-type-numbers; record-type-numbers can be found in OTF_Definitions.h
(OTF_*_RECORD)
-s, --silent
do not display anything except the time otfdump needed to read the tracefile
otfdump 1.10.2 May 2012 OTFDUMP(1)