The builtin split function in AWK is too slow
Post 302423102 by danmero, Thursday 20 May 2010, 07:21 AM
Quote:
Originally Posted by kevintse
I have a text file that contains 4 million lines; each line contains 2 fields (colon as field separator).
....
Here I have to split the second field (which can contain up to 40,000 comma-separated values) by comma into an array for analysis,
Code:
print $2 | cmd; close(cmd)} ' data.txt

.....
How can I solve this problem? Or do you have alternatives to the split function?
Calling an external command will be too slow (4,000,000 lines x 40,000 fields = 160,000,000,000 field operations).
Please answer Franklin52's previous question.
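
For reference, one way to sidestep split() entirely is to let awk's own field splitting do the work, by treating both the colon and the comma as field separators. A minimal sketch, untested against the original data and assuming the first field never contains a comma:
Code:
# ':' and ',' are both field separators, so awk itself breaks the line
# apart: $1 is the key, $2..NF are what used to be the comma-separated
# subfields of the old second field.
awk -F'[:,]' '{
    for (i = 2; i <= NF; i++) {
        # per-value analysis of $i goes here
    }
    print $1, NF - 1    # e.g. the key and its subfield count
}' data.txt

Whether this beats a single split() per line depends on the implementation; mawk is often the fastest for jobs like this.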
 

8 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

split function

Hi all! I am relatively new to UNIX stuff, and I have come across a problem: I have a big directory, which contains 100 smaller ones. Each of the 100 contains a file ending in .txt, so there are 100 files ending in .txt. I want to split each of the 100 files into smaller ones, which will contain... (4 Replies)
Discussion started by: ktsirig

2. Shell Programming and Scripting

perl split function

$mystring = "name:blk:house::"; print "$mystring\n"; @s_format = split(/:/, $mystring); for ($i=0; $i <= $#s_format; $i++) { print "index is $i, field is $s_format[$i]"; print "\n"; } $size = $#s_format + 1; print "total size of array is $size\n"; I am expecting my size to be 5, why is it... (5 Replies)
Discussion started by: new2ss
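
For the record, the likely answer to that thread: Perl's split discards trailing empty fields unless it is given a negative limit, so "name:blk:house::" yields 3 fields rather than the expected 5. A quick illustration using the string from the post:
Code:
# Default split drops trailing empty fields; a limit of -1 keeps them.
perl -e '@f = split(/:/, "name:blk:house::");     print scalar(@f), "\n";'   # prints 3
perl -e '@f = split(/:/, "name:blk:house::", -1); print scalar(@f), "\n";'   # prints 5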

3. UNIX for Dummies Questions & Answers

Split a file with no pattern -- Split, Csplit, Awk

I have gone through all the threads in the forum and tested out different things. I am trying to split a 3GB file into multiple files. Some files are even larger than this. For example: split -l 3000000 filename.txt This is very slow, and it splits the file into pieces with 3 million records in each... (10 Replies)
Discussion started by: madhunk
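
As a pointer for readers of that thread, the awk route from its title can be sketched as follows; the chunk_ prefix is a placeholder, and the 3,000,000-line chunk size is taken from the post:
Code:
# Start a new numbered output file every 3,000,000 lines.
awk 'NR % 3000000 == 1 { if (out) close(out); out = sprintf("chunk_%03d", n++) }
     { print > out }' filename.txt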

4. Shell Programming and Scripting

awk - split function

Hi, I have some output in the form of: #output: abc123 def567 hij890 ghi324 The above is in one column, stored in the variable x (and if you want to know about x... x=sprintf(tolower(substr(someArray,1,1)substr(userArray,3,1)substr(userArray,2,1))). When I simply print x (print x) I get... (7 Replies)
Discussion started by: fusionX

5. Shell Programming and Scripting

Use split function in perl

Hello, if I have a file like this: 010000890306932455804 05306977653873 0520080417010520ISMS SMT ZZZZZZZZZZZZZOC30693599000 30971360000 ZZZZZZZZZZZZZZZZZZZZ202011302942311 010000890306946317387 05306977313623 0520080417010520ISMS SMT ZZZZZZZZZZZZZOC306942190000 30971360000... (5 Replies)
Discussion started by: chriss_58

6. Homework & Coursework Questions

PERL split function

Hi... I have a question regarding the split function in Perl. I have a very large CSV file (more than 80 million records). I need to extract a particular position (e.g. the 50th field) of each line from the CSV file. I tried using the split function, but I realized split takes a very long time. Also... (1 Reply)
Discussion started by: castle
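
A note for readers with the same problem: when only one field is needed, capping split with a limit stops Perl from splitting the rest of the line, and plain awk is a reasonable alternative. A sketch, assuming a simple comma-separated layout with no quoted fields (file.csv is a placeholder name):
Code:
# Stop splitting after the 50th comma; index 49 is the 50th field.
perl -lne 'print +(split /,/, $_, 51)[49]' file.csv

# Or let awk pull the field directly.
awk -F, '{print $50}' file.csv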

7. Shell Programming and Scripting

Perl split function

my @d = split('\|', $_); west|ACH|3|Y|LuV|N||N|| Qt|UWST|57|Y|LSV|Y|Bng|N|KT| It returns @d with 8 fields for the first line and 9 for the second. I want to process both lines the same way; how do I handle it? (3 Replies)
Discussion started by: vishwakar

8. Shell Programming and Scripting

awk to split one field and print the last two fields within the split part.

Hello; I have a file that consists of 4 columns separated by tabs. The problem is the third field. Some of them are very long but can be split by the vertical bar "|". Also some of them do not contain the string "UniProt", but I can ignore that for the moment and sort the file afterwards. Here is... (5 Replies)
Discussion started by: yifangt
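
And a sketch of the awk approach asked about there, assuming tab-separated input whose third column is itself "|"-separated (file.tsv is a placeholder name):
Code:
# Split column 3 on "|" and print its last two pieces.
awk -F'\t' -v OFS='\t' '{
    n = split($3, p, "|")
    if (n >= 2) print p[n - 1], p[n]
}' file.tsv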