If the input files are all on the same physical disk/logical volume, you are probably I/O bound no matter what you do. Most of the time spent, in any event, is disk I/O wait time - especially if you are reading and writing to the same logical volume/physical disk.
If you can find a disk/logical volume that is not busy, then put your destination file there.
Otherwise, cat is pretty efficient - on HP-UX 11i -
Most of the six seconds here is probably I/O wait time. The system was not busy at the time. /dev/null does not write to disk, so this measures read wait time. This is for a 120MB file. My version of cat calls read() with an 8192-byte buffer.
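A measurement like the one described can be reproduced by timing cat into /dev/null; the file name below is just a stand-in created for the demonstration, not the 120MB file from the post:

```shell
# Create a 2 MB scratch file, then time reading it through cat.
# /dev/null discards the data, so nothing is written to disk and the
# elapsed time is essentially read wait (plus cache effects on reruns).
dd if=/dev/zero of=bigfile.tmp bs=1024 count=2048 2>/dev/null
time cat bigfile.tmp > /dev/null
```

On a second run the file is usually in the buffer cache, so the real and sys times drop sharply; comparing the two runs separates disk wait from cat's own overhead.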
Hi,
I have a file 1n.txt -
cat 1n.txt gives
get_next_sp_fixing_mod_ref_no,
get_next_sp_tran_ref_no,
cat 2n.txt -
boxer1.cpp
boxer2.cpp
I want a file resn.txt which has -
get_next_sp_fixing_mod_ref_no,boxer1.cpp
get_next_sp_tran_ref_no,boxer2.cpp
How can I do that? Its... (3 Replies)
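Since each line of 1n.txt already ends in a comma, joining the two files line by line with paste and an empty delimiter gives the requested result. A sketch, recreating the inputs from the post:

```shell
# Recreate the two input files shown in the post.
printf '%s\n' 'get_next_sp_fixing_mod_ref_no,' 'get_next_sp_tran_ref_no,' > 1n.txt
printf '%s\n' 'boxer1.cpp' 'boxer2.cpp' > 2n.txt

# paste -d '\0' glues line N of 1n.txt to line N of 2n.txt with no
# separator ('\0' means "empty delimiter" per POSIX, not a NUL byte);
# the trailing comma already present in 1n.txt supplies the separator.
paste -d '\0' 1n.txt 2n.txt > resn.txt
```

resn.txt then contains `get_next_sp_fixing_mod_ref_no,boxer1.cpp` and `get_next_sp_tran_ref_no,boxer2.cpp`, as requested.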
I have the following snippet to concatenate about a hundred csv-files:
for file in *csv; do cat $file >> newfile; done
This line works, but before that I was experimenting with the following line, which is more intuitive and a tad more robust:
for file in *.csv; do cat $file >> newfile; done
Can... (2 Replies)
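For what it's worth, the loop is not needed at all: the shell expands the glob in sorted order and cat concatenates all of its operands in one call. A sketch with small stand-in files (and assuming the output file does not itself end in .csv):

```shell
# Two tiny stand-in CSVs for the hundred files in the post.
printf 'a\n' > a.csv
printf 'b\n' > b.csv

# One cat call instead of one per file; the glob expands in sorted
# order, so the result matches the loop's order. Writing with > (not >>)
# starts from an empty output file on every run.
cat ./*.csv > newfile
```

Using `>` rather than `>>` also avoids the classic pitfall of the loop version, where rerunning it appends everything a second time.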
I need to concatenate files of the same type but with different names that each contain a header. Before concatenating, it is mandatory to remove the header from the files,
and after concatenation the output file should contain the header too.
How to do it...
thanks in advance. (4 Replies)
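A common approach is to take the header once from the first file and append only the body of every file, using tail -n +2 to skip line 1. A sketch with hypothetical file names:

```shell
# Two sample CSVs that share a header line (names are hypothetical).
printf 'id,name\n1,alpha\n' > part1.csv
printf 'id,name\n2,beta\n'  > part2.csv

# Emit the header once, then append every file's body; tail -n +2
# prints from line 2 onward, i.e. everything except the header.
head -n 1 part1.csv > combined.csv
for f in part1.csv part2.csv; do
  tail -n +2 "$f" >> combined.csv
done
```

The originals are left untouched; only the combined output has a single header at the top.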
Hello, I have a large number of files that I want to concatenate into one. These files start with the word 'VOICE_', for example
VOICE_0000000000
VOICE_1223o23u0
VOICE_934934927349
I use the following code:
cat /ODS/prepaid/CDR_FLOW/MEDIATION/VOICE_* >> /ODS/prepaid/CDR_FLOW/WORK/VOICE
... (10 Replies)
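If the VOICE_* expansion ever grows past the kernel's argument-length limit, the cat line above fails with "Argument list too long". Feeding the names through find and xargs avoids that; a sketch with short relative paths standing in for the /ODS/... directories (and assuming no spaces or newlines in the file names):

```shell
# Small stand-in directories and files for the real CDR paths.
mkdir -p med work
printf '1\n' > med/VOICE_0001
printf '2\n' > med/VOICE_0002

# find emits the names one per line and xargs batches them into as many
# cat invocations as needed, so no single command line gets too long.
# sort keeps the concatenation order deterministic across runs.
find med -name 'VOICE_*' -type f | sort | xargs cat >> work/VOICE
```

With file names containing whitespace, the NUL-separated variant `find ... -print0 | sort -z | xargs -0 cat` is the safe form where those GNU options are available.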
Hi All,
I have some 80,000 files in a directory which I need to rename. Below is the command I am currently running, and it seems to be taking forever. This command seems too slow. Is there any way to speed it up? I have GNU Parallel installed on my... (6 Replies)
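With that many files the cost is usually process startup per file rather than the rename itself. The actual pattern in the post is cut off, so the sketch below uses a purely hypothetical rename (stripping a .tmp suffix); the point is that `${f%.tmp}` is shell expansion, so the only process spawned per file is mv:

```shell
# Hypothetical rename: drop a ".tmp" suffix (the real pattern from the
# thread is not shown). The suffix strip happens in the shell itself,
# so each iteration forks exactly one process: mv.
mkdir -p renametest && cd renametest
touch a.tmp b.tmp
for f in *.tmp; do
  mv -- "$f" "${f%.tmp}"
done
cd ..
```

If a plain loop is still too slow, GNU Parallel can fan the same mv calls out across cores, but eliminating extra subshells and command substitutions from each iteration is usually the bigger win.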
Good evening
I'm new at unix shell scripting and I'm planning to write a shell script that removes headers from about 120 files in a directory; each file contains about 200000 lines on average.
I know I will loop over the files to process each one, and I've found in this great forum different solutions... (5 Replies)
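The per-file step can be a single tail through a temp file, which rewrites each file without its first line. A sketch with hypothetical file names in place of the real 120:

```shell
# Sample files with a one-line header each (names are hypothetical).
printf 'HEADER\ndata1\n' > f1.dat
printf 'HEADER\ndata2\n' > f2.dat

# For each file: write everything from line 2 onward to a temp file,
# then move the temp file over the original (an in-place header strip).
# The && ensures the original is only replaced if tail succeeded.
for f in f1.dat f2.dat; do
  tail -n +2 "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done
```

In the real script the list would come from a glob such as `for f in *.dat`; at 200000 lines per file, tail streams each file once, so the whole batch is one read and one write per file.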
Hi All!
I am trying to copy files from a SCO OpenServer 5.0.6 box to a NAS server using NFS. I have a cron job that takes 1.5 hours to run, and most of the data is static. I would like to find a faster way, in the event I need to run it manually or avoid an issue with taking down the servers... (9 Replies)
Hi,
I have the following reports that get generated every 1 hour and this is my requirement:
1. 5 reports get generated every hour with the names "Report.Dddmmyy.Thhmiss.CTLR"
"Report.Dddmmyy.Thhmiss.ACCD"
"Report.Dddmmyy.Thhmiss.BCCD"
"Report.Dddmmyy.Thhmiss.CCCD"... (1 Reply)
Since my last threads were closed on account of spamming, keeping just this one open!
Hi,
I have the following reports that get generated every 1 hour and this is my requirement:
1. 5 reports get generated every hour with the names
"Report.Dddmmyy.Thhmiss.CTLR"... (5 Replies)
I have a very big input file <inputFile1.txt> which has a list of mobile numbers
inputFile1.txt
3434343
3434323
0970978
85233
... around 1 million records
I have another big file, inputFile2.txt, which has some log details
inputFile2.txt
afjhjdhfkjdhfkd df h8983 3434343 | 3483 | myout1 |... (3 Replies)
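With around a million numbers, looping over them and grepping once per number would scan the log a million times. grep -F -f loads all the patterns once and scans the log in a single pass; a sketch using tiny stand-ins for the two files:

```shell
# Tiny stand-ins for the two input files described in the post.
printf '3434343\n0970978\n' > inputFile1.txt
printf 'x 3434343 | 3483 | myout1 |\ny 1111111 | 9999 | other |\n' > inputFile2.txt

# -F treats each pattern as a fixed string, not a regex (faster, and no
# surprises from metacharacters); -f reads the patterns from a file.
# Every log line containing any of the numbers is written out.
grep -F -f inputFile1.txt inputFile2.txt > matched.txt
```

One caveat: a short number like 85233 will also match inside longer numbers; `grep -Fw` restricts matches to whole words where that matters.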