04-28-2012
You can split the files that you need to process into a number of batches and run your code concurrently on each of them.
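A minimal sketch of that idea, assuming each batch can be processed independently (here `wc -l` stands in for the real per-batch work, and `split -l 25` is an example batch size):

```shell
# Split the input into line-based batches and process each batch as a
# background job; wc -l is a stand-in for the real processing step.
seq 1 100 > bigfile.txt            # sample input
split -l 25 bigfile.txt batch_     # creates batch_aa .. batch_ad
for f in batch_a?; do
    wc -l < "$f" > "$f.out" &      # one concurrent job per batch
done
wait                               # block until every batch finishes
cat batch_a?.out                   # collect the per-batch results
```

`wait` with no arguments blocks until all background jobs have exited, so the merge step never sees a half-written batch result.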
9 More Discussions You Might Find Interesting
1. Programming
hi all,
can anyone tell me some good sites for multithreading tutorials, their applications, and some code examples.
-sushil (2 Replies)
Discussion started by: shushilmore
2. UNIX for Dummies Questions & Answers
i) wc -c /etc/passwd | awk '{print $1}'
ii) ls -al /etc/passwd | awk '{print $5}' (4 Replies)
Discussion started by: karthi_g
3. Programming
hello,
I have written a multi-threaded application to run under uClinux.
The problem is that the threads do not share data; the ps command shows a separate process for each thread.
I tested the application under Ubuntu 8.04 and openSUSE 10.3 with a 2.6 kernel and there were no problems, and also... (8 Replies)
Discussion started by: mrhosseini
4. Shell Programming and Scripting
I have a Unix directory where about a million small text files accumulate every week.
As of now there is a shell batch program in place which merges all the files in this directory into a single file and FTPs it to another system.
Previously the volume of files was around 1 lakh... (2 Replies)
Discussion started by: vk39221
5. Shell Programming and Scripting
awk "/May 23, 2012 /,0" /var/tmp/datafile
The above command pulls information from the datafile, from the specified date to the end of the file.
Now, how can I make this faster if the datafile is huge? Even if it wasn't huge, I feel there's a better/faster way to... (8 Replies)
Discussion started by: SkySmart
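One possible approach for that question (a sketch, not a definitive answer): locate the first matching line once with `grep -m1`, then let `tail` stream the rest of the file, so the date regex is not tested against every subsequent line. The path `/var/tmp/datafile` is the one used in the question:

```shell
# Find the line number of the first match, then print from there to EOF.
start=$(grep -n -m1 'May 23, 2012 ' /var/tmp/datafile | cut -d: -f1)
[ -n "$start" ] && tail -n "+$start" /var/tmp/datafile
```

Whether this is actually faster than the awk range pattern depends on the data; `grep`'s fixed-pattern matching is usually cheaper per line than awk's regex engine, but both still read the whole file once.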
6. Shell Programming and Scripting
Hi,
I have a large number of input files with two columns of numbers.
For example:
83 1453
99 3255
99 8482
99 7372
83 175
I only wish to retain lines where the numbers fulfil two requirements. E.g.:
$1 == 83
1000 <= $2 <= 2000
To do this I use the following... (10 Replies)
Discussion started by: s052866
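Reading those two requirements as "the first column equals 83 and the second falls in the inclusive range 1000 to 2000" (my interpretation of the example values), a one-line awk filter covers it:

```shell
# Keep only lines whose first field is 83 and whose second field lies
# in the inclusive range 1000..2000 (example thresholds from the post).
awk '$1 == 83 && $2 >= 1000 && $2 <= 2000' input.txt
```

With the sample data above, only the line `83 1453` would survive: `99 ...` lines fail the first test, and `83 175` fails the range test.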
7. Shell Programming and Scripting
Hi,
I have a problem where I need to make this input:
nameRow1a,text1a,text2a,floatValue1a,FloatValue2a,...,floatValue140a
nameRow1b,text1b,text2b,floatValue1b,FloatValue2b,...,floatValue140b
look like this output:
nameRow1a,text1b,text2a,(floatValue1a - floatValue1b),(floatValue2a -... (4 Replies)
Discussion started by: nricardo
8. Shell Programming and Scripting
I have the below command, which refers to a large file and takes 3 hours to run. Can something be done to make this command faster?
awk -F ',' '{OFS=","}{ if ($13 == "9999") print $1,$2,$3,$4,$5,$6,$7,$8,$9,$10,$11,$12 }' ${NLAP_TEMP}/hist1.out | sort -T ${NLAP_TEMP} | uniq >... (13 Replies)
Discussion started by: Peu Mukherjee
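One hedged idea for that pipeline: if the final output does not need to be sorted, the external `sort | uniq` passes can be dropped entirely by deduplicating inside awk with an associative array (`!seen[line]++` is true only the first time a given line occurs). Field positions follow the question's command:

```shell
# Print fields 1-12 of rows whose 13th field is "9999", emitting each
# distinct record only once and preserving input order.
awk -F',' '
BEGIN { OFS = "," }
$13 == "9999" {
    line = $1
    for (i = 2; i <= 12; i++) line = line OFS $i
    if (!seen[line]++) print line    # first occurrence only
}' hist1.out
```

This trades the sort's disk I/O for memory proportional to the number of distinct records, which is usually the better deal when most rows are duplicates.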
9. Shell Programming and Scripting
I have nginx web server logs with all requests that were made and I'm filtering them by date and time.
Each line has the following structure:
127.0.0.1 - xyz.com GET 123.ts HTTP/1.1 (200) 0.000 s 3182 CoreMedia/1.0.0.15F79 (iPhone; U; CPU OS 11_4 like Mac OS X; pt_br)
These text files are... (21 Replies)
Discussion started by: brenoasrm
LEARN ABOUT DEBIAN
uucpsend.ctl
UUCPSEND.CTL(5) Administration UUCPSEND.CTL(5)
NAME
uucpsend.ctl - list of sites to feed via uucpsend
DESCRIPTION
The file /etc/news/uucpsend.ctl specifies the default list of sites to be fed by uucpsend(8). The program is able to read site information
from other related configuration files as well.
Comments begin with a hash mark (``#'') and continue through the end of the line. Blank lines and comments are ignored. All other lines
should consist of six fields separated by colons. Each line looks like
site:max_size:queue_size:header:compressor:args
The first field site is the name of the site as specified in the newsfeeds(5) file. This is also the name of the UUCP system connected to
this site.
The second field max_size describes the maximum size in kbytes of all batches that may be sent to this site. If this amount is
reached, the site will not be batched in this run and a reason will be logged to the logfile. This test includes all UUCP jobs, not
only the ones sent to rnews (it is performed with ``du -s'').
The third field queue_size specifies the maximum size in kbytes of one batch. This argument is passed directly to batcher(8).
The fourth field header defines the text that shall appear in the command header of every batch file. `#! ' is prefixed to each batch. Normally
you'll need cunbatch for compress, or gunbatch or zunbatch for gzip. This header is important since there is no standard way to handle
gzip'ed batches. Using this and the next argument you're also able to use any compressor you like, so uucpsend gives you a certain amount of
flexibility. If you don't want any compression, leave the field empty.
The fifth field compressor names a program that reads from stdin and writes to stdout. Normally it modifies the input stream by compressing
it, such as compress(1) or gzip(1).
The sixth field args consists of additional arguments that are passed directly to uux when sending the batch.
One entry in the main configuration file is mandatory. There must exist a line containing the default values for all these variables. To
achieve this the pseudo site /default/ is used.
One default entry could look like this:
/default/:2000:200:cunbatch:compress:-r -n
This reflects a minimal setup. The maximum size that may be used by the UUCP spool directory is 2 MB. Each batch will be at most 200 kbytes
in size. The header of each batch will contain the string `cunbatch' and compress(1) is used to compress the batches. `-r -n' is passed to
uux(1), which means no notification will be sent if uux was successful and uux won't start the uucico(8) program when spooling the file.
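Putting the six fields together, a small /etc/news/uucpsend.ctl might look like this. Only the /default/ line is required; the site name `remote' and its limits are made-up illustrative values:

```
# site:max_size:queue_size:header:compressor:args
/default/:2000:200:cunbatch:compress:-r -n
# hypothetical site overriding the defaults: gzip'ed batches, larger limits
remote:5000:500:zunbatch:gzip:-r -n
```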
HISTORY
Written by Martin Schulze <joey@infodrom.org> for InterNetNews. Most of the work is derived from nncpsend.ctl(5) by Landon Curt Noll
<chongo@toad.com> for InterNetNews.
SEE ALSO
batcher(8), newsfeeds(5), uucpsend(8), uux(1).
Infodrom 21 November 2001 UUCPSEND.CTL(5)