Shell Programming and Scripting: Multi thread awk command for faster performance
Post 302632087 by chetan.c on Sunday 29th of April 2012, 04:14:16 AM
Hi Bartus,

Do you mean that if there are, say, 100 files, each call should take 50 of them and the two calls should run concurrently?

Like this?
Code:

function first {
    # Process every .txt file in the given folder: print the file name, then the
    # tag-to-closing-tag range of the file joined onto a single line.
    cd "$1" || return
    for i in *.txt
    do
        echo "$i"
        awk '/<.*/ , /.*<\/.*>/' "$i" | tr -d '\n'
        echo -ne '\n'
    done
}

first /home/Folder1 &
first /home/Folder2 &
wait

Please let me know if I'm wrong. Also, is multithreading an option here?

Thanks,
Chetan
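
On the multithreading question: awk itself is single-threaded, so the practical route is process-level parallelism, either with background jobs as in the function above or by fanning the files out with xargs. Below is a rough sketch along those lines, assuming GNU find/xargs; the folder names and the 4-way parallelism are only placeholders.

Code:
# Run up to 4 awk processes at a time, one file per process: print the file path,
# then the tag range of that file flattened onto one line.
find /home/Folder1 /home/Folder2 -maxdepth 1 -name '*.txt' -print0 |
  xargs -0 -n 1 -P 4 sh -c '
    echo "$1"
    awk "/<.*/ , /.*<\/.*>/" "$1" | tr -d "\n"
    printf "\n"
  ' _

Note that output from concurrent jobs can interleave, so if ordering matters it is safer to redirect each job to its own temporary file and concatenate afterwards.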
 

9 More Discussions You Might Find Interesting

1. Programming

Multi threading using posix thread library

Hi all, can anyone point me to a good site for multithreading tutorials, their applications, and some code examples? -sushil (2 Replies)
Discussion started by: shushilmore
2 Replies

2. UNIX for Dummies Questions & Answers

Which command will be faster? Why?

i) wc -c /etc/passwd | awk '{print $1}'   ii) ls -al /etc/passwd | awk '{print $5}' (4 Replies)
Discussion started by: karthi_g
4 Replies
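
A quick way to settle the question above is simply to time both pipelines over many repetitions; a rough sketch (the 1000-iteration count is arbitrary):

Code:
# Repeat each pipeline many times so the per-process startup cost becomes measurable.
time for i in $(seq 1000); do wc -c /etc/passwd | awk '{print $1}' > /dev/null; done
time for i in $(seq 1000); do ls -al /etc/passwd | awk '{print $5}' > /dev/null; done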

3. Programming

Multi thread data sharing problem in uclinux

Hello, I have written a multi-threaded application to run under uClinux. The problem is that the threads do not share data. The ps command shows a separate process for each thread. I tested the application under Ubuntu 8.04 and openSUSE 10.3 with the 2.6 kernel and there were no problems, and also... (8 Replies)
Discussion started by: mrhosseini
8 Replies

4. Shell Programming and Scripting

Multi thread shell programming

I have a Unix directory where around a million small text files accumulate every week. As of now there is a shell batch program in place which merges all the files in this directory into a single file and FTPs it to another system. Previously the volume of files was around 1 lakh (100,000)... (2 Replies)
Discussion started by: vk39221
2 Replies

5. Shell Programming and Scripting

Faster way to use this awk command

awk "/May 23, 2012 /,0" /var/tmp/datafile the above command pulls out information in the datafile. the information it pulls is from the date specified to the end of the file. now, how can i make this faster if the datafile is huge? even if it wasn't huge, i feel there's a better/faster way to... (8 Replies)
Discussion started by: SkySmart
8 Replies
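
For the "from a date to end of file" case above, one possible speed-up is to let grep stop at the first match and hand the rest to tail, so awk never runs the range pattern over the whole file. A sketch, assuming a grep that supports -m:

Code:
# Find the line number of the first occurrence of the date, then stream from there to EOF.
file=/var/tmp/datafile
start=$(grep -n -m1 'May 23, 2012 ' "$file" | cut -d: -f1)
[ -n "$start" ] && tail -n +"$start" "$file"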

6. Shell Programming and Scripting

Making a faster alternative to a slow awk command

Hi, I have a large number of input files with two columns of numbers, for example: 83 1453, 99 3255, 99 8482, 99 7372, 83 175. I only wish to retain lines where the numbers fulfil two requirements, e.g. the first column equal to 83 and the second column between 1000 and 2000. To do this I use the following... (10 Replies)
Discussion started by: s052866
10 Replies
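
Assuming the requirement above is "first column equal to 83 and second column between 1000 and 2000", a single awk filter covers it (infile.txt is a placeholder name):

Code:
# Print only lines whose first field is 83 and whose second field lies in [1000, 2000].
awk '$1 == 83 && $2 >= 1000 && $2 <= 2000' infile.txt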

7. Shell Programming and Scripting

How to subtract selective values in a multi-row, multi-column file (using awk or sed?)

Hi, I have a problem where I need to make this input: nameRow1a,text1a,text2a,floatValue1a,FloatValue2a,...,floatValue140a nameRow1b,text1b,text2b,floatValue1b,FloatValue2b,...,floatValue140b look like this output: nameRow1a,text1b,text2a,(floatValue1a - floatValue1b),(floatValue2a -... (4 Replies)
Discussion started by: nricardo
4 Replies

8. Shell Programming and Scripting

How to make awk command faster?

I have the below command which reads a large file and takes 3 hours to run. Can something be done to make this command faster? awk -F ',' '{OFS=","}{ if ($13 == "9999") print $1,$2,$3,$4,$5,$6,$7,$8,$9,$10,$11,$12 }' ${NLAP_TEMP}/hist1.out | sort -T ${NLAP_TEMP} | uniq >... (13 Replies)
Discussion started by: Peu Mukherjee
13 Replies
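
If the final output does not have to be sorted, one common speed-up for the command above is to deduplicate inside awk with an associative array and drop the external sort | uniq stage entirely; a sketch using the same file and fields:

Code:
# Keep rows where field 13 is 9999, print fields 1-12, and suppress duplicates as we go.
awk -F ',' -v OFS=',' '$13 == "9999" {
    line = $1 OFS $2 OFS $3 OFS $4 OFS $5 OFS $6 OFS $7 OFS $8 OFS $9 OFS $10 OFS $11 OFS $12
    if (!seen[line]++) print line
}' "${NLAP_TEMP}/hist1.out"

The trade-off is memory: the seen array grows with the number of distinct rows.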

9. Shell Programming and Scripting

How to make awk command faster for large amount of data?

I have nginx web server logs with all requests that were made and I'm filtering them by date and time. Each line has the following structure: 127.0.0.1 - xyz.com GET 123.ts HTTP/1.1 (200) 0.000 s 3182 CoreMedia/1.0.0.15F79 (iPhone; U; CPU OS 11_4 like Mac OS X; pt_br) These text files are... (21 Replies)
Discussion started by: brenoasrm
21 Replies
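
If the nginx log above is written in chronological order, another option is to stop reading as soon as the timestamp passes the end of the requested window instead of scanning the whole file. The sketch below is only illustrative: the field positions, the timestamp format, and the start/end values are assumptions, not taken from the actual log format.

Code:
# Early-exit scan of a chronological log: print lines inside [start, end], quit once past it.
awk -v start='2018-06-26 10:00:00' -v end='2018-06-26 11:00:00' '
    { ts = $4 " " $5 }        # hypothetical date and time fields
    ts > end { exit }         # past the window, nothing further can match
    ts >= start               # inside the window: print the line
' access.log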
English(3pm)                     Perl Programmers Reference Guide                     English(3pm)

NAME
       English - use nice English (or awk) names for ugly punctuation variables

SYNOPSIS
       use English;

       use English qw( -no_match_vars ) ;  # Avoids regex performance penalty
                                           # in perl 5.16 and earlier
       ...
       if ($ERRNO =~ /denied/) { ... }

DESCRIPTION
       This module provides aliases for the built-in variables whose names no one seems to like
       to read. Variables with side-effects which get triggered just by accessing them (like $0)
       will still be affected.

       For those variables that have an awk version, both long and short English alternatives
       are provided. For example, the $/ variable can be referred to either $RS or
       $INPUT_RECORD_SEPARATOR if you are using the English module.

       See perlvar for a complete list of these.

PERFORMANCE
       NOTE: This was fixed in perl 5.20. Mentioning these three variables no longer makes a
       speed difference. This section still applies if your code is to run on perl 5.18 or
       earlier.

       This module can provoke sizeable inefficiencies for regular expressions, due to
       unfortunate implementation details. If performance matters in your application and you
       don't need $PREMATCH, $MATCH, or $POSTMATCH, try doing

               use English qw( -no_match_vars ) ;

       It is especially important to do this in modules to avoid penalizing all applications
       which use them.

perl v5.18.2                                2014-01-06                              English(3pm)