03-28-2018
Running awk once, instead of 180,000,000 times, would be so much faster that it would finish in under a minute, maybe even in single-digit seconds.
Perl is not faster. If you wrote the same code the same way in Perl, it would be just as slow or slower.
Unfortunately, the program you've posted doesn't seem to work, so I can't tell what output you want. Could you post the output you want?
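The difference between the two patterns can be sketched like this (the tilde-delimited input and the field being extracted are hypothetical, since the original script wasn't posted):

```shell
# Hypothetical input: three tilde-delimited records
printf 'a~b~c\nd~e~f\ng~h~i\n' > sample.dat

# Slow pattern: a new awk process is forked for every input line
while read -r line; do
    printf '%s\n' "$line" | awk -F'~' '{print $2}'
done < sample.dat

# Fast pattern: a single awk process reads the whole file
awk -F'~' '{print $2}' sample.dat
```

Both loops print the same second field of every record; the only difference is that the first forks one process per line while the second forks one process total, which is where the factor-of-millions speedup comes from.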
Last edited by Corona688; 03-28-2018 at 04:46 PM.
funtbl(1) SAORD Documentation funtbl(1)
NAME
funtbl - extract a table from Funtools ASCII output
SYNOPSIS
funtbl [-c cols] [-h] [-n table] [-p prog] [-s sep] [-T] <iname>
DESCRIPTION
[NB: This program has been deprecated in favor of the ASCII text processing support in funtools. You can now perform fundisp on funtools
ASCII output files (specifying the table using bracket notation) to extract tables and columns.]
The funtbl script extracts a specified table (without the header and comments) from a funtools ASCII output file and writes the result to
the standard output. The first non-switch argument is the ASCII input file name (i.e. the saved output from funcnts, fundisp, funhist,
etc.). If no filename is specified, stdin is read. The -n switch specifies which table (starting from 1) to extract. The default is to
extract the first table. The -c switch is a space-delimited list of column numbers to output, e.g. -c "1 3 5" will extract columns 1, 3,
and 5. The default is to extract all columns. The -s switch specifies the separator string to put between columns.
The default is a single space. The -h switch specifies that column names should be added in a header line before the data is output. Without
the switch, no header is prepended. The -p program switch allows you to specify an awk-like program to run instead of the default
(which is host-specific and is determined at build time). The -T switch will output the data in rdb format (i.e., with a 2-row header of
column names and dashes, and with data columns separated by tabs). The -help switch will print out a message describing program usage.
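The effect of the -c column selection on a single whitespace-separated data row can be approximated with plain awk (an illustrative sketch only, not the program funtbl actually runs):

```shell
# Approximate what `funtbl -c "1 3 5"` keeps from one data row:
# awk's $1, $3, $5 are the same 1-based column numbers funtbl uses
printf '1 147.000 25 147.000 25\n' | awk '{ print $1, $3, $5 }'
# -> 1 25 25
```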
For example, consider the output from the following funcnts command:
[sh] funcnts -sr snr.ev "ann 512 512 0 9 n=3"
# source
# data file: /proj/rd/data/snr.ev
# arcsec/pixel: 8
# background
# constant value: 0.000000
# column units
# area: arcsec**2
# surf_bri: cnts/arcsec**2
# surf_err: cnts/arcsec**2
# summed background-subtracted results
upto net_counts error background berror area surf_bri surf_err
---- ------------ --------- ------------ --------- --------- --------- ---------
1 147.000 12.124 0.000 0.000 1600.00 0.092 0.008
2 625.000 25.000 0.000 0.000 6976.00 0.090 0.004
3 1442.000 37.974 0.000 0.000 15936.00 0.090 0.002
# background-subtracted results
reg net_counts error background berror area surf_bri surf_err
---- ------------ --------- ------------ --------- --------- --------- ---------
1 147.000 12.124 0.000 0.000 1600.00 0.092 0.008
2 478.000 21.863 0.000 0.000 5376.00 0.089 0.004
3 817.000 28.583 0.000 0.000 8960.00 0.091 0.003
# the following source and background components were used:
source_region(s)
----------------
ann 512 512 0 9 n=3
reg counts pixels sumcnts sumpix
---- ------------ --------- ------------ ---------
1 147.000 25 147.000 25
2 478.000 84 625.000 109
3 817.000 140 1442.000 249
There are four tables in this output. To extract the last one, you can execute:
[sh] funcnts -s snr.ev "ann 512 512 0 9 n=3" | funtbl -n 4
1 147.000 25 147.000 25
2 478.000 84 625.000 109
3 817.000 140 1442.000 249
Note that the output has been re-formatted so that only a single space separates each column, with no extraneous header or comment information.
To extract only columns 1,2, and 4 from the last example (but with a header prepended and tabs between columns), you can execute:
[sh] funcnts -s snr.ev "ann 512 512 0 9 n=3" | funtbl -c "1 2 4" -h -n 4 -s " "
#reg counts sumcnts
1 147.000 147.000
2 478.000 625.000
3 817.000 1442.000
Of course, if the output has previously been saved in a file named foo.out, the same result can be obtained by executing:
[sh] funtbl -c "1 2 4" -h -n 4 -s " " foo.out
#reg counts sumcnts
1 147.000 147.000
2 478.000 625.000
3 817.000 1442.000
SEE ALSO
See funtools(7) for a list of Funtools help pages
version 1.4.2 January 2, 2008 funtbl(1)