Full Discussion: Optimizing bash script
Post 302975517 by stomp on Tuesday 14th of June 2016 04:31:07 PM
Hi,

can you post sample input + output files? That will
make it easier.

stomp();

Here's some bash code which should be a lot faster, because it uses only shell builtins. It could be improved further if you bring samples (in/out) and explain them.

Code:
#!/bin/bash

# TMFR is expected to hold the time frame, e.g. "30" (minutes), "12h", "5d", "2w" or "3mo"

# match case-insensitively in [[ ... =~ ... ]]
shopt -s nocasematch

# strip the unit letters, keeping only the number
NUMBER=${TMFR//[wdhmoWDHMO]/}

# default multiplier for MINUTES
MULTIPLIER=60

# order of the checks matters!
[[ "$TMFR" =~ mo ]] && MULTIPLIER=2592000 # MONTH
[[ "$TMFR" =~ w  ]] && MULTIPLIER=604800  # WEEK
[[ "$TMFR" =~ d  ]] && MULTIPLIER=86400   # DAY
[[ "$TMFR" =~ h  ]] && MULTIPLIER=3600    # HOUR

# time frame converted to seconds
(( FIRSTIN = NUMBER * MULTIPLIER ))

echo "$FIRSTIN $NUMBER $MULTIPLIER"


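For a quick sanity check, assuming the snippet is saved as tmfr.sh (a file name picked here just for illustration) and TMFR is passed in via the environment, the output should look something like this:

Code:
$ TMFR=2w bash tmfr.sh
1209600 2 604800
$ TMFR=3mo bash tmfr.sh
7776000 3 2592000
$ TMFR=90 bash tmfr.sh     # no unit letter, so the default (minutes) applies
5400 90 60

FIRSTIN is the time frame converted to seconds; the number and the chosen multiplier are echoed next to it so the conversion is easy to verify.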
Last edited by stomp; 06-14-2016 at 06:38 PM..

10 More Discussions You Might Find Interesting

1. UNIX and Linux Applications

Optimizing query

Hi All, My first thread to this sub-forum and first thread of this sub-forum :) Here it is: I am trying to delete duplicates from a table, retaining just one value out of the duplicate records. For example: from n records of a table, out of which x are duplicates, I want to remove x... (15 Replies)
Discussion started by: matrixmadhan
15 Replies

2. OS X (Apple)

Optimizing OSX

Hi forum, I'm administering a workstation/server for my lab and I was wondering how to optimize OSX. I was wondering which unnecessary background tasks I could kick off the system to free up as much memory and CPU power as possible. Other optimization tips are also welcome (HD parameters, memory... (2 Replies)
Discussion started by: deiphon
2 Replies

3. Shell Programming and Scripting

Need help optimizing this piece of code (Shell script Busybox)

I am looking for suggestions on how I could possibly optimize this piece of code, where most of the time is spent in this script. In a nutshell, this is a script that creates an xml file(s) based on certain criteria that will be used by a movie jukebox. Example of data: $SORTEDTMP= it is a... (16 Replies)
Discussion started by: snappy46
16 Replies

4. Shell Programming and Scripting

Optimizing the code

Hi, I have two files in the format listed below. I need to find all values from field 12 to field 20 that are present in file 2 and list them in file3 (same format as file2). File1 : FEIN,CHRISTA... (2 Replies)
Discussion started by: nua7
2 Replies

5. Shell Programming and Scripting

Optimizing awk script

Can this awk statement be optimized? I ask because log.txt is a giant file with several hundred thousand lines of records. myscript.sh: while read line do searchterm="${1}" datecurr=$(date +%s) file=$(awk 'BEGIN{split(ARGV,var,",");print var}' $line) ... (3 Replies)
Discussion started by: SkySmart
3 Replies

6. Shell Programming and Scripting

Optimizing script to reduce execution time

AFILENAME=glow.sh FILENAME="/${AFILENAME}" WIDTHA=$(echo ${FILENAME} | wc -c) NTIME=0 RESULTS=$(for eachletter in $(echo ${FILENAME} | fold -w 1) do WIDTHTIMES=$(awk "BEGIN{printf... (5 Replies)
Discussion started by: SkySmart
5 Replies

7. Shell Programming and Scripting

Optimizing the Shell Script [Expert Advice Needed]

I have prepared a shell script to find duplicates based on part of the filename and retain the latest. #!/bin/bash if ; then mkdir -p dup fi NOW=$(date +"%F-%H:%M:%S") LOGFILE="purge_duplicate_log-$NOW.log" LOGTIME=`date "+%Y-%m-%d %H:%M:%S"` echo... (6 Replies)
Discussion started by: gold2k8
6 Replies

8. Shell Programming and Scripting

Optimizing bash loop

Now, I have to search for a pattern within a particular time frame which the user will provide in the following format: 19/Jun/2018:07:04,21/Jun/2018:21:30. It is easy to get tempted to attempt this search with a variation of the following awk command: awk... (a rough sketch of this kind of time-range filter follows after this list) (3 Replies)
Discussion started by: SkySmart
3 Replies

9. Shell Programming and Scripting

How to block first bash script until second bash script launches web server/site?

I'm new to utilities like socat and netcat and I'm not clear if they will do what I need. I have a "compileDeployStartWebServer.sh" script and a "StartBrowser.sh" script that are started by emacs/elisp at the same time in two different processes. I'm using Cygwin bash on Windows 10. My... (3 Replies)
Discussion started by: siegfried
3 Replies

10. Web Development

Optimizing JS and CSS

Yes. Got a few suggestions: minifying resources, mod_expires, and a service worker setup. (8 Replies)
Discussion started by: Akshay Hegde
8 Replies
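The awk command in discussion #8 above is truncated, but a rough, hypothetical sketch of that kind of time-range filter might look like the following. The log format, the field position ($4) and the file name access.log are assumptions, not details from the original thread:

Code:
# hypothetical sketch: assumes an Apache-style log whose 4th field starts with
# "[dd/Mon/yyyy:HH:MM:SS"; start/end use the dd/Mon/yyyy:HH:MM format from the question
awk -v start="19/Jun/2018:07:04" -v end="21/Jun/2018:21:30" '
function key(s,    a) {                   # dd/Mon/yyyy:HH:MM -> yyyymmddHHMM
    split(s, a, "[/:]")
    return a[3] mon[a[2]] a[1] a[4] a[5]
}
BEGIN {
    split("Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec", m, " ")
    for (i = 1; i <= 12; i++) mon[m[i]] = sprintf("%02d", i)
    skey = key(start); ekey = key(end)
}
{
    ts = substr($4, 2, 17)                # drop the leading "[" and the seconds
    k = key(ts)
    if (k >= skey && k <= ekey) print
}' access.log

Building a sortable yyyymmddHHMM key once per timestamp avoids parsing month names at compare time; a plain string comparison is then enough to keep only the lines inside the window.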