Full Discussion: Optimizing query
Post 302130202 by matrixmadhan on Friday 3rd of August 2007 02:00:44 PM
Shell_Life

Out of the 4 potential hazards that you have listed:

Since the query is executed only on a table with 0.25 million records, I run into just the 4th hazard, which is that it takes a really long time.

When it initially took such a long time, I thought I might receive a 'Long transaction aborted' error, but I didn't.

Considering the alternative, deleting the rows programmatically without filling the logs is a fine idea.
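To make that concrete for anyone reading this later, here is a rough sketch of the batched-delete pattern, not the query I actually ran: delete a limited number of rows per statement and repeat until nothing is left, so no single transaction grows large enough to fill the logs. Purely for illustration it assumes SQL Server accessed through PHP's sqlsrv extension (the sqlsrv_rows_affected(3) page appended at the bottom of this thread documents the return values used here); the table name big_table, the status predicate and the batch size of 1000 are made up.

<?php
// Illustrative sketch only: remove rows in small, separately committed batches
// so no single DELETE runs long or fills the transaction log.
// Connection details, table name (big_table), predicate and batch size are hypothetical.
$serverName = "serverName\\sqlexpress";
$connectionInfo = array("Database" => "dbName", "UID" => "username", "PWD" => "password");
$conn = sqlsrv_connect($serverName, $connectionInfo);
if ($conn === false) {
    die(print_r(sqlsrv_errors(), true));
}

$batchSize = 1000;   // rows deleted per statement; tune to your log capacity
$total = 0;

do {
    // DELETE TOP (n) caps how many rows a single statement touches.
    $sql  = "DELETE TOP ($batchSize) FROM big_table WHERE status = ?";
    $stmt = sqlsrv_query($conn, $sql, array("OBSOLETE"));
    if ($stmt === false) {
        die(print_r(sqlsrv_errors(), true));
    }

    $deleted = sqlsrv_rows_affected($stmt);   // 0 once nothing matches any more
    if ($deleted === false) {
        die(print_r(sqlsrv_errors(), true));
    }
    $total += $deleted;
    echo $deleted . " rows deleted in this batch\n";
} while ($deleted > 0);

echo "Done: " . $total . " rows deleted in total.\n";
?>

Because sqlsrv connections run in auto-commit mode unless you explicitly begin a transaction, each batch commits on its own, and the loop stops once sqlsrv_rows_affected() reports that a pass deleted nothing.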
 

10 More Discussions You Might Find Interesting

1. Filesystems, Disks and Memory

Optimizing the system reliability

My product has around 10-15 programs/services running in the Sun box, which together complete a task sequentially. Several instances of each program/service are running in the UNIX box, to manage the load and for risk-management reasons. As of now, we don't follow a strict strategy in... (2 Replies)
Discussion started by: Deepa
2 Replies

2. Filesystems, Disks and Memory

optimizing disk performance

I have some questions regarding disk performance and what I can do to make it just a little (or much :)) faster. From what I've heard, the first partitions will be faster than the later ones, because tracks at the outer edges of a hard drive platter simply move faster. But I've also read in... (4 Replies)
Discussion started by: J.P
4 Replies

3. Shell Programming and Scripting

Optimizing for a Speed-up

How would one go about optimizing this current .sh program so that it runs in less time? For example, is there a better way to count what I need than what I have done, or a better way to match patterns in the file? Thanks. #declare variables to be used. help=-1 count=0 JanCount=0 FebCount=0... (3 Replies)
Discussion started by: switch
3 Replies

4. OS X (Apple)

Optimizing OSX

Hi forum, I'm administering a workstation/server for my lab and I was wondering how to optimize OSX: what unnecessary background tasks could I kick off the system so that I free up as much memory and CPU power as possible? Other optimization tips are also welcome (HD parameters, memory... (2 Replies)
Discussion started by: deiphon
2 Replies

5. Shell Programming and Scripting

Optimizing the code

Hi, I have two files in the format listed below. I need to find all values from field 12 to field 20 present in file2 and list them in file3 (same format as file2). File1: FEIN,CHRISTA... (2 Replies)
Discussion started by: nua7
2 Replies

6. Shell Programming and Scripting

Optimizing awk script

Can this awk statement be optimized? I ask because log.txt is a giant file with several hundred thousand lines of records. myscript.sh: while read line do searchterm="${1}" datecurr=$(date +%s) file=$(awk 'BEGIN{split(ARGV,var,",");print var}' $line) ... (3 Replies)
Discussion started by: SkySmart
3 Replies

7. Shell Programming and Scripting

Optimizing search using grep

I have a huge log file, close to 3GB in size. My task is to generate some reporting based on the number of times something is being logged. I need to find the number of times StringA, StringB and StringC are each being called. What I am doing right now is: grep "StringA" server.log | wc -l... (4 Replies)
Discussion started by: Junaid Subhani
4 Replies

8. Shell Programming and Scripting

Optimizing find with many replacements

Hello, I'm looking for advice on how to optimize this bash script. Currently I use the shotgun approach to avoid file I/O and buffering problems of forks trying to write simultaneously to the same file. I'd like to keep this as a fairly portable bash script rather than writing a C routine. In a... (8 Replies)
Discussion started by: f77hack
8 Replies

9. Shell Programming and Scripting

Optimizing bash loop

Now, I have to search for a pattern within a particular time frame, which the user will provide in the following format: 19/Jun/2018:07:04,21/Jun/2018:21:30. It is easy to be tempted to attempt this search with a variation of the following awk command: awk... (3 Replies)
Discussion started by: SkySmart
3 Replies

10. Web Development

Optimizing JS and CSS

Yes, I have a few suggestions: minifying resources, mod_expires, and a service-workers setup. https://www.unix.com/attachments/web-programming/7709d1550557731-sneak-preview-new-unix-com-usercp-vuejs-demo-screenshot-png (8 Replies)
Discussion started by: Akshay Hegde
8 Replies
SQLSRV_ROWS_AFFECTED(3) 												   SQLSRV_ROWS_AFFECTED(3)

       sqlsrv_rows_affected - Returns the number of rows modified by the last INSERT, UPDATE, or DELETE
       query executed

SYNOPSIS
       int sqlsrv_rows_affected (resource $stmt)

DESCRIPTION
       Returns the number of rows modified by the last INSERT, UPDATE, or DELETE query executed. For
       information about the number of rows returned by a SELECT query, see sqlsrv_num_rows(3).

PARAMETERS
       o $stmt
         - The executed statement resource for which the number of affected rows is returned.

RETURN VALUES
       Returns the number of rows affected by the last INSERT, UPDATE, or DELETE query. If no rows were
       affected, 0 is returned. If the number of affected rows cannot be determined, -1 is returned. If
       an error occurred, FALSE is returned.

EXAMPLES
       Example #1 sqlsrv_rows_affected(3) example

       <?php
       $serverName = "serverName\sqlexpress";
       $connectionInfo = array( "Database"=>"dbName", "UID"=>"username", "PWD"=>"password" );
       $conn = sqlsrv_connect( $serverName, $connectionInfo);
       if( $conn === false ) {
            die( print_r( sqlsrv_errors(), true));
       }

       $sql = "UPDATE Table_1 SET data = ? WHERE id = ?";
       $params = array("updated data", 1);

       $stmt = sqlsrv_query( $conn, $sql, $params);

       $rows_affected = sqlsrv_rows_affected( $stmt);
       if( $rows_affected === false) {
            die( print_r( sqlsrv_errors(), true));
       } elseif( $rows_affected == -1) {
            echo "No information available.<br />";
       } else {
            echo $rows_affected." rows were updated.<br />";
       }
       ?>

SEE ALSO
       sqlsrv_num_rows(3).

PHP Documentation Group                                                           SQLSRV_ROWS_AFFECTED(3)