DBI::ProfileDumper::Apache(3) User Contributed Perl Documentation DBI::ProfileDumper::Apache(3)
NAME
DBI::ProfileDumper::Apache - capture DBI profiling data from Apache/mod_perl
SYNOPSIS
Add this line to your httpd.conf:
PerlSetEnv DBI_PROFILE DBI::ProfileDumper::Apache
Then restart your server. Access the code you wish to test using a web browser, then shut down your server. This will create a set of
dbi.prof.* files in your Apache log directory. Get a profiling report with dbiprof:
dbiprof /usr/local/apache/logs/dbi.prof.*
When you're ready to perform another profiling run, delete the old files
rm /usr/local/apache/logs/dbi.prof.*
and start again.
DESCRIPTION
This module interfaces DBI::ProfileDumper to Apache/mod_perl. Using this module you can collect profiling data from mod_perl applications.
It works by creating a DBI::ProfileDumper data file for each Apache process. These files are created in your Apache log directory. You
can then use dbiprof to analyze the profile files.
USAGE
LOADING THE MODULE
The easiest way to use this module is just to set the DBI_PROFILE environment variable in your httpd.conf:
PerlSetEnv DBI_PROFILE DBI::ProfileDumper::Apache
If you want to use one of DBI::Profile's other Path settings, you can use a string like:
PerlSetEnv DBI_PROFILE 2/DBI::ProfileDumper::Apache
It's also possible to use this module by setting the Profile attribute of any DBI handle:
$dbh->{Profile} = "DBI::ProfileDumper::Apache";
See DBI::ProfileDumper for more possibilities.
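As a sketch, profiling could be enabled for a single handle at connect time. The DSN, driver, and credentials below are placeholders; any installed DBD driver would do:

    use DBI;

    # hypothetical DSN; substitute your own driver and credentials
    my $dbh = DBI->connect("dbi:SQLite:dbname=app.db", "", "",
                           { RaiseError => 1 });

    # only this handle is profiled; dbi.prof.* files are written to
    # the Apache log directory, as with the environment variable
    $dbh->{Profile} = "DBI::ProfileDumper::Apache";

This form is useful when you want to profile one application's queries without enabling profiling server-wide.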
GATHERING PROFILE DATA
Once you have the module loaded, use your application as you normally would. Stop the webserver when your tests are complete. Profile
data files will be produced when Apache exits and you'll see something like this in your error_log:
DBI::ProfileDumper::Apache writing to /usr/local/apache/logs/dbi.prof.2619
Now you can use dbiprof to examine the data:
dbiprof /usr/local/apache/logs/dbi.prof.*
Pass dbiprof a list of all generated files and it will automatically merge them into one result set. You can also pass dbiprof
sorting and querying options; see dbiprof for details.
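For example, assuming the standard dbiprof sorting options, the fifteen most frequently executed statements could be listed with:

    dbiprof --number 15 --sort count /usr/local/apache/logs/dbi.prof.*
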
CLEANING UP
Once you've made some code changes, you're ready to start again. First, delete the old profile data files:
rm /usr/local/apache/logs/dbi.prof.*
Then restart your server and get back to work.
MEMORY USAGE
DBI::Profile can use a lot of memory for very active applications. It collects profiling data in memory for each distinct query your
application runs. You can avoid this problem with a call like this:
$dbh->{Profile}->flush_to_disk() if $dbh->{Profile};
Calling "flush_to_disk()" will clear out the profile data and write it to disk. Put this someplace where it will run on every request,
like a CleanupHandler, and your memory troubles should go away. Well, at least the ones caused by DBI::Profile anyway.
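Under mod_perl 1.x this might look like the following sketch. The package name and the My::App::dbh() accessor are hypothetical and stand in for however your application caches its database handle:

    # in httpd.conf
    PerlCleanupHandler My::App::FlushProfile

    # in My/App/FlushProfile.pm
    package My::App::FlushProfile;
    use strict;
    use Apache::Constants qw(OK);

    sub handler {
        # My::App::dbh() is a hypothetical accessor for your
        # application's cached database handle
        my $dbh = My::App::dbh();
        $dbh->{Profile}->flush_to_disk() if $dbh && $dbh->{Profile};
        return OK;
    }

    1;

Because the cleanup phase runs after the response is sent, flushing here keeps per-request latency unaffected.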
AUTHOR
Sam Tregar <sam@tregar.com>
COPYRIGHT AND LICENSE
Copyright (C) 2002 Sam Tregar
This program is free software; you can redistribute it and/or modify it under the same terms as Perl 5 itself.
perl v5.8.0 2002-11-29 DBI::ProfileDumper::Apache(3)