11-02-2019
How to find and monitor I/O spike history in Linux?
Hello all,
I have an application which, according to AWS monitoring, hits high I/O spikes at random times,
causing the server to crash and restart.
My question is: how can I find out what is causing the spikes? And if I can't with the native Linux tools,
what free, open-source, minimally intrusive monitor can I use?
Thanks
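A minimal sketch of one way to build such a history using nothing but the kernel's own counters (the log path and the awk field choices here are illustrative, not from any particular tool): append timestamped snapshots of /proc/diskstats from cron, then diff successive snapshots offline to see when the sector counters jumped. The standard sysstat tools (sar -d, iostat -x) or iotop report the same information with less work, if you can install them.

```shell
#!/bin/sh
# Log a timestamped snapshot of the cumulative per-disk I/O counters.
# /proc/diskstats fields: $3=device, $6=sectors read, $10=sectors written.
# Run this from cron every minute; large jumps between successive
# snapshots mark the time window of an I/O spike.
LOGFILE=${LOGFILE:-/tmp/io-history.log}   # illustrative path
{
  date '+%Y-%m-%d %H:%M:%S'
  awk '$3 !~ /^(loop|ram)/ {
         printf "  %-10s rd_sectors=%s wr_sectors=%s\n", $3, $6, $10
       }' /proc/diskstats
} >> "$LOGFILE"
```

A crontab entry such as `* * * * * /usr/local/bin/io-snapshot.sh` (hypothetical path) gives one sample per minute. For per-process attribution at the moment of a spike, `iotop -b -o` or `pidstat -d` from the sysstat package are the usual choices.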
10 More Discussions You Might Find Interesting
1. Shell Programming and Scripting
Hi All,
My question is: how can I stop users on my system from deleting their history, i.e. from running 'history -c'?
I have searched the forum thoroughly but didn't find a satisfactory solution to the problem.
Kindly help if you have any suggestions.
Thanx in... (3 Replies)
Discussion started by: xander
2. UNIX for Advanced & Expert Users
Ctrl+R is used to search bash history in reverse. How do I get the next match, just like 'n' in a vi search? (1 Reply)
Discussion started by: honglus
3. Linux
Hi All,
Is there a way to check commands executed by users in Linux for a specific date? I know we can use history, but it doesn't show yesterday's executed commands.
rgds, (3 Replies)
Discussion started by: ronny_nch
4. UNIX for Dummies Questions & Answers
Hello All, Good Morning.
I am trying to erase the history list on my Linux box, but the command below fails. What is the correct way to clear it?
> history clear
-bash: history: clear: numeric argument required
Also, when I run commands at my command prompt, my team lead can see my... (7 Replies)
Discussion started by: NARESH1302
5. UNIX for Dummies Questions & Answers
Hi,
I am getting a high load average, around 7, once an hour. It lasts for about 4 minutes and makes things fairly unusable during that time.
How do I find out what is causing this? Looking at top, the only thing running at the time is md5sum.
I have looked at the crontab and there is nothing... (10 Replies)
Discussion started by: sm9ai
6. Shell Programming and Scripting
Hi - Can anyone help me with a shell script for the scenario below?
I need to find the timestamp history for files residing within subfolders. E.g.:
Let's say I have a directory structure like
/home/project1/ -- fixed directory
Now within the project there are many... (1 Reply)
Discussion started by: pankaj80
7. UNIX for Dummies Questions & Answers
Hi friends,
I am unable to find the .sh_history file on my PC.
This file contains the log of all commands typed by users.
I searched for it too, but no results.
Please help me :( with where to find it, or whether anything else needs to be done to set up the history file. (6 Replies)
Discussion started by: paras.oriental
8. HP-UX
Please guide me on how to get the head-cleaning history on an HP MSL4048 1 LTO-4 Ultrium 1840. (0 Replies)
Discussion started by: marunmeera
9. What is on Your Mind?
Dear All,
Taking a break from Vue.js coding for the site, SEO, and YT videos; and hopefully addressing some well-deserved criticism from some here that I have been too focused on the visual aspects of the forums versus the substance and the community....
While the "current generation... (9 Replies)
Discussion started by: Neo
10. UNIX for Advanced & Expert Users
I have been wrangling with a small problem on a Ubuntu server which runs a LAMP application.
Linux ubuntu 4.15.0-33-generic #36-Ubuntu SMP Wed Aug 15 16:00:05 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux
This server runs fine, basically:
ubuntu:/var/www# uptime
20:17:13 up 105 days,... (45 Replies)
Discussion started by: Neo
S3GET(1p) User Contributed Perl Documentation S3GET(1p)
NAME
s3get - Retrieve contents of S3 items
SYNOPSIS
s3get [options]
s3get [options] [ bucket/item ...]
Options:
--access-key AWS Access Key ID
--secret-key AWS Secret Access Key
Environment:
AWS_ACCESS_KEY_ID
AWS_ACCESS_KEY_SECRET
OPTIONS
--help Prints a brief help message and exits.
--man Prints the manual page and exits.
--verbose
Output what is being done as it is done.
--access-key and --secret-key
Specify the "AWS Access Key Identifiers" for the AWS account. --access-key is the "Access Key ID", and --secret-key is the "Secret
Access Key". These are effectively the "username" and "password" to the AWS account, and should be kept confidential.
The access keys MUST be specified, either via these command line parameters, or via the AWS_ACCESS_KEY_ID and AWS_ACCESS_KEY_SECRET
environment variables.
Specifying them on the command line overrides the environment variables.
--secure
Uses SSL/TLS HTTPS to communicate with the AWS service, instead of HTTP.
ENVIRONMENT VARIABLES
AWS_ACCESS_KEY_ID and AWS_ACCESS_KEY_SECRET
Specify the "AWS Access Key Identifiers" for the AWS account. AWS_ACCESS_KEY_ID contains the "Access Key ID", and
AWS_ACCESS_KEY_SECRET contains the "Secret Access Key". These are effectively the "username" and "password" to the AWS service,
and should be kept confidential.
The access keys MUST be specified, either via these environment variables, or via the --access-key and --secret-key command line
parameters.
If the command line parameters are set, they override these environment variables.
CONFIGURATION FILE
The configuration options will be read from the file "~/.s3-tools" if it exists. The format is the same as the command line options with
one option per line. For example, the file could contain:
--access-key <AWS access key>
--secret-key <AWS secret key>
--secure
This example configuration file would specify the AWS access keys and that a secure connection using HTTPS should be used for all
communications.
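As a sketch, a configuration file like the one described above could be created from a shell as follows (the key values are placeholders, not real credentials):

```shell
# Write the ~/.s3-tools configuration file: one option per line,
# in the same format as the command line options.
# The access-key and secret-key values below are placeholders.
cat > "$HOME/.s3-tools" <<'EOF'
--access-key AKIAEXAMPLEKEY
--secret-key exampleSecretKey
--secure
EOF
# The file holds credentials, so restrict it to the owner.
chmod 600 "$HOME/.s3-tools"
```

With this in place, s3get can be invoked without passing the keys on the command line or exposing them in the process environment.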
DESCRIPTION
Retrieves S3 items, and outputs them to stdout.
BUGS
Report bugs to Mark Atwood mark@fallenpegasus.com.
Occasionally the S3 service will randomly fail for no externally apparent reason. When that happens, this tool should retry, with a delay
and a backoff.
Access to the S3 service can be authenticated with an X.509 certificate, instead of via the "AWS Access Key Identifiers". This tool should
support that.
It might be useful to be able to specify the "AWS Access Key Identifiers" in the user's "~/.netrc" file. This tool should support that.
Errors and warnings are very "Perl-ish", and can be confusing.
Trying to access an item that does not exist or is not accessible by the user generates less than helpful error messages.
Trying to retrieve a bucket instead of an item is silently skipped.
TODO
option to write to files instead of stdout
option to write to paths instead of stdout
option to write to a tar file stream, for multiple items
option to write extended file attributes based on S3 & HTTP metadata
option to have a progress bar
AUTHOR
Written by Mark Atwood mark@fallenpegasus.com.
Many thanks to Wotan LLC <http://wotanllc.com>, for supporting the development of these S3 tools.
Many thanks to the Amazon AWS engineers for developing S3.
SEE ALSO
These tools use the Net::Amazon::S3 Perl module.
The Amazon Simple Storage Service (S3) is documented at <http://aws.amazon.com/s3>.
perl v5.10.0 2009-03-08 S3GET(1p)