I'm having an issue with a script I wrote to pull information from the AWS API. The script takes arguments from the command line and attempts to grab user information for each AWS access group. The command is issued like this:
# sh awsReport.sh <outputFileName> <AWS Authentication Profile Name>
Code for the script:
Crontab file:
And finally, the crontab error I'm getting:
When I run the script from the CLI, everything works fine. I can change the arguments to run the script for our other environments and it works fine there too. I get this error only when the script is executed from cron. Any ideas?
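For what it's worth, cron runs jobs with a minimal environment: no login profile is sourced, so PATH, HOME, and any AWS credential or profile variables your interactive shell provides are missing. A common fix, sketched here with placeholder paths and profile name (not the poster's actual values), is to set the environment explicitly at the top of the crontab and use absolute paths throughout:

```shell
# Crontab sketch -- all paths and the profile name are placeholders.
PATH=/usr/local/bin:/usr/bin:/bin
HOME=/home/awsuser

# Run daily at 06:00; redirect output so any error lands somewhere visible.
0 6 * * * /home/awsuser/awsReport.sh /home/awsuser/report.txt myProfile >> /home/awsuser/awsReport.log 2>&1
```

Redirecting stderr to a log file is worth doing even after the fix, since cron otherwise mails or drops the output.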
I have a script running as a cron job on machine A. This script FTPs some files every day from machine A to machine B, and mails me about the status. It works fine for some days, then suddenly stops running. Looking at the log files, I see that the script itself was not invoked by cron on those... (4 Replies)
Hi All,
I am trying to automate a process and need to create a Unix script for it. I have a scenario in which I need to automate a file movement. Below are the steps I need to automate.
1. Check whether a file (not a fixed name; a pattern search, say 'E*.dat') is present in a... (2 Replies)
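Step 1 above can be sketched in plain sh. The directory names here are made up for illustration; `ls -t` picks the newest match if several files fit the pattern:

```shell
#!/bin/sh
# Sketch: check for a file matching E*.dat and move the newest match.
# SRC and DEST are placeholder directories.
SRC=/data/incoming
DEST=/data/archive

# Newest matching file, if any (ls -t sorts by modification time).
file=$(ls -t "$SRC"/E*.dat 2>/dev/null | head -n 1)

if [ -n "$file" ]; then
    mv "$file" "$DEST"/
    echo "Moved $file to $DEST"
else
    echo "No E*.dat file found in $SRC" >&2
fi
```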
Hello,
I have searched and searched Google for this: I want my webserver to run a PHP file every day automatically. How do I go about doing this? PHP is installed as an Apache module, not CGI.
Thank you! (3 Replies)
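Since PHP is only available through Apache here, one common approach is to have cron request the page over HTTP so the module executes it; the URL and time below are placeholders:

```shell
# Crontab sketch: fetch the page daily at 02:00.
# Requires wget on the server (curl -s -o /dev/null works the same way).
0 2 * * * /usr/bin/wget -q -O /dev/null http://localhost/daily-task.php
```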
Hi everyone:
I'm trying to make a CRON job that will execute Fridays at 7am. I have the following:
* 7 * * 5
I've been studying up on CRON and I know to have this in a file and then "crontab filename.txt" to add it to the CRON job list.
The CRON part I believe I understand, but I would... (6 Replies)
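One note on that schedule: `* 7 * * 5` fires every minute from 07:00 to 07:59 on Fridays, because the minute field is a wildcard. Pinning the minute gives a single run; the script path here is a placeholder:

```shell
# Runs once at 07:00 every Friday (day-of-week 5).
0 7 * * 5 /path/to/script.sh
```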
Dear all,
I have the following case: I need to transfer a group of files from one server to another when the size of any of these files reaches a specified value (e.g., 10MB). My problem is that I don't know how to determine when the size of the file... (1 Reply)
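One way to detect the size threshold is find's -size test (GNU find accepts the M suffix; on other systems use 512-byte blocks, e.g. -size +20480). The paths and remote host below are placeholders:

```shell
#!/bin/sh
# Sketch: push any file of 10MB or more to the other server, then park it locally.
WATCH_DIR=/data/outgoing
REMOTE=user@otherserver:/data/incoming

find "$WATCH_DIR" -maxdepth 1 -type f -size +10M | while read -r f; do
    # Only move the file out of the watch directory if the copy succeeded.
    scp "$f" "$REMOTE" && mv "$f" "$WATCH_DIR/sent/"
done
```

Run from cron every few minutes, this turns the "when the size reaches X" condition into a simple periodic sweep.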
The crontab entry looks like this...
29 13 * * * /usr/bin/mk-find --printf "%D\ %N\n" |
This is just a part of a long statement that has an apostrophe in it. I would like to know why it works at the command prompt but fails as a cron job.
The error is...
/bin/sh: -c: line 0: unexpected EOF... (3 Replies)
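The likely culprit here is not the apostrophe but the percent signs: in a crontab, every unescaped `%` is converted to a newline and everything after the first one is fed to the command as stdin, so the shell sees a truncated command line and reports `unexpected EOF`. Escaping each `%` with a backslash keeps the line intact:

```shell
# Crontab sketch: \% stops cron from translating % into a newline.
# (The rest of the original pipeline follows the | as before.)
29 13 * * * /usr/bin/mk-find --printf "\%D\ \%N\n" |
```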
Hi,
I have following directory structure
Media (Inside media directory I have two folders namely videos and images)
-->videos
-->images
Inside the media directory I have some video files with the extension .mp4 and images with the extensions .jpg and .jpeg.
I want to write a cron job which will... (3 Replies)
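A sorting pass like that can be sketched with two find commands, assuming the layout described (a Media directory containing videos/ and images/ subfolders); the base path is a placeholder:

```shell
#!/bin/sh
# Sketch: file loose media into the videos/ and images/ subdirectories.
MEDIA=/path/to/Media

# -maxdepth 1 keeps the search to the top level, so already-sorted files stay put.
find "$MEDIA" -maxdepth 1 -type f -name '*.mp4' -exec mv {} "$MEDIA/videos/" \;
find "$MEDIA" -maxdepth 1 -type f \( -name '*.jpg' -o -name '*.jpeg' \) -exec mv {} "$MEDIA/images/" \;
```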
Hi All,
I am a beginner in Unix and have no idea how to set up a script to run via a cron job every 5 seconds. I also want the output written to a text file automatically. My script is named countsys.sh and my text file is abc.txt. (6 Replies)
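Standard cron cannot schedule anything finer than one minute. A common workaround, sketched here with placeholder paths, is a once-a-minute cron entry driving a wrapper that loops twelve times with a five-second sleep:

```shell
#!/bin/sh
# every5s.sh -- run countsys.sh every 5 seconds, covering one minute per cron run.
# Crontab entry driving it:  * * * * * /path/to/every5s.sh
i=0
while [ "$i" -lt 12 ]; do
    /path/to/countsys.sh >> /path/to/abc.txt 2>&1
    i=$((i + 1))
    sleep 5
done
```

If countsys.sh itself takes noticeable time, the runs will drift later within the minute; for strict spacing a long-running daemon loop is the better fit.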
We need to configure Autosys so that when a job fails continuously three times, another job is called.
Is this possible in Autosys, or can anyone advise on an alternative? (2 Replies)
Hello,
I have written a cron job to automate the sftp of files using key authentication.
I want to add a timestamp and the name of each file sent to a log file, append these details to the same file each time files are sent, and if possible include whether the files were sent successfully or not.... (3 Replies)
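A minimal logging sketch: wrap the transfer, derive an OK/FAILED status from sftp's exit code, and append one timestamped line per file to a single log. The batch file, remote host, and log path are placeholders:

```shell
#!/bin/sh
# Sketch: log a timestamped status line for each transferred file.
LOG=/home/user/sftp_transfer.log
FILE=$1

# sftp exits non-zero if the batch transfer fails, so the if captures the outcome.
if sftp -b /home/user/batch.txt user@remotehost >/dev/null 2>&1; then
    STATUS=OK
else
    STATUS=FAILED
fi

# One line per transfer: date, time, file name, outcome -- appended, never overwritten.
echo "$(date '+%Y-%m-%d %H:%M:%S') $FILE $STATUS" >> "$LOG"
```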
Discussion started by: KidKoder
3 Replies
LEARN ABOUT DEBIAN
s3put
S3PUT(1p) User Contributed Perl Documentation S3PUT(1p)
NAME
s3put - Write an S3 item
SYNOPSIS
s3put [options] [ bucket/item ...]
Options:
--access-key AWS Access Key ID
--secret-key AWS Secret Access Key
Environment:
AWS_ACCESS_KEY_ID
AWS_ACCESS_KEY_SECRET
OPTIONS
--help Prints a brief help message and exits.
--man Prints the manual page and exits.
--verbose
Output what is being done as it is done.
--access-key and --secret-key
Specify the "AWS Access Key Identifiers" for the AWS account. --access-key is the "Access Key ID", and --secret-key is the "Secret
Access Key". These are effectively the "username" and "password" to the AWS account, and should be kept confidential.
The access keys MUST be specified, either via these command line parameters, or via the AWS_ACCESS_KEY_ID and AWS_ACCESS_KEY_SECRET
environment variables.
Specifying them on the command line overrides the environment variables.
--secure
Uses SSL/TLS HTTPS to communicate with the AWS service, instead of HTTP.
ENVIRONMENT VARIABLES
AWS_ACCESS_KEY_ID and AWS_ACCESS_KEY_SECRET
Specify the "AWS Access Key Identifiers" for the AWS account. AWS_ACCESS_KEY_ID contains the "Access Key ID", and
AWS_ACCESS_KEY_SECRET contains the "Secret Access Key". These are effectively the "username" and "password" to the AWS service,
and should be kept confidential.
The access keys MUST be specified, either via these environment variables, or via the --access-key and --secret-key command line
parameters.
If the command line parameters are set, they override these environment variables.
CONFIGURATION FILE
The configuration options will be read from the file "~/.s3-tools" if it exists. The format is the same as the command line options with
one option per line. For example, the file could contain:
--access-key <AWS access key>
--secret-key <AWS secret key>
--secure
This example configuration file would specify the AWS access keys and that a secure connection using HTTPS should be used for all
communications.
DESCRIPTION
Reads stdin, and writes it to an S3 item
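Per the synopsis above, a typical invocation streams a local file to a named bucket/item. The bucket and file names here are placeholders, and the access keys are assumed to already be exported in the environment:

```shell
# Upload backup.tar.gz as the item "mybucket/backup.tar.gz" over HTTPS.
# AWS_ACCESS_KEY_ID and AWS_ACCESS_KEY_SECRET must already be set.
s3put --secure mybucket/backup.tar.gz < backup.tar.gz
```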
BUGS
Report bugs to Mark Atwood mark@fallenpegasus.com.
Occasionally the S3 service will randomly fail for no externally apparent reason. When that happens, this tool should retry, with a delay
and a backoff.
Access to the S3 service can be authenticated with a X.509 certificate, instead of via the "AWS Access Key Identifiers". This tool should
support that.
It might be useful to be able to specify the "AWS Access Key Identifiers" in the user's "~/.netrc" file. This tool should support that.
Errors and warnings are very "Perl-ish", and can be confusing.
Trying to write to a bucket that does not exist or is not accessible by the user generates less-than-helpful error messages.
Trying to put a bucket instead of an item is silently skipped.
TODO
option to read from files instead of stdin
use the fs mtime to set the http Last-Modified
option to read filenames to read from, from stdin
option to read from a tar file stream, for multiple items
option to magically guess mime type
option to use extended file attributes for metadata
option to have a progress bar
AUTHOR
Written by Mark Atwood mark@fallenpegasus.com.
Many thanks to Wotan LLC <http://wotanllc.com>, for supporting the development of these S3 tools.
Many thanks to the Amazon AWS engineers for developing S3.
SEE ALSO
These tools use the Net::Amazon::S3 Perl module.
The Amazon Simple Storage Service (S3) is documented at <http://aws.amazon.com/s3>.
perl v5.10.0 2009-03-08 S3PUT(1p)