I'm having an issue with a script I wrote to pull information from the Amazon AWS API. Basically the script takes arguments from the command line and attempts to grab user information for each AWS access group. The command is issued like this:
# sh awsReport.sh <outputFileName> <AWS Authentication Profile Name>
Code for the script:
Crontab file:
And finally the crontab error I'm getting:
When I run the script from the CLI, everything works fine. I can change the arguments to run the script for our other environments and it works fine there too. I get this error only when the script is executed from cron. Any ideas?
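A frequent cause of scripts that work interactively but fail under cron is cron's stripped-down environment: PATH is short and login profiles (where AWS credentials and profile settings often live) are never sourced. A minimal sketch of a wrapper that makes the environment explicit; the PATH entries and profile location are illustrative assumptions, not taken from the script above:

```shell
#!/bin/sh
# Cron starts jobs with a minimal environment, so commands and AWS profiles
# found in an interactive shell may be missing. Set PATH explicitly and
# source the login profile (paths here are illustrative assumptions).
PATH=/usr/local/bin:/usr/bin:/bin
export PATH
# . "$HOME/.profile"   # pull in AWS_* variables / profile configuration
echo "PATH is: $PATH"
```

Comparing `env` output from an interactive shell with `env` captured inside the cron job usually pinpoints the missing variable.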
I have a script running as a cron job on machine A. This script FTPs some files every day from machine A to machine B, and mails me about the status. It works fine for some days... and then suddenly stops running. Looking at the log files, I see that the script itself was not invoked by cron on those... (4 Replies)
Hi All,
I am trying to automate a process and need to create a Unix script for it. I have a scenario in which I need to automate a file movement. Below are the steps I need to automate.
1. Check whether a file (not a fixed name; a pattern search, say 'E*.dat') is present in a... (2 Replies)
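The pattern-check-and-move step described above can be sketched in plain sh. The directories here are temporary stand-ins created only so the example is self-contained:

```shell
#!/bin/sh
# Sketch: look for any file matching the pattern E*.dat in a source
# directory and move the matches to a destination directory.
SRC_DIR=$(mktemp -d)
DST_DIR=$(mktemp -d)
touch "$SRC_DIR/E20240101.dat"          # sample file for the demonstration
found=0
for f in "$SRC_DIR"/E*.dat; do
    [ -e "$f" ] || continue             # skip the literal glob if no match
    mv "$f" "$DST_DIR"/
    found=1
    echo "moved: $(basename "$f")"
done
if [ "$found" -eq 0 ]; then
    echo "no E*.dat file present"
fi
```

The `[ -e "$f" ]` guard matters: if nothing matches, the shell leaves the glob unexpanded and the loop would otherwise try to move the literal string `E*.dat`.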
Hello,
I have searched and searched Google for this: I want my webserver to run a PHP file every day automatically. How do I go about doing this? PHP is installed as an Apache module, not CGI.
Thank you! (3 Replies)
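When PHP is available only as an Apache module, one common approach is to let cron fetch the page over HTTP so mod_php executes it. A sketch of the crontab entry; the schedule and URL are assumptions for illustration:

```shell
# Run once a day at 03:15; curl triggers the script through Apache,
# so no PHP CGI binary is required (URL is hypothetical):
15 3 * * * curl -s http://localhost/daily.php > /dev/null 2>&1
```

`wget -q -O /dev/null` works the same way on systems without curl.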
Hi everyone:
I'm trying to make a CRON job that will execute Fridays at 7am. I have the following:
* 7 * * 5
I've been studying up on CRON and I know to have this in a file and then "crontab filename.txt" to add it to the CRON job list.
The CRON part I believe I understand, but I would... (6 Replies)
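For reference, the five cron fields are minute, hour, day-of-month, month, and day-of-week. The entry `* 7 * * 5` fires every minute from 7:00 to 7:59 on Fridays; pinning the minute field makes it run once:

```shell
# minute hour day-of-month month day-of-week  command
0 7 * * 5 /path/to/script.sh    # 7:00 am every Friday (script path is illustrative)
```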
Dear all,
I have the following case: I need to transfer a group of files from one server to another. When the size of any of these files reaches a specified value (e.g. 10MB), I need to transfer it to another server. My problem is that I don't know how to determine when the size of the file... (1 Reply)
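The size check can be done with `find -size`. A minimal sketch: it builds two sample files in a temporary directory and prints the ones that have crossed the 10 MB threshold; the actual transfer is left as a comment because the remote host is an assumption:

```shell
#!/bin/sh
# Sketch: list files that have reached ~10 MB; the real transfer (scp/ftp)
# is commented out because the destination host is hypothetical.
WATCH_DIR=$(mktemp -d)
dd if=/dev/zero of="$WATCH_DIR/big.log"   bs=1048576 count=11 2>/dev/null
dd if=/dev/zero of="$WATCH_DIR/small.log" bs=1024    count=1  2>/dev/null
# -size +10M matches files strictly larger than 10 MiB (GNU/BSD find)
find "$WATCH_DIR" -type f -size +10M | while read -r f; do
    echo "would transfer: $f"
    # scp "$f" user@otherserver:/incoming/   # hypothetical destination
done
```

Run from cron every few minutes, a loop like this catches each file shortly after it crosses the threshold.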
The crontab entry looks like this...
29 13 * * * /usr/bin/mk-find --printf "%D\ %N\n" |
This is just part of a longer statement that contains an apostrophe. I would like to know why it works at the command prompt but fails as a cron job.
The error is...
/bin/sh: -c: line 0: unexpected EOF... (3 Replies)
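The usual culprit here is cron's special treatment of `%`: in a crontab line, the first unescaped `%` ends the command (everything after it becomes the command's standard input) and each `%` is converted to a newline. That truncates the quoted format string, so the shell sees an unterminated quote and reports "unexpected EOF". Escaping each `%` with a backslash keeps the line intact:

```shell
# Unescaped % is cut out of the command by cron, producing "unexpected EOF";
# escape every % in the printf format (rest of the long statement elided):
29 13 * * * /usr/bin/mk-find --printf "\%D\ \%N\n" |
```

Alternatively, move the whole pipeline into a script file and call that script from the crontab, avoiding `%` in the crontab entirely.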
Hi,
I have following directory structure
Media (Inside media directory I have two folders namely videos and images)
-->videos
-->images
Inside media directory I have some video files with extension .mp4 and images with extension of .jpg and .jpeg
I want to write a cron job which will... (3 Replies)
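A sketch of the sorting step such a cron job could perform; the media tree here is a temporary stand-in created only to make the example self-contained:

```shell
#!/bin/sh
# Sketch: move loose .mp4 files into videos/ and .jpg/.jpeg files into
# images/ under a media directory (a temporary stand-in for the real path).
MEDIA=$(mktemp -d)
mkdir -p "$MEDIA/videos" "$MEDIA/images"
touch "$MEDIA/clip.mp4" "$MEDIA/photo.jpg" "$MEDIA/scan.jpeg"   # sample files
for f in "$MEDIA"/*.mp4; do
    [ -e "$f" ] && mv "$f" "$MEDIA/videos/"
done
for f in "$MEDIA"/*.jpg "$MEDIA"/*.jpeg; do
    [ -e "$f" ] && mv "$f" "$MEDIA/images/"
done
ls "$MEDIA/videos" "$MEDIA/images"
```

Scheduled from cron (for example `*/5 * * * *` for every five minutes), this keeps the two subdirectories current.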
Hi All,
I'm a beginner in Unix, and I have no idea how to set up a script to run as a cron job every 5 seconds. I also want to write the output to a text file automatically. My script is named countsys.sh and my text file is abc.txt. (6 Replies)
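Standard cron cannot schedule anything finer than one minute, so the common workaround is a single one-minute cron entry whose script loops and sleeps inside that minute. A sketch of the looping wrapper; the sleep and the real command are commented out so the loop structure runs instantly here (countsys.sh and abc.txt come from the question):

```shell
#!/bin/sh
# Run the payload 12 times, 5 seconds apart, covering one minute.
# Cron would invoke this wrapper with:  * * * * * /path/to/wrapper.sh
i=0
while [ "$i" -lt 12 ]; do
    # sh countsys.sh >> abc.txt   # append each run's output to the text file
    # sleep 5                     # wait 5 seconds between runs
    i=$((i + 1))
done
echo "iterations: $i"
```

With the two commented lines enabled, the wrapper performs 12 runs per minute, i.e. one every 5 seconds.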
We need to configure Autosys so that when a job fails continuously 3 times, we call another job.
Is this possible in Autosys, or can anyone advice on the alternative. (2 Replies)
Hello,
I have written a cron job to automate the sftp of files using key authentication.
I want to add a timestamp and the name of each file sent to a log file, appending these details to the same file each time files are sent, and if possible include whether the files were sent successfully or not.... (3 Replies)
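The logging part can be sketched as one timestamped status line per transfer, appended to a shared log. The sftp call is commented out here (host, batch file, and file name are assumptions), so this run records a SUCCESS entry for the sample file:

```shell
#!/bin/sh
# Sketch: append "date time filename SUCCESS|FAILURE" to a shared log
# after each transfer. The transfer itself is hypothetical.
LOG=$(mktemp)
FILE="report.csv"
true                                   # stands in for the transfer's exit code
# sftp -b batchfile user@remotehost    # key-authenticated transfer
status=$?                              # exit code of the transfer
if [ "$status" -eq 0 ]; then result=SUCCESS; else result=FAILURE; fi
printf '%s %s %s\n' "$(date '+%Y-%m-%d %H:%M:%S')" "$FILE" "$result" >> "$LOG"
cat "$LOG"
```

Because the script appends (`>>`) rather than truncates, every cron run adds its line to the same history file.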
Discussion started by: KidKoder
LEARN ABOUT DEBIAN
s3mkbucket
S3MKBUCKET(1p) User Contributed Perl Documentation S3MKBUCKET(1p)NAME
s3mkbucket - Create Amazon AWS S3 buckets
SYNOPSIS
s3mkbucket [options] [bucket ...]
Options:
--access-key AWS Access Key ID
--secret-key AWS Secret Access Key
--acl-short private|public-read|public-read-write|authenticated-read
Environment:
AWS_ACCESS_KEY_ID
AWS_ACCESS_KEY_SECRET
OPTIONS --help Prints a brief help message and exits.
--man Prints the manual page and exits.
--verbose
Print a message for each created bucket.
--access-key and --secret-key
Specify the "AWS Access Key Identifiers" for the AWS account. --access-key is the "Access Key ID", and --secret-key is the "Secret
Access Key". These are effectively the "username" and "password" to the AWS account, and should be kept confidential.
The access keys MUST be specified, either via these command line parameters, or via the AWS_ACCESS_KEY_ID and AWS_ACCESS_KEY_SECRET
environment variables.
Specifying them on the command line overrides the environment variables.
--secure
Uses SSL/TLS HTTPS to communicate with the AWS service, instead of HTTP.
--acl-short
Apply a "canned ACL" to the bucket when it is created. To set a more complex ACL, use the "s3acl" tool after the bucket is
created.
The following canned ACLs are currently defined by S3:
private Owner gets "FULL_CONTROL". No one else has any access rights. This is the default.
public-read
Owner gets "FULL_CONTROL". The anonymous principal is granted "READ" access.
public-read-write
Owner gets "FULL_CONTROL". The anonymous principal is granted "READ" and "WRITE" access. This is a useful policy to apply
to a bucket, if you intend for any anonymous user to PUT objects into the bucket.
authenticated-read
Owner gets "FULL_CONTROL". Any principal authenticated as a registered Amazon S3 user is granted "READ" access.
bucket One or more bucket names. As many as possible will be created.
A user may have no more than 100 buckets.
Bucket names must be between 3 and 255 characters long, and can only contain alphanumeric characters, underscore, period, and dash.
Bucket names are case sensitive. Buckets with names containing uppercase characters or underscores are not accessible using the
virtual hosting method.
Buckets are unique in a global namespace. That means if someone has created a bucket with a given name, someone else cannot create
another bucket with the same name.
If a bucket name begins with one or more dashes, it might be mistaken for a command line option. If this is the case, separate the
command line options from the bucket names with two dashes, like so:
s3mkbucket --verbose -- --bucketname
ENVIRONMENT VARIABLES
AWS_ACCESS_KEY_ID and AWS_ACCESS_KEY_SECRET
Specify the "AWS Access Key Identifiers" for the AWS account. AWS_ACCESS_KEY_ID contains the "Access Key ID", and
AWS_ACCESS_KEY_SECRET contains the "Secret Access Key". These are effectively the "username" and "password" to the AWS service,
and should be kept confidential.
The access keys MUST be specified, either via these environment variables, or via the --access-key and --secret-key command line
parameters.
If the command line parameters are set, they override these environment variables.
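A hypothetical session illustrating that precedence; the key values and bucket name are placeholders, not real credentials:

```shell
# Keys supplied via the environment are used by default:
export AWS_ACCESS_KEY_ID=AKIDEXAMPLE
export AWS_ACCESS_KEY_SECRET=wJalrEXAMPLEKEY
s3mkbucket --verbose examplebucket

# Keys given on the command line take precedence over the environment:
s3mkbucket --verbose --access-key OTHERID --secret-key OTHERSECRET examplebucket
```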
CONFIGURATION FILE
The configuration options will be read from the file "~/.s3-tools" if it exists. The format is the same as the command line options with
one option per line. For example, the file could contain:
--access-key <AWS access key>
--secret-key <AWS secret key>
--secure
This example configuration file would specify the AWS access keys and that a secure connection using HTTPS should be used for all
communications.
DESCRIPTION
Create buckets in the Amazon Simple Storage Service (S3).
BUGS
Report bugs to Mark Atwood mark@fallenpegasus.com.
Making a bucket that already exists and is owned by the user does not fail. It is unclear whether this is a bug or not.
Occasionally the S3 service will randomly fail for no externally apparent reason. When that happens, this tool should retry, with a delay
and a backoff.
Access to the S3 service can be authenticated with a X.509 certificate, instead of via the "AWS Access Key Identifiers". This tool should
support that.
It might be useful to be able to specify the "AWS Access Key Identifiers" in the user's "~/.netrc" file. This tool should support that.
Errors and warnings are very "Perl-ish", and can be confusing.
AUTHOR
Written by Mark Atwood mark@fallenpegasus.com.
Many thanks to Wotan LLC <http://wotanllc.com>, for supporting the development of these S3 tools.
Many thanks to the Amazon AWS engineers for developing S3.
SEE ALSO
These tools use the Net::Amazon::S3 Perl module.
The Amazon Simple Storage Service (S3) is documented at <http://aws.amazon.com/s3>.
perl v5.10.0 2009-03-08 S3MKBUCKET(1p)