Thank you for the reply, migurus. I edited the crontab as you suggested and am getting the error:
It seems the file is actually not being created when run as a cron job. When run manually from the command line, i.e.:
the file gets created correctly.
I did an ls on the directory after the cron job ran, and the file 'awsUserReportFile' is not being created....
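A common culprit when a script works interactively but not under cron (an assumption here, since the crontab entry isn't shown) is cron's minimal environment and different working directory: relative paths that resolve at the shell prompt can silently point somewhere else under cron. A sketch of a crontab entry using absolute paths everywhere, with placeholder paths:

```shell
# cron runs with a minimal PATH and its own working directory, so
# relative paths that work interactively can silently fail. Using
# absolute paths for the script and its output is a common fix
# (both paths below are placeholders, not taken from the post):
29 13 * * * /home/user/bin/awsReport.sh > /home/user/reports/awsUserReportFile 2>&1
```

Redirecting stderr (`2>&1`) into the output file also captures any error message cron would otherwise mail or discard.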
I have a script running as a cron job on machine A. This script FTPs some files every day from machine A to machine B and mails me the status. It works fine for some days... and then suddenly stops running. Looking at the log files, I see that the script itself was not invoked by cron on those... (4 Replies)
Hi All,
I am trying to automate a process and need to create a Unix script for it. I have a scenario in which I need to automate a file movement. Below are the steps I need to automate.
1. Check whether a file (not a fixed name; a pattern search for a file, say 'E*.dat') is present in a... (2 Replies)
Hello,
I have searched and searched Google for this: I want my web server to run a PHP file every day automatically. How do I go about doing this? PHP is installed as an Apache module, not CGI.
Thank you! (3 Replies)
Hi everyone:
I'm trying to make a CRON job that will execute Fridays at 7am. I have the following:
* 7 * * 5
I've been studying up on CRON and I know to have this in a file and then "crontab filename.txt" to add it to the CRON job list.
The CRON part I believe I understand, but I would... (6 Replies)
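For reference, the five cron fields are minute, hour, day-of-month, month, and day-of-week, so `* 7 * * 5` matches every minute of the 7 o'clock hour on Fridays. Pinning the minute field to 0 fires the job exactly once, at 7:00. The script path below is a placeholder:

```shell
# crontab entry: run once at 07:00 every Friday
# fields: minute=0, hour=7, any day-of-month, any month, weekday=5 (Friday)
0 7 * * 5 /path/to/script.sh
```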
Dear all,
I have the following case: I need to transfer a group of files from one server to another... when the size of any of these files reaches a specified value (e.g. 10MB), I need to transfer it to another server... my problem is that I don't know how to determine when the size of the file... (1 Reply)
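One way to sketch the size check is with `find -size`, which can select files strictly larger than a threshold. The directory, the 10 MB threshold in kilobytes, and the `scp` destination below are assumptions for illustration:

```shell
#!/bin/sh
# Sketch: list regular files larger than a size threshold so they can
# be handed to a transfer step. Paths and destination are placeholders.

find_large() {
    # find_large <dir> <size-in-kilobytes>
    # prints every regular file strictly larger than the threshold
    find "$1" -type f -size +"$2"k
}

# Example: transfer every file in /data/outgoing that exceeds 10 MB
# (10240 KB). The scp destination is hypothetical.
# find_large /data/outgoing 10240 | while read -r f; do
#     scp "$f" user@otherserver:/incoming/
# done
```

Run from cron every few minutes, this turns "transfer when the file reaches 10MB" into a simple periodic poll.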
The crontab entry looks like this...
29 13 * * * /usr/bin/mk-find --printf "%D\ %N\n" |
This is just part of a long statement that has an apostrophe in it. I would like to know why it works at the command prompt but fails when set as a cron job?
The error is...
/bin/sh: -c: line 0: unexpected EOF... (3 Replies)
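A likely cause (assuming a Vixie-style cron, as on most Linux systems) is not the apostrophe but the `%` characters: in a crontab line, an unescaped `%` is converted to a newline and everything after the first `%` is fed to the command as standard input. The shell then sees a truncated command, hence "unexpected EOF". Escaping each `%` with a backslash is the usual fix; the rest of the pipeline stays elided here, as in the post:

```shell
# In a crontab, each unescaped % becomes a newline, truncating the
# command line that /bin/sh receives. Escape them with backslashes:
29 13 * * * /usr/bin/mk-find --printf "\%D\ \%N\n" |
```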
Hi,
I have the following directory structure:
Media (inside the Media directory I have two folders, namely videos and images)
-->videos
-->images
Inside the Media directory I have some video files with the extension .mp4 and images with the extensions .jpg and .jpeg.
I want to write a cron job which will... (3 Replies)
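The sorting step can be sketched as a small shell function driven by `find`, restricted to the top level of the directory so already-sorted files are left alone. The `Media` path and the hourly schedule below are assumptions:

```shell
#!/bin/sh
# Sketch: sort files at the top of a media directory into videos/ and
# images/ subdirectories, matching the layout described in the post.

sort_media() {
    dir="$1"
    mkdir -p "$dir/videos" "$dir/images"
    # -maxdepth 1 keeps files already inside videos/ and images/ untouched
    find "$dir" -maxdepth 1 -type f -name '*.mp4' \
        -exec mv {} "$dir/videos/" \;
    find "$dir" -maxdepth 1 -type f \( -name '*.jpg' -o -name '*.jpeg' \) \
        -exec mv {} "$dir/images/" \;
}

# Example crontab line running the sort hourly (path is hypothetical):
# 0 * * * * /home/user/sort_media.sh /home/user/Media
```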
Hi All,
I am a beginner in Unix, and I have no idea how to set up a script file as a cron job every 5 seconds. I also want to send the output automatically to a text file. My script name is countsys.sh and my text file is abc.txt. (6 Replies)
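Cron's finest granularity is one minute, so a 5-second cadence needs a wrapper: start a script once per minute from cron and loop inside it. A sketch using a small helper function; `countsys.sh` and `abc.txt` are the names from the post, the wrapper path is a placeholder:

```shell
#!/bin/sh
# Sketch: run a command repeatedly at a sub-minute interval, since
# cron itself cannot schedule anything finer than once per minute.

run_every() {
    # run_every <seconds> <count> <command...>
    secs="$1"; count="$2"; shift 2
    n=0
    while [ "$n" -lt "$count" ]; do
        "$@"
        n=$((n + 1))
        if [ "$n" -lt "$count" ]; then sleep "$secs"; fi
    done
}

# Crontab entry starting the wrapper once per minute (path hypothetical):
# * * * * * /home/user/every5.sh
# Inside every5.sh, 12 runs x 5 seconds cover the minute, appending
# each run's output to the text file:
# run_every 5 12 sh countsys.sh >> abc.txt 2>&1
```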
We need to configure Autosys so that when a job fails continuously 3 times, we call another job.
Is this possible in Autosys, or can anyone advise on an alternative? (2 Replies)
Hello,
I have written a cron job to automate the SFTP of files using key authentication.
I wanted to add a timestamp and the name of each file sent to a log file, appending these details to the same file each time files are sent, and if possible include whether the files were sent successfully or not.... (3 Replies)
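Appending a timestamped status line per transfer can be factored into a small logging helper that records the file name and the exit status of the transfer command. The log file path is an assumption:

```shell
#!/bin/sh
# Sketch: append "timestamp filename OK/FAILED" lines to a shared log.
# The default log path is a placeholder.
LOGFILE="${LOGFILE:-/var/log/sftp_transfer.log}"

log_transfer() {
    # log_transfer <file> <exit-status>
    file="$1"; status="$2"
    if [ "$status" -eq 0 ]; then
        result="OK"
    else
        result="FAILED($status)"
    fi
    printf '%s %s %s\n' "$(date '+%Y-%m-%d %H:%M:%S')" "$file" "$result" \
        >> "$LOGFILE"
}

# Usage inside the cron script, right after the transfer:
# sftp -b batchfile user@host
# log_transfer "report.csv" "$?"
```

Recording `$?` immediately after the `sftp` call is what captures whether the transfer batch as a whole succeeded.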
S3RMBUCKET(1p)          User Contributed Perl Documentation          S3RMBUCKET(1p)

NAME
s3rmbucket - Delete Amazon AWS S3 buckets
SYNOPSIS
s3rmbucket [options] [bucket ...]
Options:
--access-key AWS Access Key ID
--secret-key AWS Secret Access Key
Environment:
AWS_ACCESS_KEY_ID
AWS_ACCESS_KEY_SECRET
OPTIONS
--help Prints a brief help message and exits.
--man Prints the manual page and exits.
--verbose
Prints a message for each deleted bucket.
--access-key and --secret-key
Specify the "AWS Access Key Identifiers" for the AWS account. --access-key is the "Access Key ID", and --secret-key is the "Secret
Access Key". These are effectively the "username" and "password" to the AWS account, and should be kept confidential.
The access keys MUST be specified, either via these command line parameters, or via the AWS_ACCESS_KEY_ID and AWS_ACCESS_KEY_SECRET
environment variables.
Specifying them on the command line overrides the environment variables.
--secure
Uses SSL/TLS HTTPS to communicate with the AWS service, instead of HTTP.
bucket
One or more bucket names. As many as possible will be deleted.
A bucket may only be deleted if it is empty.
Bucket names must be between 3 and 255 characters long, and can only contain alphanumeric characters, underscore, period, and dash.
Bucket names are case sensitive.
If a bucket name begins with one or more dashes, it might be mistaken for a command line option. If this is the case, separate the
command line options from the bucket names with two dashes, like so:
s3rmbucket --verbose -- --bucketname
ENVIRONMENT VARIABLES
AWS_ACCESS_KEY_ID and AWS_ACCESS_KEY_SECRET
Specify the "AWS Access Key Identifiers" for the AWS account. AWS_ACCESS_KEY_ID contains the "Access Key ID", and
AWS_ACCESS_KEY_SECRET contains the "Secret Access Key". These are effectively the "username" and "password" to the AWS service,
and should be kept confidential.
The access keys MUST be specified, either via these environment variables, or via the --access-key and --secret-key command line
parameters.
If the command line parameters are set, they override these environment variables.
CONFIGURATION FILE
The configuration options will be read from the file "~/.s3-tools" if it exists. The format is the same as the command line options with
one option per line. For example, the file could contain:
--access-key <AWS access key>
--secret-key <AWS secret key>
--secure
This example configuration file would specify the AWS access keys and that a secure connection using HTTPS should be used for all
communications.
DESCRIPTION
Delete buckets in the Amazon Simple Storage Service (S3). A bucket may only be deleted if it is empty.
BUGS
Report bugs to Mark Atwood mark@fallenpegasus.com.
Occasionally the S3 service will randomly fail for no externally apparent reason. When that happens, this tool should retry, with a delay
and a backoff.
Access to the S3 service can be authenticated with a X.509 certificate, instead of via the "AWS Access Key Identifiers". This tool should
support that.
It might be useful to be able to specify the "AWS Access Key Identifiers" in the user's "~/.netrc" file. This tool should support that.
Some errors and warnings are very "Perl-ish", and can be confusing.
A bucket can only be deleted if it is empty. It might be useful to add an option to delete every item in the bucket before then deleting
it, similar to the semantics of the "rm -rf dir" command. This tool should support that.
AUTHOR
Written by Mark Atwood mark@fallenpegasus.com.
Many thanks to Wotan LLC <http://wotanllc.com>, for supporting the development of these S3 tools.
Many thanks to the Amazon AWS engineers for developing S3.
SEE ALSO
These tools use the Net::Amazon::S3 Perl module.
The Amazon Simple Storage Service (S3) is documented at <http://aws.amazon.com/s3>.
perl v5.10.0 2009-03-08 S3RMBUCKET(1p)