Amazon CloudFront / S3 Small Object Test Results
Post 302387407 by linuxpenguin on Friday 15th of January 2010 04:00:17 PM
Very interesting. It would be worth testing with multiple instances in the same region (e.g., us-east-1c) versus instances in different regions, although the first case sounds more interesting. That would mean that if I had Amazon instances in, say, Virginia and New Jersey, I should ideally get back the same response time.
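
A quick way to put numbers on that: a minimal Perl sketch that times repeated GETs of one small object. The URL below is hypothetical; point it at your own S3 or CloudFront object, run it from instances in different zones or regions, and compare the timings.

#!/usr/bin/perl
# Rough latency probe: fetch a small object repeatedly and print how
# long each request took. Not from the original test; just a sketch.
use strict;
use warnings;
use LWP::UserAgent;
use Time::HiRes qw(gettimeofday tv_interval);

my $url = 'http://mybucket.s3.amazonaws.com/small-object.txt';  # hypothetical
my $ua  = LWP::UserAgent->new(timeout => 10);

for my $i (1 .. 10) {
    my $t0  = [gettimeofday];
    my $res = $ua->get($url);
    printf "request %2d: %6.1f ms (%s)\n",
        $i, 1000 * tv_interval($t0), $res->status_line;
}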
 

3 More Discussions You Might Find Interesting

1. UNIX for Advanced & Expert Users

Printer Error (The Object Instance Test Does Not Exist)

Hello, I need some help with setting up a high-velocity impact printer in UNIX SCO 5.05. The printer is attached via a parallel port to a PC (host), and the host uses tunemul to access UNIX (this reference is just to ask you whether this is a local or remote connection, just to be sure), so, I... (2 Replies)
Discussion started by: jav_v

2. Virtualization and Cloud Computing

CEP as a Service (CEPaaS) with MapReduce on Amazon EC2 and Amazon S3

Tim Bass 11-25-2008 01:02 PM Just as I was starting to worry that the complex event processing community had been captured by RDBMS pirates off the coast of Somalia, I rediscovered a new core blackboard architecture component, Hadoop. Hadoop is a framework for building applications on large... (0 Replies)
Discussion started by: Linux Bot

3. Shell Programming and Scripting

PERL - traverse sub directories and get test case results

Hello, I need help creating a PERL script for parsing test result files to get the results (pass or fail). Each test case execution generates a directory with a few files, among which we are interested in the .result file. Let's say Testing is the home directory. If I executed 2 test cases, it will... (4 Replies) (see the sketch after this list)
Discussion started by: ravi.videla
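
For the third discussion, a minimal sketch of one way to do it in Perl with File::Find. The directory layout and the format of the .result files are assumptions.

#!/usr/bin/perl
# Walk a test results tree and report pass/fail from each .result file.
use strict;
use warnings;
use File::Find;

my $top = shift @ARGV || 'Testing';   # top of the results tree

find(sub {
    return unless /\.result$/;        # only look at .result files
    open my $fh, '<', $_
        or do { warn "open $File::Find::name: $!\n"; return };
    while (my $line = <$fh>) {
        if ($line =~ /\b(pass|fail)\b/i) {   # report the first match
            print "$File::Find::name: \L$1\n";
            last;
        }
    }
    close $fh;
}, $top);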
S3PUT(1p)                    User Contributed Perl Documentation                    S3PUT(1p)

NAME
       s3put - Write an S3 item

SYNOPSIS
       s3put [options] [ bucket/item ...]

       Options:
              --access-key    AWS Access Key ID
              --secret-key    AWS Secret Access Key

       Environment:
              AWS_ACCESS_KEY_ID
              AWS_ACCESS_KEY_SECRET

OPTIONS
       --help Prints a brief help message and exits.

       --man  Prints the manual page and exits.

       --verbose
              Output what is being done as it is done.

       --access-key and --secret-key
              Specify the "AWS Access Key Identifiers" for the AWS account. --access-key is the "Access Key ID", and --secret-key is the "Secret Access Key". These are effectively the "username" and "password" to the AWS account, and should be kept confidential. The access keys MUST be specified, either via these command line parameters or via the AWS_ACCESS_KEY_ID and AWS_ACCESS_KEY_SECRET environment variables. Specifying them on the command line overrides the environment variables.

       --secure
              Uses SSL/TLS HTTPS to communicate with the AWS service, instead of HTTP.
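
       For example, a typical invocation might look like this (the bucket and item names are hypothetical, and the keys are assumed to be in the environment):

              s3put --verbose mybucket/motd.txt < /etc/motd

       This reads /etc/motd on stdin and writes it to the item "motd.txt" in the bucket "mybucket".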
ENVIRONMENT VARIABLES
       AWS_ACCESS_KEY_ID and AWS_ACCESS_KEY_SECRET
              Specify the "AWS Access Key Identifiers" for the AWS account. AWS_ACCESS_KEY_ID contains the "Access Key ID", and AWS_ACCESS_KEY_SECRET contains the "Secret Access Key". These are effectively the "username" and "password" to the AWS service, and should be kept confidential. The access keys MUST be specified, either via these environment variables or via the --access-key and --secret-key command line parameters. If the command line parameters are set, they override these environment variables.

CONFIGURATION FILE
       The configuration options will be read from the file "~/.s3-tools", if it exists. The format is the same as the command line options, with one option per line. For example, the file could contain:

              --access-key <AWS access key>
              --secret-key <AWS secret key>
              --secure

       This example configuration file specifies the AWS access keys and that a secure connection using HTTPS should be used for all communications.

DESCRIPTION
       Reads stdin and writes it to an S3 item.
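
       As a rough illustration (a sketch only, not the tool's actual code), the core operation using the Net::Amazon::S3 module mentioned in SEE ALSO, with a hypothetical bucket and item name:

       #!/usr/bin/perl
       use strict;
       use warnings;
       use Net::Amazon::S3;

       my $s3 = Net::Amazon::S3->new({
           aws_access_key_id     => $ENV{AWS_ACCESS_KEY_ID},
           aws_secret_access_key => $ENV{AWS_ACCESS_KEY_SECRET},
       });

       my $data   = do { local $/; <STDIN> };   # slurp all of stdin
       my $bucket = $s3->bucket('mybucket');    # hypothetical bucket
       $bucket->add_key('item', $data)          # write the S3 item
           or die "put failed: " . $s3->errstr . "\n";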
BUGS
       Report bugs to Mark Atwood <mark@fallenpegasus.com>.

       Occasionally the S3 service will randomly fail for no externally apparent reason. When that happens, this tool should retry, with a delay and a backoff.

       Access to the S3 service can be authenticated with an X.509 certificate, instead of via the "AWS Access Key Identifiers". This tool should support that.

       It might be useful to be able to specify the "AWS Access Key Identifiers" in the user's "~/.netrc" file. This tool should support that.

       Errors and warnings are very "Perl-ish" and can be confusing. Trying to write to a bucket that does not exist or is not accessible by the user generates less-than-helpful error messages. Trying to put a bucket instead of an item is silently skipped.
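
       The retry behavior described above is not implemented; a minimal sketch (an assumption, not s3put's actual code) of what it might look like, wrapping any S3 operation passed in as a code reference:

       use strict;
       use warnings;

       sub with_retries {
           my ($op, $max_tries) = @_;
           my $delay = 1;                   # seconds before the first retry
           for my $try (1 .. $max_tries) {
               return 1 if $op->();         # success: stop retrying
               last if $try == $max_tries;  # out of attempts
               warn "attempt $try failed, retrying in ${delay}s\n";
               sleep $delay;
               $delay *= 2;                 # back off: 1, 2, 4, 8, ... seconds
           }
           return 0;                        # caller decides how to fail
       }

       # For example, wrapping the add_key call from the DESCRIPTION sketch:
       #   with_retries(sub { $bucket->add_key('item', $data) }, 5)
       #       or die "giving up after 5 tries\n";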
TODO
       option to read from files instead of stdin
       use the fs mtime to set the HTTP Last-Modified
       option to read the filenames to read from, from stdin
       option to read from a tar file stream, for multiple items
       option to magically guess the MIME type
       option to use extended file attributes for metadata
       option to have a progress bar
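
       As a rough illustration of two of these items (an assumption, not implemented in s3put): deriving Last-Modified from the file's mtime and guessing the MIME type from the filename:

       #!/usr/bin/perl
       use strict;
       use warnings;
       use HTTP::Date qw(time2str);
       use LWP::MediaTypes qw(guess_media_type);

       my $file = shift @ARGV or die "usage: $0 <file>\n";
       my @st   = stat $file  or die "stat $file: $!\n";

       print 'Last-Modified: ', time2str($st[9]), "\n";   # HTTP date from mtime
       print 'Content-Type: ', scalar guess_media_type($file), "\n";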
AUTHOR
       Written by Mark Atwood <mark@fallenpegasus.com>.

       Many thanks to Wotan LLC <http://wotanllc.com>, for supporting the development of these S3 tools.

       Many thanks to the Amazon AWS engineers for developing S3.

SEE ALSO
       These tools use the Net::Amazon::S3 Perl module.

       The Amazon Simple Storage Service (S3) is documented at <http://aws.amazon.com/s3>.

perl v5.10.0                              2009-03-08                              S3PUT(1p)