How can I make a loop read input from a file part by part?
Hi All,
We have a VDI infrastructure in AWS (Amazon WorkSpaces), and we're planning to automate the provisioning of workspaces. Going to the GUI console and launching workspaces by selecting individual users is time consuming, so I want to create them in batches from the AWS CLI (installed on CentOS 7). To achieve this, I've written the shell script below to launch the workspaces from the CLI with minimal effort. Please note that ws-userlist.txt contains a list of 60 users, one username per line.
Now, there is a limitation with the workspace creation command above: it can build a maximum of 25 workspaces per execution. Is there any way to make the loop read ws-userlist.txt so that it takes the first 25 users as input for the command and executes it; once that finishes, the loop should go back to the input file, read the next set of 25 users, and run the command again; and finally read the last 10 usernames and run the command once more?
Please note, the username count can vary. Sometimes I may have a list of 30, sometimes 55.
I have basic knowledge of shell scripting and need your expert help on this.
Thanks in advance.
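One portable way to do this is to accumulate usernames in the positional parameters and flush a batch every 25 names. This is only a sketch: `launch_batch` below is a placeholder for your real `aws workspaces create-workspaces` invocation, and the demo input stands in for your actual ws-userlist.txt.

```shell
#!/bin/sh
# Sketch: read a user list in batches of up to 25 names.
# launch_batch is a stand-in for the real AWS CLI call; here it just
# logs which users each batch would create.

BATCH_SIZE=25
USERLIST=ws-userlist.txt

# demo input: 60 fake usernames, one per line
awk 'BEGIN { for (i = 1; i <= 60; i++) print "user" i }' > "$USERLIST"

launch_batch() {
    echo "$# users: $*" >> batches.log
}

: > batches.log
count=0
set --                                  # clear the positional parameters
while IFS= read -r user; do
    [ -n "$user" ] || continue          # skip blank lines
    set -- "$@" "$user"
    count=$((count + 1))
    if [ "$count" -eq "$BATCH_SIZE" ]; then
        launch_batch "$@"
        set --
        count=0
    fi
done < "$USERLIST"

# flush the final partial batch (the last 10 of 60 here)
[ "$count" -gt 0 ] && launch_batch "$@"
```

An alternative is `split -l 25 ws-userlist.txt batch.` to cut the list into 25-line chunk files first, then loop over the chunks; the in-shell approach above avoids the temporary files.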
Hi,
I just started learning Unix on my own.
I have a question:
What command can I use when I need to read part of a file into another file?
I remember I saw it somewhere but I don't know what it is.
Thanks (3 Replies)
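For the record, the usual tools for this are head, tail, and sed; a quick sketch (the filenames below are just for illustration):

```shell
# demo input file, five lines
printf '%s\n' one two three four five > infile.txt

head -n 2 infile.txt     > first2.txt    # first 2 lines into another file
tail -n 2 infile.txt     > last2.txt     # last 2 lines
sed -n '2,4p' infile.txt > middle.txt    # lines 2 through 4
```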
Hi there,
I am a little confused about the following issue.
I have a file which has the following header: IMSHRATE_043008_101016
A sample detail record is: 9820101 A982005000CAVG030108000000000000010169000MAR 2008
9820102 MAR 2008 D030108
... (1 Reply)
OK, I am brand new to UNIX and I am trying to learn a cross between basic scripting and database use. I got some ideas off the net for simple ways to learn UNIX. I am working on creating a simple phone book program that allows me to enter our employees from work into a phone book text... (0 Replies)
For a field format such as AAL1001_MD82, how do I select (and use in an if statement) only the last four characters (in this case MD82) or the first three characters (in this case AAL)?
For instance, how do I do the following: if the first three characters of $x == yyy, then ... (5 Replies)
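A POSIX-sh sketch of both extractions: `cut` for the fixed-width prefix, and parameter expansion for the part after the underscore (the field value is taken from the example above):

```shell
x=AAL1001_MD82

prefix=$(printf '%s' "$x" | cut -c1-3)   # first three characters: AAL
suffix=${x#*_}                           # everything after the "_": MD82

if [ "$prefix" = "AAL" ]; then
    echo "carrier code matched"
fi
if [ "$suffix" = "MD82" ]; then
    echo "aircraft type matched"
fi
```

In bash or ksh93 you could also use `${x:0:3}` for the prefix directly.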
Hi Guys
I need a shell script which reads a log file and inserts part of each line into a database. Some sample lines from the file are below.
20091112091359 MED_RQACK : user_data=60173054304,100232120,20091112091359,;ask_status=0;ask_reason=OK;msg_id=20091112091319... (5 Replies)
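Pulling the fields out of a line like that can be sketched with parameter expansion, awk, and sed; the field names come from the sample line, and the final echo is where a real script would build its database INSERT statement:

```shell
line='20091112091359 MED_RQACK : user_data=60173054304,100232120,20091112091359,;ask_status=0;ask_reason=OK;msg_id=20091112091319'

ts=${line%% *}                                                # leading timestamp
event=$(printf '%s\n' "$line" | awk '{ print $2 }')           # message type
status=$(printf '%s\n' "$line" | sed -n 's/.*ask_status=\([^;]*\);.*/\1/p')

# a real script would feed these values into an INSERT statement here
echo "$ts $event status=$status"
```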
Hi,
I have 80 large files from which I want to get a specific value to use in a Bash script. First, I want to get the part of each file which contains this:
Name =A
xxxxxx
yyyyyy
zzzzzz
aaaaaa
bbbbbb
Value = 57
This is necessary because each file contains more lines which... (6 Replies)
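Assuming the section runs from a "Name =A" line to the next "Value =" line, sed can print the section and awk can pull out just the number. The sample file below simply mirrors the structure shown above:

```shell
# demo file mirroring the structure described above
cat > sample.txt <<'EOF'
unrelated line
Name =A
xxxxxx
yyyyyy
Value = 57
more unrelated lines
EOF

# print only the section from "Name =A" to the next "Value =" line
sed -n '/^Name =A/,/^Value *=/p' sample.txt

# extract just the number after "Value ="
value=$(awk '/^Name =A/ { f = 1 }
             f && /^Value *=/ { sub(/^Value *= */, ""); print; exit }' sample.txt)
echo "$value"
```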
I have files named with different prefixes. From each I want to extract the first line containing a specific string, and then print that line along with the prefix.
I've tried to do this with a while loop, but instead of printing the prefix I print the first line of the file twice.
Files:... (3 Replies)
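One way to get the first matching line per file, tagged with the file's prefix, avoids the while-loop pitfall entirely. The file names and the search string `ID:` below are made up for the demo, and the prefix is taken as everything before the first underscore:

```shell
# demo files with different prefixes
printf 'junk\nID: alpha\nID: beta\n' > prefixA_data.txt
printf 'ID: gamma\nmore junk\n'      > prefixB_data.txt

# for each file, print "<prefix>: <first matching line>"
for f in prefix*_data.txt; do
    match=$(grep 'ID:' "$f" | head -n 1)
    [ -n "$match" ] && echo "${f%%_*}: $match"
done > matches.txt

cat matches.txt
```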
Hello
Let me explain the scenario.
There is a config file that stores the main software variables:
file main.conf contents:
update="1"
log_login="0"
allow_ports=""
deny_ports="21,22,23"
and there is a file named ports.txt
file ports.txt contents:
25,26,27
i... (3 Replies)
I'm working on AIX 6.1, using the ksh shell.
The code below works fine in bash or ksh on Linux.
while IFS= read -r dirpath ; do
echo "Hi"
done <<<"$var"
However, any such while loop that reads its input from a file or variable using <<< fails on AIX with the error below:
Below... (2 Replies)
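AIX's default ksh (ksh88) does not support the `<<<` here-string at all. A portable rewrite feeds the variable through a here-document instead; a pipe into `while` would also work, but in many shells the loop then runs in a subshell and its variable assignments are lost. A sketch (the output file is just for illustration):

```shell
var='line one
line two'

# portable replacement for:  done <<<"$var"
while IFS= read -r dirpath; do
    echo "Hi $dirpath"
done > looped.txt <<EOF
$var
EOF

cat looped.txt
```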
Discussion started by: mohtashims (2 Replies)
S3LS(1p) User Contributed Perl Documentation S3LS(1p)NAME
s3ls - List S3 buckets and bucket contents
SYNOPSIS
s3ls [options]
s3ls [options] [ [ bucket | bucket/item ] ...]
Options:
--access-key AWS Access Key ID
--secret-key AWS Secret Access Key
--long
Environment:
AWS_ACCESS_KEY_ID
AWS_ACCESS_KEY_SECRET
OPTIONS
--help Prints a brief help message and exits.
--man Prints the manual page and exits.
--verbose
Output what is being done as it is done.
--access-key and --secret-key
Specify the "AWS Access Key Identifiers" for the AWS account. --access-key is the "Access Key ID", and --secret-key is the "Secret
Access Key". These are effectively the "username" and "password" to the AWS account, and should be kept confidential.
The access keys MUST be specified, either via these command line parameters, or via the AWS_ACCESS_KEY_ID and AWS_ACCESS_KEY_SECRET
environment variables.
Specifying them on the command line overrides the environment variables.
--secure
Uses SSL/TLS HTTPS to communicate with the AWS service, instead of HTTP.
--long
ENVIRONMENT VARIABLES
AWS_ACCESS_KEY_ID and AWS_ACCESS_KEY_SECRET
Specify the "AWS Access Key Identifiers" for the AWS account. AWS_ACCESS_KEY_ID contains the "Access Key ID", and
AWS_ACCESS_KEY_SECRET contains the "Secret Access Key". These are effectively the "username" and "password" to the AWS service,
and should be kept confidential.
The access keys MUST be specified, either via these environment variables, or via the --access-key and --secret-key command line
parameters.
If the command line parameters are set, they override these environment variables.
CONFIGURATION FILE
The configuration options will be read from the file "~/.s3-tools" if it exists. The format is the same as the command line options with
one option per line. For example, the file could contain:
--access-key <AWS access key>
--secret-key <AWS secret key>
--secure
This example configuration file would specify the AWS access keys and that a secure connection using HTTPS should be used for all
communications.
DESCRIPTION
Lists the buckets owned by the user, or all the item keys in a given bucket, or attributes associated with a given item.
If no buckets or bucket/itemkey is specified on the command line, all the buckets owned by the user are listed.
If the "--long" option is specified, the creation date of each bucket is also output.
If a bucket name is specified on the command line, all the item keys in that bucket are listed.
If the "--long" option is specified, the ID and display string of the item owner, the creation date, the MD5, and the size of the item are
also output.
If a bucket name and an item key, separated by a slash character, are specified on the command line, then the bucket name and the item key
are output. This is useful to check that the item actually exists.
If the "--long" option is specified, all the HTTP attributes of the item are also output. This will include Content-Length, Content-Type,
ETag (which is the MD5 of the item contents), and Last-Modified.
It may also include the HTTP attributes Content-Language, Expires, Cache-Control, Content-Disposition, and Content-Encoding.
It will also include any x-amz- metadata headers.
BUGS
Report bugs to Mark Atwood mark@fallenpegasus.com.
Occasionally the S3 service will randomly fail for no externally apparent reason. When that happens, this tool should retry, with a delay
and a backoff.
Access to the S3 service can be authenticated with a X.509 certificate, instead of via the "AWS Access Key Identifiers". This tool should
support that.
It might be useful to be able to specify the "AWS Access Key Identifiers" in the user's "~/.netrc" file. This tool should support that.
Errors and warnings are very "Perl-ish", and can be confusing.
Trying to access a bucket or item that does not exist or is not accessible by the user generates less than helpful error messages.
This tool does not efficiently handle listing huge buckets, as it downloads and parses the entire bucket listing, before it outputs
anything.
This tool does not take advantage of the prefix, delimiter, and hierarchy features of the AWS S3 key listing API.
AUTHOR
Written by Mark Atwood mark@fallenpegasus.com.
Many thanks to Wotan LLC <http://wotanllc.com>, for supporting the development of these S3 tools.
Many thanks to the Amazon AWS engineers for developing S3.
SEE ALSO
These tools use the Net::Amazon::S3 Perl module.
The Amazon Simple Storage Service (S3) is documented at <http://aws.amazon.com/s3>.
perl v5.10.0 2009-03-08 S3LS(1p)