Shell Programming and Scripting: How to segregate a section from big file?
Post 303026649 by Peasant, 12-01-2018 02:44 AM
I suppose you are reading the following document:
AWS IP Address Ranges - Amazon Web Services

If we look at examples 5 and 6 in the documentation, they are quite close to your requirement.
You do not need to pipe the JSON parser's output to additional shell commands for further parsing; jq can do the selection and extraction on its own.
See if this is what you want:
Code:
curl https://ip-ranges.amazonaws.com/ip-ranges.json | jq '.prefixes[] | select (.region=="us-west-2") | .ip_prefix'


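If you need the bare CIDR blocks without the surrounding quotes, or want to narrow the list to a single service, here is a further sketch (the -r flag prints raw strings; filtering on the service field is an assumption about what you need):
Code:
curl -s https://ip-ranges.amazonaws.com/ip-ranges.json | jq -r '.prefixes[] | select(.region=="us-west-2" and .service=="EC2") | .ip_prefix'
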
This is my first time using the jq tool; the one used here is from the Debian repositories.
Code:
jq --version
jq-1.5-1-a5b5cbe

Regards
Peasant.

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

How to view a big file(143M big)

1. Thanks to everyone who reads this post. 2. I have a log file whose size is 143M; I cannot open it with vi, and I cannot open it with xedit either. How can I view it? And if I want to view lines 200-300, how can I do that? 3. Thanks (3 Replies)
Discussion started by: chenhao_no1
3 Replies
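A quick sketch for that question, assuming a plain-text log at a placeholder path: less pages through a large file without loading it all into memory, and sed can print just lines 200-300.
Code:
less /path/to/big.log
sed -n -e '200,300p' -e '300q' /path/to/big.log
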

2. Shell Programming and Scripting

cutting a section of a big file

Hi, I have a text file about 10 GB in size. Opening the file with vi takes forever... I am interested only in the first 100 records. Is there a way to copy those 100 lines to a new file (without needing to open the file)? Thanks (6 Replies)
Discussion started by: yoavbe
6 Replies
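For that question, head alone is enough; a sketch with placeholder file names, which reads only the start of the file and never opens it in an editor:
Code:
head -n 100 /path/to/bigfile > first100.txt
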

3. Shell Programming and Scripting

segregate the file based on matching patterns

print 'test' SETUSER 'dbo' go create proc abc as /Some code here/ go SETUSER go print 'test1' SETUSER 'dbo' go Create Procedure xyz as /some code here/ go SETUSER go print 'test2' SETUSER 'dbo' (2 Replies)
Discussion started by: mad_man12
2 Replies
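One hedged way to split that kind of file, assuming each section starts with a line beginning with print and that one output file per section is the goal (the file names here are made up):
Code:
awk '/^print /{n++} {print > ("section_" n ".sql")}' procs.sql
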

4. UNIX for Dummies Questions & Answers

How big is too big a config.log file?

I have a 5000 line config.log file with several "maybe" errors. Any recommendations on finding solvable problems? (2 Replies)
Discussion started by: NeedLotsofHelp
2 Replies

5. Shell Programming and Scripting

Extract section of file based on word in section

I have a list of servers in no particular order as follows: virtualMachines="IIBSBS IIBVICDMS01 IIBVICMA01" And I am generating some output from a pre-existing script that gives me the following (this is a sample output selection). 9/17/2010 8:00:05 PM: Normal backup using VDRBACKUPS... (2 Replies)
Discussion started by: jelloir
2 Replies
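A sketch for that question, on the guess that the report separates each machine's output with blank lines (the full layout is not shown, so this is an assumption):
Code:
for vm in $virtualMachines; do
    # paragraph mode: print any blank-line-delimited block mentioning this machine
    awk -v name="$vm" 'BEGIN{RS=""; ORS="\n\n"} $0 ~ name' backup_report.txt
done
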

6. Shell Programming and Scripting

Delete a section of a file if...

I have a file as below that has n sections: 2006 0101 1236 49.3 L 37.902 48.482 0.0 Teh 5 0.2 2.7LTeh 1 GAP=238 E Iranian Seismological Center, Institute of Geophysics, University of Tehran 6 ... (5 Replies)
Discussion started by: oreka18
5 Replies

7. Shell Programming and Scripting

Fetch a section from a file

Hi, I have a file like... $cat file1 +++++++++++++++++++ client1 +++++++++++++++++++++++++++++ col1 col2 col3 ------ ----- ----- (0 rows affected) ========================================================= +++++++++++++++++++ client1 +++++++++++++++++++++++++++++ col1 col2 col3... (6 Replies)
Discussion started by: sam05121988
6 Replies
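A possible approach for that question, assuming the +++ client1 +++ banner starts a block and a line of = signs ends it (both guesses from the excerpt):
Code:
awk '/^\+\+\+/ {p = ($0 ~ /client1/)} /^=+/ {p = 0} p' file1
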

8. Shell Programming and Scripting

How can i segregate?

I have this file which contains 91886,000,MiniC2-00,1.9.12,aML,en 91886,000,MiniC2-00,1.9.12,aML,en 91886,000,MiniC2-00,1.9.12,aML,en 91886,000,MiniC2-00,1.9.12,aML,en 91886,000,MiniC2-00,1.9.12,aML,en 91886,000,MiniC2-00,1.9.12,aML,en 91886,000,MiniC2-00,3.0,aML,en... (6 Replies)
Discussion started by: nikhil jain
6 Replies
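For that question, a sketch that splits the rows into one output file per value of the fourth (version) field; the post does not say which column matters, so the field choice is an assumption:
Code:
awk -F, '{print > ($4 ".csv")}' input.csv
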

9. Shell Programming and Scripting

Segregate by suffixed file names using Korn Shell

I have following files at /dir1 a.csv.20131201 b.csv.20131201 c.csv.20131201 d.csv.20131201 a.csv.20131202 b.csv.20131202 c.csv.20131202 d.csv.20131202 ....................... ....................... ....................... ....................... I need to move these files to... (4 Replies)
Discussion started by: JaisonJ
4 Replies
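A Korn shell sketch for that question; the destination is truncated in the post, so /dest below is a hypothetical directory and the files are simply grouped by their date suffix:
Code:
#!/bin/ksh
cd /dir1 || exit 1
for f in *.csv.*; do
    d=${f##*.}              # date suffix, e.g. 20131201
    mkdir -p "/dest/$d"     # /dest is a placeholder destination
    mv -- "$f" "/dest/$d/"
done
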

10. UNIX for Advanced & Expert Users

Segregate file content using sed backreference

I have some text like EU1BTDAT:ASSGNDD filename='$SEQFILES/SUNIA.PJ008202.CARDLIB/DATECARD' EU1BTDATEST:ASSGNDD filename='$SEQFILES/SUNIA.PJ008202.CARDLIB/DATECARD' EU1CLOSEDATES:ASSGNDD filename='$SEQFILES/SUNIA.PJ008202.CARDLIB/DATECARD' EU1DATED:ASSGNDD... (8 Replies)
Discussion started by: gotamp
8 Replies
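For that question, a hedged sed sketch that captures the label before the colon and the quoted filename as two backreferences (the post is truncated, so the desired output format is a guess):
Code:
sed -n "s/^\([^:]*\):ASSGNDD filename='\([^']*\)'.*/\1 \2/p" assign.txt
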
S3PUT(1p)                    User Contributed Perl Documentation                    S3PUT(1p)

NAME
       s3put - Write an S3 item

SYNOPSIS
       s3put [options] [ bucket/item ...]

       Options:
              --access-key    AWS Access Key ID
              --secret-key    AWS Secret Access Key

       Environment:
              AWS_ACCESS_KEY_ID
              AWS_ACCESS_KEY_SECRET

OPTIONS
       --help Print a brief help message and exits.

       --man  Prints the manual page and exits.

       --verbose
              Output what is being done as it is done.

       --access-key and --secret-key
              Specify the "AWS Access Key Identifiers" for the AWS account. --access-key is the "Access Key ID", and --secret-key is the
              "Secret Access Key". These are effectively the "username" and "password" to the AWS account, and should be kept confidential.
              The access keys MUST be specified, either via these command line parameters, or via the AWS_ACCESS_KEY_ID and
              AWS_ACCESS_KEY_SECRET environment variables. Specifying them on the command line overrides the environment variables.

       --secure
              Uses SSL/TLS HTTPS to communicate with the AWS service, instead of HTTP.

ENVIRONMENT VARIABLES
       AWS_ACCESS_KEY_ID and AWS_ACCESS_KEY_SECRET
              Specify the "AWS Access Key Identifiers" for the AWS account. AWS_ACCESS_KEY_ID contains the "Access Key ID", and
              AWS_ACCESS_KEY_SECRET contains the "Secret Access Key". These are effectively the "username" and "password" to the AWS
              service, and should be kept confidential. The access keys MUST be specified, either via these environment variables, or via
              the --access-key and --secret-key command line parameters. If the command line parameters are set, they override these
              environment variables.

CONFIGURATION FILE
       The configuration options will be read from the file "~/.s3-tools" if it exists. The format is the same as the command line options,
       with one option per line. For example, the file could contain:

              --access-key <AWS access key>
              --secret-key <AWS secret key>
              --secure

       This example configuration file would specify the AWS access keys and that a secure connection using HTTPS should be used for all
       communications.

DESCRIPTION
       Reads stdin, and writes it to an S3 item.
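       A minimal usage sketch based on the SYNOPSIS above; the bucket name, item name, and key values are placeholders. With the access
       keys exported in the environment, s3put copies its standard input to the named S3 item:

              export AWS_ACCESS_KEY_ID='<AWS access key>'
              export AWS_ACCESS_KEY_SECRET='<AWS secret key>'
              s3put --secure --verbose mybucket/notes.txt < notes.txt
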
BUGS
       Report bugs to Mark Atwood <mark@fallenpegasus.com>.

       Occasionally the S3 service will randomly fail for no externally apparent reason. When that happens, this tool should retry, with a
       delay and a backoff.

       Access to the S3 service can be authenticated with an X.509 certificate, instead of via the "AWS Access Key Identifiers". This tool
       should support that.

       It might be useful to be able to specify the "AWS Access Key Identifiers" in the user's "~/.netrc" file. This tool should support
       that.

       Errors and warnings are very "Perl-ish", and can be confusing.

       Trying to write to a bucket that does not exist or is not accessible by the user generates less than helpful error messages.

       Trying to put a bucket instead of an item is silently skipped.

TODO
       Option to read from files instead of stdin.
       Use the filesystem mtime to set the HTTP Last-Modified header.
       Option to read the filenames to read from, from stdin.
       Option to read from a tar file stream, for multiple items.
       Option to magically guess the MIME type.
       Option to use extended file attributes for metadata.
       Option to have a progress bar.

AUTHOR
       Written by Mark Atwood <mark@fallenpegasus.com>.

       Many thanks to Wotan LLC <http://wotanllc.com>, for supporting the development of these S3 tools.

       Many thanks to the Amazon AWS engineers for developing S3.

SEE ALSO
       These tools use the Net::Amazon::S3 Perl module.

       The Amazon Simple Storage Service (S3) is documented at <http://aws.amazon.com/s3>.

perl v5.10.0                              2009-03-08                              S3PUT(1p)