Large Variable content size caveats?


 
# 8  
Old 03-11-2019
Quote:
Originally Posted by stomp
Basically it boils down to jq. Without it I would never have considered doing this task with bash. All JSON handling is done with jq; bash is the code around it.
Thanks for explaining.

Maybe move away from jq and use something more mainstream for processing JSON?

What are you planning to do with the JSON object after you download and process it?

Push it into a MySQL DB? Push it to Firebase? Save it to a flat file on your box? Push it to another server?
# 9  
Old 03-11-2019
Quote:
What are you planning to do with the JSON object after you download and process it?

JSON is the result data format of the PowerDNS API.

I'm using the API to query/add/modify/delete DNS records - not to store the retrieved data.

The script is a command-line tool that's used for various purposes (creating/deleting records for Let's Encrypt certificate validation, automatic configuration of MX record changes, setup of e-mail autoconfiguration, ...).
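
To give an idea of what the query side looks like in practice, here is a minimal sketch. The host, zone and API key are placeholders, and I'm assuming PowerDNS's usual /api/v1 endpoints with X-API-Key authentication - not the actual script:

Code:
#!/bin/bash
# Hedged sketch of the query side of such a tool (placeholder values).
API="http://127.0.0.1:8081/api/v1/servers/localhost"
KEY="changeme"
ZONE="example.org."

# Fetch the whole zone as JSON. The response can be large, so it goes
# straight into a shell variable and from there into jq.
zone_json=$(curl -s -H "X-API-Key: $KEY" "$API/zones/$ZONE")

# All JSON handling happens in jq: list name, type and content
# of every record as tab-separated output.
jq -r '.rrsets[] | .name as $n | .type as $t
       | .records[] | [$n, $t, .content] | @tsv' <<<"$zone_json"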

Quote:
Maybe move away from jq and use something more mainstream for processing JSON?
I have no plans to do so, since it's working fine.
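
To illustrate why it feels sufficient: in a typical record change, all JSON construction stays inside jq, and bash only moves the result around. A rough sketch (zone, name, token value and key are placeholders, and I'm again assuming the standard /api/v1 PATCH endpoint):

Code:
#!/bin/bash
# Hedged sketch: create/replace a TXT record via the PowerDNS API,
# e.g. for a Let's Encrypt dns-01 challenge (placeholder values).
API="http://127.0.0.1:8081/api/v1/servers/localhost"
KEY="changeme"
ZONE="example.org."

# jq -n builds the request body from scratch, so no hand-quoted
# JSON ever lives in the shell code. Note TXT content must carry
# its own quotes in PowerDNS.
payload=$(jq -n \
  --arg name "_acme-challenge.example.org." \
  --arg content "\"token-value-from-acme\"" \
  '{rrsets: [{name: $name, type: "TXT", ttl: 60,
              changetype: "REPLACE",
              records: [{content: $content, disabled: false}]}]}')

curl -s -X PATCH \
  -H "X-API-Key: $KEY" \
  -H "Content-Type: application/json" \
  --data "$payload" \
  "$API/zones/$ZONE"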

# 10  
Old 03-11-2019
Yes, I understand working with JSON. I do it nearly daily (read, modify, update, write, between client and server).

FWIW,

If I were going to use the same API to modify and update JSON as you are doing, I would write a quick app in PHP (if I did not need any UI), or I would develop it with Vue.js if I needed a web UI.

PHP processes JSON much more naturally than bash and can update it just as easily, with very simple tools for pulling and pushing data over the net built in, obviously.

Vue.js with extensions like Axios and Vuex is so rich in features for processing JSON across the net that comparing it to bash would be like comparing the Starship Enterprise to a flat-bottomed wooden boat (at worst) or, at best (and being generous), an antique car.

Anyway, I realize a lot of people like to use these old tools from decades past and build an infrastructure around them to make things work; but honestly, when working with JSON across the net as you are doing, there are much better tools than command-line jq, curl, wget and bash, I promise.

Anyway, I think I understand the reason you are using bash, curl and jq: you are comfortable with those tools, and that's cool too. I used to use those tools (excessively) between 15 and 5 years ago, so I understand, and I might use them again if I were forced to. However, I have noticed a seismic shift in JSON processing, including the use of Firebase and JSON-based NoSQL repos.

PS: At least use Postman to analyze your API calls. Postman is one of the most productivity-enhancing tools out there when working with JSON APIs.