Large Variable content size caveats?


# 1  
Old 2 Weeks Ago

Hi,

I wrote a shell script which lets me manage DNS records through an API.

The core command looks roughly like this:

Code:
output="$(curl -X GET https://mgt.myserver.de:8081/api/v1/servers/localhost/zones)"

The output contains a list of all zones with all their records and is about 800 kilobytes of JSON data.

I ran into a first issue with this (incorrect) attempt to remove a leading 000 from the 800K variable:

Code:
# remove leading 000 from output
output="${output#000*}"

I think the correct expression would have been ${output#000}. Maybe I typed this in without being completely aware of what I was doing. The resulting pattern caused the command to never finish. I assume matching that pattern against the 800K variable required a huge amount of computation.
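
For illustration, here is how the pattern variants behave on a small sample value (a hypothetical string, just to show the semantics):

Code:
s="000abc000def"
echo "${s#000}"    # removes the literal prefix: abc000def
echo "${s#000*}"   # '*' may match nothing, so the shortest match is still 000: abc000def
echo "${s##000*}"  # longest match swallows everything: prints an empty line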

My question is:

Are there, in your experience, other general caveats why one should refrain from using such large variable contents? As far as I have read, there are no relevant size limits for variables on Linux (for data sizes below 100 MB). Or would it generally be better to use files for data above a certain size?
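
For comparison, the file-based alternative would look roughly like this (just a sketch; the jq filter is a placeholder):

Code:
# store the API response in a temporary file instead of a variable
tmp="$(mktemp)"
curl -s https://mgt.myserver.de:8081/api/v1/servers/localhost/zones -o "$tmp"
jq -r '.[].name' "$tmp"   # placeholder extraction: list the zone names
rm -f "$tmp"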

Environment: Linux, bash

# 2  
Old 2 Weeks Ago
I always thought the size of a variable was limited by the stack size...
# 3  
Old 2 Weeks Ago
Quote:
Originally Posted by vbe

I always thought the size of a variable was limited by the stack size...
That may be true.

Code:
ulimit -a
core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 773193
max locked memory       (kbytes, -l) 64
max memory size         (kbytes, -m) unlimited
open files                      (-n) 1024
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 773193
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited
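
That said, bash appears to store variable contents on the heap rather than on the stack, so a quick test like the following (a sketch, not verified on every platform) can build a string well past the 8 MB stack limit shown above:

Code:
# build a ~16 MB string, twice the 8192 kB stack limit
big="$(head -c 16000000 /dev/zero | tr '\0' x)"
echo "${#big}"    # prints 16000000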

# 4  
Old 2 Weeks Ago
Main caveat: it's almost always a dumb idea. Ad-hoc data, "dump-and-fix-later", is the modern UUOC (useless use of cat): wasteful and pointless. If you know what you actually want, you can either deal with it in a structured way or avoid storing it entirely.

Another caveat: most shells don't let you keep binary data in variables, so mass-dumping unchecked data into them can occasionally surprise you.
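
A minimal sketch of that kind of surprise, assuming bash:

Code:
# bash command substitution silently drops NUL bytes
data="$(printf 'a\0b')"
echo "${#data}"    # prints 2, not 3 -- the NUL byte is gone
# (recent bash versions also print a warning about the ignored null byte)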
# 5  
Old 2 Weeks Ago
Hi Corona,

thanks for your answer, even if the words weren't very nice.

What I take from it: beware of what unknown data can do in the worst case, especially in a heavily interpreted shell environment.

---

The approach was not, as you assumed, to dump the full data out of the API, keep it locally, and pull the needed bits out of it. In short, looking at the API, it was the only way to get what I needed. Digging deeper into the API documentation has since revealed other, more efficient ways to do the same.

Maximum efficiency was not my goal. I started with a shell script, and at about half completion I realized the task might be too complex for a shell script and that a scripting language would have been the better choice. Now I have a 500-line Bash script that is fairly well structured and works quite well. I will not rewrite it unless there are serious issues.

My intention with this question was to get feedback on whether there are any serious problems I was not aware of.

Since all of that data is JSON which gets fed into jq for extraction, I do not see much potential for trouble ahead.
# 6  
Old 1 Week Ago
Quote:
Originally Posted by stomp

Since all of that data is JSON which gets fed into jq for extraction, I do not see much potential for trouble ahead.
If your data is JSON, why are you processing it with curl and shell scripts when there are tools better suited to processing JSON data?

I process a lot of JSON data on Linux and never use a shell script for it. There are many other languages, libraries, and tools built to process JSON. Why use a tool that is suboptimal for the job?

Just curious ...

As I said, I process reams of JSON data on Linux and do not use shell scripts to process any JSON objects.
# 7  
Old 1 Week Ago
Hi Neo,

basically it boils down to jq. Without it, I would never have considered doing this task with bash. All JSON handling is done with jq; Bash is just the code around it.

Even if I might have decided differently in hindsight, because the task got a little more complex than expected, I think I got it done very quickly. It works more than sufficiently well, the code is maintainable, and the speed is acceptable. For smaller tasks I still consider this an excellent choice.

Example: Get an IP address from zone data

Assuming this zone data...

JSON zone data from the PowerDNS API (GitHub)

...this would be the command to extract the plain IP address of the A record "ftp.bla.com.":

Code:
jq -r '.rrsets[] | select(.name=="ftp.bla.com." and .type=="A") | .records[0].content' <<<"$json"

# output
1.2.3.5
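
The same approach generalizes; for example, to list every A record with its first address (same assumed zone structure):

Code:
jq -r '.rrsets[] | select(.type=="A") | "\(.name) \(.records[0].content)"' <<<"$json"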

