Posted by stomp in Shell Programming and Scripting on 03-08-2019, 05:15 AM
Large Variable content size caveats?

Hi,

I wrote a shell script that lets me manage DNS records through an API.

The raw core command looks roughly like this:

Code:
output="$(curl -X GET https://mgt.myserver.de:8081/api/v1/servers/localhost/zones)"

The output contains a list of all zones with all their records, about 800 kilobytes of JSON data.

I ran into a first issue with this (incorrect) attempt to remove a leading 000 from the 800 KB variable:

Code:
# remove leading 000 from output
output="${output#000*}"

I think the correct expression is ${output#000}. The trailing * probably crept in without my being fully aware of what I was doing. The resulting glob pattern caused the command to never finish; I assume matching that pattern against the 800 KB variable required a huge amount of computation.
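
For illustration, here is the corrected expansion next to the buggy one (a minimal sketch; the explanation of the cost is my assumption about how bash matches the pattern):

Code:
# correct: strip exactly the literal prefix "000" -- no wildcard,
# so the shell does a single fixed-length comparison
output="${output#000}"

# buggy: the trailing * turns the pattern into a glob; ${var#pattern}
# uses shortest-match, so if the value does not actually start with 000
# the shell may end up testing the glob against ever-longer prefixes of
# the 800 KB string, which gets extremely slow
#output="${output#000*}"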

My question is:

Are there, in your experience, other general caveats that speak against putting such large content into shell variables? From what I have read, there are no relevant size limits for variables on Linux (at least for data below 100 MB). Or is it generally better to switch to files once the data exceeds a certain size?
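
For comparison, this is roughly what the file-based variant would look like (just a sketch; the temp-file handling and the jq call are assumptions, not part of my current script):

Code:
# write the API response to a temporary file instead of a variable
tmpfile="$(mktemp)"
trap 'rm -f "$tmpfile"' EXIT

curl -s -X GET https://mgt.myserver.de:8081/api/v1/servers/localhost/zones \
     -o "$tmpfile"

# process the JSON from the file, e.g. list the zone names
# (assuming the endpoint returns a JSON array of zone objects)
jq -r '.[].name' "$tmpfile"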

Environment: Linux, bash

Last edited by stomp; 03-08-2019 at 06:31 AM..
 
