I wrote a shell script that lets me manage DNS records through an API.
The raw core command looks roughly like this:
The output contains a list of all zones with all their records and is about 800 kilobytes of JSON data.
I ran into a first issue when I used this (incorrect) attempt to remove a leading 000 from the 800K variable:
I think the correct expression should be ${output#000}. Maybe I typed something in without being fully aware of what I was doing. The resulting pattern caused the command to never finish; I assume matching it against the 800K variable required a huge amount of computation.
My question is:
Are there, in your experience, other general caveats why one should refrain from putting such large contents into variables? As far as I have read, there are no relevant size limits in Linux for variables (for data sizes < 100 MB). Or would it generally be better to use files for data above a certain size?
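As a minimal sketch of the expansion mentioned above (assuming bash; $output here is a short hypothetical stand-in for the 800K API response), the plain-prefix form removes a literal leading 000 cheaply, without pattern matching over the whole string:

```shell
# Hypothetical stand-in for the large API response.
output="000payload"

# ${output#000} strips the literal prefix "000" -- an inexpensive
# operation, unlike complex glob patterns matched against a very
# large variable.
trimmed=${output#000}
echo "$trimmed"   # -> payload
```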
We have EDP members who run test jobs on my system, but sometimes these processes write errors to the system log or to other files (usually the members don't notice that the log has grown to this level), which then makes the system crash. Could you suggest a way to prevent this problem?... (2 Replies)
I would like to edit a document which is a large file. I can't use the "vi" command because it runs out of memory.
$ vi large.dat
ex: 0602-101 Out of memory saving lines for undo.
Please help. Thanks. (2 Replies)
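One common workaround (a sketch; the file name and the "old"/"new" strings are placeholders) is to edit the file as a stream instead of loading it into vi's undo buffer:

```shell
# Small sample file standing in for large.dat.
printf 'old line 1\nold line 2\n' > large.dat

# Stream-edit with sed: only one line is held in memory at a time,
# unlike vi, which keeps the whole file plus undo history.
sed 's/old/new/g' large.dat > large.dat.tmp && mv large.dat.tmp large.dat
```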
Hi,
I have a large file with a repeating pattern in it. Now I want to split the file into blocks of the pattern, with a specified number of lines in each output file.
i.e. the file looks like
1...
2...
2...
3...
1...
2...
3...
1...
2...
2...
2...
2...
2...
3...
where 1 is the start of the block... (5 Replies)
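A minimal awk sketch for the block structure shown above, assuming each block starts with a line beginning with 1 (the input and output file names are placeholders):

```shell
# Small sample input matching the pattern in the post.
printf '1 a\n2 b\n3 c\n1 d\n2 e\n3 f\n' > input.txt

# Start a new numbered output file each time a line begins with "1";
# every line is written to the current block's file.
awk '/^1/ { if (out) close(out); n++; out = "block" n ".txt" } { print > out }' input.txt
```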
Hi everybody,
I am currently trying to develop a simple content management system with an internal website where my users can upload large files onto the server. The site is password protected and my users won't be trying to hack into the system, so security is a non-factor (at least for... (3 Replies)
Hi
I am new to shell scripting. I want to create a batch file which creates a desired number of files of a specific size, say 1 MB each, to consume space. How can I go about it using a for loop or any other loop in a shell script?
Thanks (3 Replies)
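A sketch using dd in a loop (the file names and the count of 5 are illustrative; adjust to the desired number and size):

```shell
# Create 5 files of exactly 1 MB (1024 * 1024 bytes) each.
for i in 1 2 3 4 5; do
    dd if=/dev/zero of="file$i.dat" bs=1024 count=1024 2>/dev/null
done
```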
We're running Solaris 7 on an FDDI network on an E6500 host and wish to use an MTU (packet size) > 1500, more like 3072 bytes to begin with and possibly up to 4096 bytes.
Linux has /etc/network/interfaces. Does anyone remember the equivalent in Unix? When I do ifconfig eth0 mtu 4000, I get the error... (0 Replies)
Greetings, I'm stuck in a time warp using ancient machines from the prehistoric era that should be rightfully displayed in the Smithsonian.
We're running Solaris 7 on an FDDI network on an E6500 host and wish to use an MTU (packet size) > 1500, more like 3072 bytes to begin with and possibly up to 4096... (9 Replies)
Hi,
I have a directory that I used as a working directory for a program. At the end of the procedure, the contents are deleted. When I do an ls -l, this directory still appears to take up some space. After a little research, I've seen on another board of this forum that it's not really taking... (5 Replies)
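A small sketch illustrating the distinction (the directory name is a placeholder): ls -ld reports the size of the directory entry itself, which on many filesystems does not shrink after files are removed, while du -s reports the space used by the current contents.

```shell
# Create a working directory, populate it, then delete the contents.
mkdir -p workdir
touch workdir/a workdir/b
rm workdir/a workdir/b

ls -ld workdir   # size of the directory inode itself (may stay enlarged)
du -s workdir    # space actually used by the (now empty) contents
```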
Hi All,
This is my first post here. Hoping to share and gain knowledge from this great forum!
I've scanned this forum before posting my problem here, but I'm afraid I couldn't find any thread that addresses this exact problem.
I'm trying to split a large XML file (with multiple tag... (7 Replies)
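One common approach for this kind of job (a sketch; since the post is truncated, the <record> tag and file names here are placeholders) is csplit, which starts a new output file at each occurrence of the repeating tag:

```shell
# Small sample file standing in for the large XML.
printf '<records>\n<record>1</record>\n<record>2</record>\n</records>\n' > large.xml

# Split at each line containing "<record>": part00 holds everything
# before the first match, part01 and part02 hold the record blocks.
# '{*}' repeats the pattern as often as it matches (GNU csplit).
csplit -s -f part large.xml '/<record>/' '{*}'
```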
I wanted to know: what is the best way to query JSON-formatted files for content? Example data:
https://usn.ubuntu.com/usn-db/database-all.json.bz2
When looking at keys as in:
import json
json_data = json.load(open('database-all.json'))
for keys in json_data.iterkeys():
    print 'Keys--> {}... (0 Replies)
Discussion started by: metallica1973
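As a sketch of a Python 3 equivalent callable from the shell (the iterkeys() call in the snippet above is Python 2 only; the small JSON file here is an illustrative stand-in for the real database):

```shell
# Minimal sample JSON standing in for database-all.json.
printf '{"USN-1": {}, "USN-2": {}}' > database-all.json

# List the top-level keys using python3's json module.
python3 -c 'import json; [print(k) for k in json.load(open("database-all.json"))]'
```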
funtbl(1)                         SAORD Documentation                        funtbl(1)

NAME
funtbl - extract a table from Funtools ASCII output
SYNOPSIS
funtbl [-c cols] [-h] [-n table] [-p prog] [-s sep] [-T] <iname>
DESCRIPTION
[NB: This program has been deprecated in favor of the ASCII text processing support in funtools. You can now perform fundisp on funtools
ASCII output files (specifying the table using bracket notation) to extract tables and columns.]
The funtbl script extracts a specified table (without the header and comments) from a funtools ASCII output file and writes the result to
the standard output. The first non-switch argument is the ASCII input file name (i.e. the saved output from funcnts, fundisp, funhist,
etc.). If no filename is specified, stdin is read. The -n switch specifies which table (starting from 1) to extract. The default is to
extract the first table. The -c switch is a space-delimited list of column numbers to output, e.g. -c "1 3 5" will extract the first
three odd-numbered columns. The default is to extract all columns. The -s switch specifies the separator string to put between columns.
The default is a single space. The -h switch specifies that column names should be added in a header line before the data is output. Without
the switch, no header is prepended. The -p program switch allows you to specify an awk-like program to run instead of the default
(which is host-specific and is determined at build time). The -T switch will output the data in rdb format (i.e., with a 2-row header of
column names and dashes, and with data columns separated by tabs). The -help switch will print out a message describing program usage.
For example, consider the output from the following funcnts command:
[sh] funcnts -sr snr.ev "ann 512 512 0 9 n=3"
# source
# data file: /proj/rd/data/snr.ev
# arcsec/pixel: 8
# background
# constant value: 0.000000
# column units
# area: arcsec**2
# surf_bri: cnts/arcsec**2
# surf_err: cnts/arcsec**2
# summed background-subtracted results
upto net_counts error background berror area surf_bri surf_err
---- ------------ --------- ------------ --------- --------- --------- ---------
1 147.000 12.124 0.000 0.000 1600.00 0.092 0.008
2 625.000 25.000 0.000 0.000 6976.00 0.090 0.004
3 1442.000 37.974 0.000 0.000 15936.00 0.090 0.002
# background-subtracted results
reg net_counts error background berror area surf_bri surf_err
---- ------------ --------- ------------ --------- --------- --------- ---------
1 147.000 12.124 0.000 0.000 1600.00 0.092 0.008
2 478.000 21.863 0.000 0.000 5376.00 0.089 0.004
3 817.000 28.583 0.000 0.000 8960.00 0.091 0.003
# the following source and background components were used:
source_region(s)
----------------
ann 512 512 0 9 n=3
reg counts pixels sumcnts sumpix
---- ------------ --------- ------------ ---------
1 147.000 25 147.000 25
2 478.000 84 625.000 109
3 817.000 140 1442.000 249
There are four tables in this output. To extract the last one, you can execute:
[sh] funcnts -s snr.ev "ann 512 512 0 9 n=3" | funtbl -n 4
1 147.000 25 147.000 25
2 478.000 84 625.000 109
3 817.000 140 1442.000 249
Note that the output has been re-formatted so that only a single space separates each column, with no extraneous header or comment information.
To extract only columns 1,2, and 4 from the last example (but with a header prepended and tabs between columns), you can execute:
[sh] funcnts -s snr.ev "ann 512 512 0 9 n=3" | funtbl -c "1 2 4" -h -n 4 -s " "
#reg counts sumcnts
1 147.000 147.000
2 478.000 625.000
3 817.000 1442.000
Of course, if the output has previously been saved in a file named foo.out, the same result can be obtained by executing:
[sh] funtbl -c "1 2 4" -h -n 4 -s " " foo.out
#reg counts sumcnts
1 147.000 147.000
2 478.000 625.000
3 817.000 1442.000
SEE ALSO
See funtools(7) for a list of Funtools help pages
version 1.4.2 January 2, 2008 funtbl(1)