I have the awk code below.
I have an input file containing a line whose last field is about 4000 characters long, and it pops up the error shown below. Is that too much for awk to take?
Error: $nawk -f getfull_line_awk myinputfile
nawk: input record `0017c C12efusecntl0 ...' too long
input record number 2795, file myinputfile source line number 1
If you're going to be a UNIX/Linux scripting professional, you're going to encounter lots of tools which were written decades ago with hard-coded limits that, by today's standards, are pathetic.
Your choices often boil down to: find a work-around or install more modern and less limited tools (such as the GNU utilities). Sometimes the work-around may entail writing a small utility of your own, in C.
As noted elsewhere in this thread you could install the GNU version of awk (called gawk) or you could use Perl. Both are written to avoid hard-coded limits wherever possible: they dynamically allocate and re-size memory, and they use data types suited to the underlying OS for numeric limits (so some things may be limited to 32-bit values on a 32-bit platform, etc.).
Other work-arounds depend quite a bit on the problem at hand.
For example, in your code:
Quote:
It might be possible for you to use grep to filter out all of the lines that start with ** (a literal pair of asterisk characters) and feed just those through your (limited version of) awk. Obviously that will only work if the lines beginning with ** are smaller than this 4000-character limitation, and assuming that your version of grep can cope with longer lines.
Similar workarounds might be possible by prefiltering with sed or cut (if the stuff you cut out of the long lines isn't needed in the results), etc.
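For instance, a prefilter along those lines might look like this (file contents here are made up for illustration). Note that the asterisks must be escaped: in a regular expression an unescaped ** means "zero or more asterisks", which matches every line.

```shell
# Build a tiny stand-in input file (hypothetical data).
printf '%s\n' '** short record one' 'some other line' '** short record two' > myinputfile

# grep '**' would match every line; grep '^\*\*' keeps only lines
# that literally begin with two asterisks.
grep '^\*\*' myinputfile > filtered
cat filtered
```

The `filtered` file could then be fed to the length-limited nawk, assuming the `**` lines themselves fit within its record limit.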
The first option, installing a newer version of awk, is out because I cannot administer/control the whole system.
I tried grep "**" myinputfile, but it seems to produce the whole input file again, which is very strange.
Can anybody provide me with Perl code that has a similar function to my awk code?
Hi All,
I've got a few questions...
1) What is the maximum number of files that we can save under a single directory in AIX? (* we have enough storage/disk space)
2) And what is the maximum number of sub-directories inside a directory?
I know that...every directory is a (special)... (11 Replies)
What's wrong with this addition?
What's the maximum number of digits that can be handled?
pandeeswaran@ubuntu:~/Downloads$ const=201234454654768979799999
pandeeswaran@ubuntu:~/Downloads$ let new+=const
pandeeswaran@ubuntu:~/Downloads$ echo $new
-2152890657037557890
pandeeswaran@ubuntu:~/Downloads$ (4 Replies)
Hi all,
does anyone know if there is any limitation on the rm command?
Limitation refers here to size.
Ex: when my script tries to run the rm command on nearly 20-22 GB of files, CPU load gets high?
If anyone knows the relation between CPU load and the limitation of the rm command. (8 Replies)
Hello
first, truth be told, I'm not even close to being an advanced user. I'm posting here because maybe my question is complicated enough to need your expert help.
I need to use awk (or nawk - I don't have gawk) to validate some files by computing the total sum for a large numeric variable.
It... (1 Reply)
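A minimal sketch of that kind of validation (file name and field position are made up here; printf "%.0f" is used because plain print switches to exponent notation for large totals in many awk versions):

```shell
# Hypothetical input: one numeric value per line.
printf '%s\n' '100' '200' '300' > nums.txt

# Sum field 1 of every line; the same script works in awk and nawk.
awk '{ s += $1 } END { printf "%.0f\n", s }' nums.txt
# prints 600
```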
Is there a size limit when passing an argument using wildcards? I.e., when I pass an argument in the form (like) "ftp_auto *.txt", is there a limitation on the size of UNIX expanding "*.txt"? (1 Reply)
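There is such a limit, though it applies to the total length of the expanded argument list (plus the environment), not to any single pattern; POSIX systems expose it as ARG_MAX:

```shell
# Largest combined size (in bytes) of arguments + environment that
# exec() will accept on this system:
getconf ARG_MAX
```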
Hello,
I am looking for a way to get around an issue, as I am using the grep command in a very common situation:
grep ^50 File.*.txt | "some awk process"
My problem is that bash throws me an error on the grep command if the directory in question contains several thousand files.
... (6 Replies)
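One common workaround (sketched here with made-up file names) is to let find hand the file names to grep in batches, so the shell never has to expand the huge glob on a single command line:

```shell
# Hypothetical setup mimicking a directory of File.*.txt files.
mkdir -p demo_dir
printf '50 alpha\n40 beta\n' > demo_dir/File.1.txt

# -exec ... {} + batches arguments below the system limit;
# -h suppresses file-name prefixes so output matches plain grep.
find demo_dir -name 'File.*.txt' -exec grep -h '^50' {} +
# prints: 50 alpha
```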
Hi,
I'm having a problem with a while loop that doesn't seem to loop correctly.
TODAY=`date +%d%m%Y`
while read hostname
#for hostname in $(cat $CONFIG)
do
OUTFILE=/tmp/health_check.$hostname.$TODAY
if
then
touch $OUTFILE
func_header
else
rm $OUTFILE
... (2 Replies)
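The usual cause of a "while read" loop that runs zero times is a missing input redirection on "done": without done < "$CONFIG" the loop reads from the terminal, not the file. A sketch of the likely intent follows; the config file name, its contents, and the if condition are all guesses, since the snippet's test was lost.

```shell
CONFIG=hosts.txt                       # hypothetical config file
printf 'hostA\nhostB\n' > "$CONFIG"
TODAY=$(date +%d%m%Y)

while read -r hostname; do
    OUTFILE=/tmp/health_check.$hostname.$TODAY
    if [ ! -f "$OUTFILE" ]; then       # guessed condition
        touch "$OUTFILE"
    else
        rm "$OUTFILE"
    fi
done < "$CONFIG"                       # the redirection the snippet lacks
```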
Hi,
I am using an alias to get the file count from one directory, using a normal ls command like ls file* | wc -l. If my file count increases beyond 35,000, my alias stops working. It shows "arg list too long".
Is that a limitation of ls, or a problem with the alias?
I would appreciate if anyone can... (2 Replies)
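The "arg list too long" comes from the shell expanding file* into one enormous argument list for ls, not from ls itself. Counting without putting every name on a command line avoids it; a sketch with made-up file names:

```shell
# Hypothetical directory with matching files.
mkdir -p bigdir
touch bigdir/file1 bigdir/file2 bigdir/file3

# find prints one name per line without building a giant argv:
find bigdir -name 'file*' | wc -l
```

Here the count is 3; with 35,000 files the pipeline works just the same.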
Hi All,
Can anyone please clarify me the following questions:
1. Is there any file size limitation in HP-UX 11i, such that I can only create files up to a certain size (say 2 GB) and not more than that?
2. At most, how many files can we keep inside a folder?
3. How many... (2 Replies)
Hi ,
I'm trying to use the "find" command with the "-size" option, but I encounter a 2 GB file limitation.
Can you confirm this limitation?
Is there a simple way to do the same thing ?
My command is :
<clazz01g-notes01>/base/base01 # find /base/base01 -name '*.nsf' -size +5242880000c -exec ls... (2 Replies)