Could you quote your findings? I couldn't find anything to back that up in the man pages of bash or zsh (I didn't check other shells). Even then, some limit likely applies (simply because the available memory is finite), but I agree that this limit may well be larger than the one that usually applies to the command line.
Still, exploiting this limit might not be wise. I did some experimentation with this in bash and zsh, basically pasting the content of a huge file onto the command line. After a certain size my system started swapping (obviously because the shell tried to build the whole command line in memory, which makes sense) and effectively blocked the whole process (I killed it after a couple of minutes). I would conclude that for processing large amounts of data, a solution based on xargs is safer than command-line expansion, even if it is "only" a for loop.
Don Cragun has already provided you with several quotes and arguments that back up my assertions.
With regard to the memory usage argument: yes, there is ultimately a memory limit, because available memory is finite. One could of course add more memory, but this boils down to the classic programming trade-off: speed versus memory. One could equally say that the use of while read is problematic with a huge file because it is too slow. So with large files one will probably be better off using a utility rather than a shell script for the processing.
But at any rate there are no line length limitations.
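The xargs alternative mentioned in the quoted post can be sketched like this (the file is a small made-up stand-in for a huge input file):

```shell
# Create a small stand-in for a huge input file.
f=$(mktemp)
printf '%s\n' a b c d e > "$f"

# xargs reads the items and batches them onto command lines,
# here at most 2 arguments per invocation of echo, so no single
# command line ever has to hold the whole file.
out=$(xargs -n 2 echo < "$f")
echo "$out"

rm -f "$f"
```

Because xargs builds several shorter command lines instead of one enormous one, memory use stays bounded no matter how large the input file is.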
The problem with this kind of for loop construction is not so much memory usage; it lies more in the fact that the command substitution part ( `...` or $(...) ) needs to be unquoted for this to work, which makes it subject to interpretation by the shell, such as field splitting and wildcard expansion, and that can lead to unpredictable results.
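A small demonstration of that unpredictability (the file name and contents are made up for illustration): a line containing blanks is split into separate words, and a line containing * is expanded against the files in the current directory.

```shell
# Work in an empty scratch directory so the glob result is predictable.
tmpdir=$(mktemp -d)
cd "$tmpdir"

printf '%s\n' 'two words' '*' > list.txt

out=
for f in $(cat list.txt); do   # unquoted substitution: field splitting
  out="$out<$f>"               # and wildcard expansion both apply
done
echo "$out"                    # prints: <two><words><list.txt>
```

The loop sees three items, none of which is an original line of the file: "two words" is split in two, and "*" has been replaced by the files it matched.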
--
Quote:
Originally Posted by Don Cragun
[..]
I'm not saying you should never use xargs, but xargs has its own set of problems. And, I would never suggest using:
instead of:
(again with IFS set just for the read command to be just a <newline> character), which is much less memory intensive (if the file can be large), but produces the same results (unless the shell runs out of memory while gathering arguments for the for loop).
Hi Don, with the for loop being a shell construct, it would not work that way, since IFS cannot be set locally like that for a compound command. One would need to use something like this (with the remaining caveat of wildcard expansion):
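A sketch of what I mean (file name and contents are made up): the command substitution that captures the output is itself a subshell, so setting IFS to a lone newline inside it does not leak into the calling shell. Note that wildcard expansion would still apply to any line that contains glob characters.

```shell
printf '%s\n' 'first line' 'second line' > file

out=$(
  IFS='
'                                  # split on newlines only, inside a subshell
  for line in $(cat file); do      # still unquoted: globbing still applies
    printf '[%s]' "$line"
  done
)
echo "$out"                        # prints: [first line][second line]
```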
In the case of the while read construct, IFS can be set locally to the read command, but a newline in IFS would never be used, since the read command is line-based, so it is equivalent to:
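For comparison, a sketch of that while read form (again with made-up data): IFS is set to empty just for the read command, which preserves leading and trailing blanks in each line, and the quoted "$line" is never subject to field splitting or globbing.

```shell
printf '%s\n' '  padded line' 'an * asterisk' > file

out=
while IFS= read -r line; do   # IFS empty: keep leading/trailing blanks
  out="$out[$line]"           # -r: backslashes are not treated specially
done < file
echo "$out"                   # prints: [  padded line][an * asterisk]
```

Each line arrives in the loop exactly as it appears in the file, including the asterisk, which is why this form is the safer way to process arbitrary line-oriented data.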
--
Quote:
Originally Posted by MadeInGermany
Thanks, I have corrected my post to uniq.
Post #1 says it's ok if all values are the same. Perhaps it was not meant like that.
Also with uniq -u, you still need to sort first...