This question has been asked many times, but my problem is slightly different.
In my shell script I connect to an Oracle database and load the results into a .dat file. This .dat file is later used to create an .xls file (to be sent to the users). Sometimes the size of the .dat file grows beyond 120000 bytes. In these cases, sed and the other commands used in the script fail with a "cannot execute [Argument list too long]" error.
I tried splitting the .dat file into smaller files, but I am not even able to split the file; it throws the same [Argument list too long] error.
The problem is not just with sed, but with all the commands used in the shell script.
What does ls -l *.dat return? The same error? If so, that's the number of files, not their size - look at using find and possibly xargs. E.g. something like:
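(The command that followed "something like:" was lost with the code tags; the sketch below is an assumed reconstruction of the find/xargs approach the reply points at, using a throwaway temporary directory so it can be tried as-is.)

```shell
# Sample directory so the commands can be run as-is; the real .dat
# location from the thread is unknown.
dir=$(mktemp -d)
printf 'a\n\nb\n' > "$dir/one.dat"

# Let find enumerate the files and xargs batch them under ARG_MAX,
# instead of letting the shell expand *.dat itself.
find "$dir" -maxdepth 1 -name '*.dat' -print0 | xargs -0 sed '/^$/d'

# Or have find run the command itself, batching arguments with '+':
find "$dir" -maxdepth 1 -name '*.dat' -exec sed '/^$/d' {} +
```

Either form keeps the file list out of the shell's argument buffer, which is what overflows when the glob matches too many files.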
Incidentally, sed '/^$/d' *.dat won't do anything to the actual files (unless you've aliased sed). Is that your intention?
Because the kernel's ARG_MAX value is limited, and the asterisk expands to all matching files, directories, links, or any other entries,
your shell's command argument buffer (the command-line length) fills up and then overflows.
You can use:
* a for loop
* a while loop
* or any other scriptable method
* splitting the arguments into manageable chunks
* xargs (splitting the arguments with find is one good method)
* ls (use the folder hierarchy with asterisks)
* find (use the folder hierarchy without asterisks)
For example:
# (use a while loop and read the arguments one by one (-n1))
# (use spaces instead of newlines with xargs, and use getdents via find to get the file entries)
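(The original example code did not survive; the sketch below reconstructs the two approaches the comments describe. The directory and the blank-line-stripping sed expression are assumptions for illustration.)

```shell
# Sample data so the sketch is runnable end to end; the real .dat
# directory in the thread is unknown.
dir=$(mktemp -d)
printf 'line1\n\nline2\n' > "$dir/sample.dat"

# Approach 1: a while loop reads one filename per line from find,
# so the file list never enters the shell's argument buffer.
find "$dir" -maxdepth 1 -name '*.dat' |
while read -r f
do
    sed '/^$/d' "$f" > "$f.clean"   # strip blank lines into a new file
done

# Approach 2: xargs batches the names itself; -n1 runs the command
# once per file when each file must be processed separately.
find "$dir" -maxdepth 1 -name '*.dat' -print0 | xargs -0 -n1 sed '/^$/d'
```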
And consider @CarloM's note:
your sed command's default output goes to stdout (usually your terminal - tty, pts, console, ...),
so you must redirect to another file to save the changes, or use the '-i' option, if you have GNU sed, to save to the same file directly.
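(A minimal sketch of both options; the file here is a temporary one created just for the demo.)

```shell
f=$(mktemp)
printf 'keep\n\nkeep too\n' > "$f"

# Option 1: redirect to a new file, then move it over the original.
sed '/^$/d' "$f" > "$f.new" && mv "$f.new" "$f"

# Option 2 (GNU sed only): edit the file in place with -i.
sed -i 's/keep/kept/' "$f"
cat "$f"
```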
Hi ygemici, I am trying to implement your suggestions and I have made a couple of changes to my script to use a while loop and read from the file.
But I am stuck in the below section:
The above section basically echoes the contents of the file and sets up the Excel file to be created. Any suggestions on this?
Regards,
Hi galaxy_rocky,
If you can post the input file (or a sample) and the desired output file, maybe I can help more.
Firstly, I don't understand exactly what the desired output is. (What is your $1F - is it a field from your input lines plus "F"?)
$1F is the first field from the first line of the input file. Similarly, $2F is the second field from the first line of the input file. This should loop for all the lines in the input file. In total there are 13 fields in the input file. That is the reason I am using 13 awk commands!
My sample input file looks like below:
And my sample expected output is :
Basically the fields in the input file are separated by ";". Based on the number of records in the input file, there should be corresponding entries in the output file.
Please help with this issue.
Regards,
Last edited by galaxy_rocky; 04-16-2014 at 07:56 AM.
Reason: code tag
If every field is treated the same then you can do the Cell tags fairly simply in one awk command. Something like:
EDIT: Actually, you can do away with the while as well:
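(The actual commands were lost with the code tags; the awk sketch below is an assumed reconstruction of the single-command approach: it reads the ";"-separated input and emits one Cell tag per field. The SpreadsheetML-style wrapping is a guess, not the thread's exact tags.)

```shell
# Sample ";"-separated input like the one posted in the thread.
in=$(mktemp)
printf 'a;b;c\nd;e;f\n' > "$in"

# One awk pass instead of 13 separate awk commands: every record
# becomes a <Row>, every field within it a <Cell>.
awk -F';' '{
    print "<Row>"
    for (i = 1; i <= NF; i++)
        printf "  <Cell><Data ss:Type=\"String\">%s</Data></Cell>\n", $i
    print "</Row>"
}' "$in"
```

Because awk loops over NF itself, the same command works for any field count, so the 13-field assumption no longer has to be hard-coded.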