Search Results

Search: Posts Made By: cokedude
10,316
Posted By MadeInGermany
If you want to look for filenames that begin with the value of the i variable:
find / -type f -name "$i*" -exec ls -l {} +
Within the "quotes" the shell expands $i but does not evaluate the *...
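A runnable sketch of that quoting rule, using made-up filenames in a throwaway directory (the `report*` names are assumptions, not from the post):

```shell
#!/bin/sh
# Inside the double quotes the shell expands $i, but the * is NOT
# globbed by the shell; it reaches find intact as a pattern character.
d=$(mktemp -d)
touch "$d/report1.txt" "$d/report2.txt" "$d/notes.txt"
i=report
find "$d" -type f -name "$i*" -exec ls -l {} +
rm -rf "$d"
```

Only the two `report*` files are listed; `notes.txt` does not match the pattern.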
10,316
Posted By MadeInGermany
1. and 4. look okay.
This loop method is very slow: for each item it needs to scan all available files...
Turn on debug mode with set -x (turn off with set +x)
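A minimal sketch of the tracing toggle (the echoed strings are arbitrary examples):

```shell
#!/bin/sh
# set -x echoes each command to stderr (prefixed with PS4, default "+ ")
# before running it; "set +x", with a space, turns tracing off again.
set -x
i=hello
echo "$i world"
set +x
echo "tracing is off here"
```

The trace lines go to stderr, so normal output on stdout stays clean.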
10,316
Posted By RudiC
is not too good a starting point for an analysis.



Yes - start with a decent request - details like misbehaviour, error messages, file / directory structures, input data, execution logs, ...
25,802
Posted By MadeInGermany
The point is: set a real limit.
"Unlimted" means a process is allowed to consume all system resources. And a buggy/wild/looping process could do so.
25,802
Posted By vbe
Greetings and Happy New Year!

You can't judge like that what is good/bad...
Oracle is no ordinary user: many users access files under the oracle UID, and the same goes for processes, etc.
So it will depend...
25,802
Posted By zxmaus
They both do exactly the same thing - there is no better or worse :)
btw - it is almost always a bad idea to increase the hard limits for a non-root user ... and to set nofiles to unlimited
10,653
Posted By Chubler_XL
Ensure the dollar variable to be expanded is outside of the single quotes, e.g.:
echo 'File not found in '"$PWD"', please re-enter'

In your script you would do:

for i in `cat file1`
do
...
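A runnable sketch of the quoting rule above (the message text is taken from the post; the directory is whatever $PWD happens to be):

```shell
#!/bin/sh
# The two single-quoted pieces are literal; the double-quoted "$PWD"
# between them is expanded by the shell before echo ever runs.
echo 'File not found in '"$PWD"', please re-enter'
```

Had the whole string been single-quoted, the literal text `$PWD` would have been printed instead of the directory name.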
9,898
Posted By RudiC
Try also
tr -s ', ' $'\n' < file
1
2
3
4
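A self-contained sketch of that tr pipeline, feeding it the sample data inline rather than from a file; note tr understands the `\n` escape itself, so a plain `'\n'` works even in shells without `$'...'`:

```shell
#!/bin/sh
# Set1 is "comma and space", set2 is padded to newline for both, and -s
# squeezes runs of the resulting newlines into one, so "1, 2" yields a
# single line break, not two.
printf '1, 2, 3, 4\n' | tr -s ', ' '\n'
```

Output is the four numbers, one per line.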
9,898
Posted By MadeInGermany
\n being a newline is specific to GNU sed.
Put a backslash followed by a literal newline:

sed 's/[,.!?] */&\
/g' /tmp/ports
sed 's/[,.!?] */\
/g' /tmp/ports
The 'two
lines' string works in all Bourne/Posix compatible...
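A runnable sketch of the portable form, with a made-up input line instead of /tmp/ports:

```shell
#!/bin/sh
# The replacement is a backslash followed by a real newline, which every
# POSIX sed accepts; \n in the replacement is a GNU sed extension.
printf 'foo, bar. baz\n' | sed 's/[,.!?] */\
/g'
```

Each punctuation mark (plus trailing spaces) is replaced by a line break, so the three words come out on separate lines.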
12,422
Posted By MadeInGermany
Straight forward:
find . -type d \( -path dir1 -o -path dir2 -o -path dir3 \) -prune -o -name '*rpm*' -print
Note that the default operator is -a, logical AND, which has higher precedence than -o, logical OR. So...
12,422
Posted By MadeInGermany
Doesn't it make more sense to have the -type f to the right of the -o?
Also, if -prune is true then it is printed, unless there is an action like -print or -exec on the other branch.
find / -path /u...
12,422
Posted By Chubler_XL
You don't want to match on -type d before the prune as the true condition negates the prune

I'd go with:

find / -path /u -prune -type f -o -name '*rpm*'
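A self-contained sketch of the prune-then-print pattern from this thread, using throwaway directories instead of the poster's real tree (dir1/dir2 and the .rpm names are made up):

```shell
#!/bin/sh
# -path ./dir1 -prune stops descent into dir1 (and prints nothing there,
# because the -print action sits on the other branch of -o); matches
# anywhere else are printed.
d=$(mktemp -d)
mkdir -p "$d/dir1" "$d/dir2"
touch "$d/dir1/foo.rpm" "$d/dir2/bar.rpm"
( cd "$d" && find . -path ./dir1 -prune -o -name '*rpm*' -print )
rm -rf "$d"
```

Only `./dir2/bar.rpm` is printed; `dir1` is never entered.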
12,803
Posted By apmcd47
GNU Zip will decompress .Z files, so
gunzip -c somefile.tar.Z | tar xf -
or
zcat somefile.tar.Z | tar xf -
should work without having to install the compress utility. It may even be possible to use GNU...
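A sketch of the first pipeline; `somefile.tar.Z` is the post's placeholder name, and gzip's decompressor detects the format from the file's magic bytes rather than its suffix:

```shell
#!/bin/sh
# Unpack a .tar.Z archive without the compress utility installed:
# gunzip -c writes the decompressed stream to stdout, tar reads it
# from stdin via "-".
gunzip -c somefile.tar.Z | tar xf -
```

`zcat somefile.tar.Z | tar xf -` is the same pipeline with gzip's zcat front end.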
12,803
Posted By stomp
tar (child): compress: Cannot exec: No such file or directory

As the slightly irritating message states: the compress utility is not found. You have to install it.

In Ubuntu the corresponding...
12,223
Posted By Don Cragun
Hi rdrtx1,
Unfortunately, the above script won't match 80%, 90%, or 100%. One could use:
awk '/dev/ && $5 ~ /([7-9][1-9]|(8|9|10)0)%/' infile
but I find the suggestions provided by...
12,223
Posted By rdrtx1
awk '$5 > 70' FS="[ %] *" infile
12,223
Posted By MadeInGermany
The trailing % sign makes $5 a string, and the > compares strings.
The +0 ensures that the > compares numbers.
Most awk versions will then ignore the trailing % sign.
But a few awk versions...
12,223
Posted By RavinderSingh13
Hello cokedude,

Could you please try the following (I haven't tested it, though).


awk '/dev/ && $5+0 > 70' Input_file


Thanks,
R. Singh
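A self-contained sketch of the `$5+0` trick from this thread, fed two made-up df-style lines instead of the real Input_file:

```shell
#!/bin/sh
# $5 is a string like "90%"; adding 0 coerces it to the number 90, so
# the > comparison is numeric.  Without +0, "90%" > 70 would compare
# the string "90%" against the string "70" instead.
printf '%s\n' \
    '/dev/sda1 10G 9G 1G 90% /' \
    '/dev/sdb1 10G 5G 5G 50% /home' |
awk '/dev/ && $5+0 > 70'
```

Only the 90% line survives the filter.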
1,253
Posted By Don Cragun
Certainly not a generic solution, but it seems to work for your coding style:
#!/bin/ksh
awk '
$1 ~ /^printf[(]/ {
  NoChange = $0
  Comment = "//" $0
  while($NF !~ /[)];$/) {
    getline...
1,134
Posted By gull04
Hi,

Try this;

for i in `ls t*`
do
a.out ${i}
done

Regards

Gull04
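A variation on the loop above, as a runnable sketch: letting the shell expand the glob directly does the same job as parsing `ls` output, and also survives filenames containing spaces. The `printf` line is a stand-in for running the poster's `a.out`:

```shell
#!/bin/sh
# Create a few throwaway t* files and loop over them via the glob.
d=$(mktemp -d)
cd "$d"
touch t1 t2 "t 3"
for i in t*
do
    printf 'would run: a.out %s\n' "$i"   # stand-in for: a.out "$i"
done
cd /
rm -rf "$d"
```

With `for i in \`ls t*\``, the name `t 3` would have been split into two bogus words.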
30,584
Posted By Scrutinizer
Yes, in awk you can direct your output to a file:
awk '
.....
some_condition {
print "success\n" $3 > "some_file"
}
.....
'
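A self-contained sketch of awk's file redirection, with a one-line stand-in for the program's output and `some_file` kept from the post:

```shell
#!/bin/sh
# Inside awk, > redirects print to the file named by the string on its
# right; the file is opened once and stays open for the script's run.
cd "$(mktemp -d)"
printf 'success x 42\n' |
awk '$1 == "success" { print "success\n" $3 > "some_file" }'
cat some_file
```

The file ends up containing `success`, a blank-separated line break from the embedded `\n`, and `42`.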
30,584
Posted By Aia
./a.out 50 5 4 |
awk '{
  if ($1 == "success")
  {
    print "success";
    print $3;
    print "Writing success" > "success.file"
  }
  if ($1 == "failure")
  ...
3,054
Posted By Don Cragun
No. Yes. Sure. And, no!

The output you showed us from the ipcs utility on your Fedora Linux distro does not conform to the standard in lots of ways. You need to keep the full script I gave you...
3,054
Posted By Don Cragun
We are not using different Linux distros. I am using a UNIX system where the output produced by the ipcs utility follows the formatting requirements specified by the standards.

If what you are...
Forum: AIX 06-19-2014
9,580
Posted By ibmtech
My two cents with rbattel: those are unread messages. Figure out why they are being generated by logging in as that user and running mail (as mentioned by rbattel).

Also, check your cron, see if you...
Showing results 1 to 25 of 47

 
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.