Search Results

Search: Posts Made By: criglerj
3,826
Posted By criglerj
A slight refinement to Perderabo's solution. ...
A slight refinement to Perderabo's solution. This is a bit more awk-like.

awk '$10 == "uu" { $10 = "@@" } { print }' adata

This can be generalized across multiple lines like this (in a...
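
A possible shape for that generalization (the second field/value pair below is invented for illustration, not from the original post): each pattern-action pair gets its own line, and the final { print } emits every record.

awk '$10 == "uu" { $10 = "@@" }
     $7  == "xx" { $7  = "##" }
     { print }' adata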
Forum: Open Source 08-19-2005
335,660
Posted By criglerj
I started using vi when I had an AT&T Unix PC...
I started using vi when I had an AT&T Unix PC (68010 with attached monitor, 20 Mb hard drive, SVR about 2) and vi was the only (reasonable) game in town. When I got a job using VMS (the Very...
4,812
Posted By criglerj
If the data you're looking for always shows up in...
If the data you're looking for always shows up in the same awk field, e.g., $4, and it's the only thing in that field, then you can speed it up as r2007 suggested, by only checking that field and by...
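
For instance (the field number and target string here are made up), a plain equality test on one field avoids regex-matching the whole line:

awk '$4 == "ERROR" { print }' logfile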
4,812
Posted By criglerj
nawk -f prueba.awk borrar
nawk -f prueba.awk borrar
4,812
Posted By criglerj
vgersh99's solution is good, but it might be...
vgersh99's solution is good, but it might be possible to refine it a bit. Assuming a nawk/gawk (likewise untested):
k = match($0,/0x000000(01|0B|45|58|64|66)/) {
arr[substr($0, k + 8, 2)]++
}...
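
A guess at how the truncated program might be completed (the END block and the input file name are assumptions, and like the original it is untested):

nawk 'k = match($0, /0x000000(01|0B|45|58|64|66)/) {
    arr[substr($0, k + 8, 2)]++
}
END { for (code in arr) print code, arr[code] }' datafile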
4,970
Posted By criglerj
Um, it doesn't work in csh because "if [ $SHELL...
Um, it doesn't work in csh because "if [ $SHELL != /bin/sh ]; then" is not valid csh syntax.
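
For comparison, the equivalent test in csh syntax would look roughly like this (a sketch, not from the original thread):

if ( "$SHELL" != "/bin/sh" ) then
    echo "not plain sh"
endif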
2,797
Posted By criglerj
Yikes! Okay, here's one approach. I'm assuming...
Yikes! Okay, here's one approach. I'm assuming you're showing us one line (record) from either file.

In your BEGIN section, gather all the field 3's from the first file into an array...
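
A sketch of that BEGIN-section approach, borrowing the file names, the pipe delimiter, and the field positions that appear later in this thread (all of which are assumptions here):

awk -F'|' 'BEGIN {
    while ((getline line < "output.txt") > 0) {
        split(line, f, "|")
        arr[f[3]] = 1
    }
    close("output.txt")
} $30 in arr' assets.dat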
2,797
Posted By criglerj
Right. FNR is the record number in the current...
Right. FNR is the record number in the current file; NR is the cumulative record number.

Are these one-line files? If so this is acquiring that HW smell... Also, if these are one-line files,...
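
A quick way to see the FNR/NR difference (file names are placeholders):

awk '{ print FILENAME, "FNR=" FNR, "NR=" NR }' file1 file2

FNR restarts at 1 when awk opens file2, while NR keeps counting across both files.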
2,797
Posted By criglerj
awk -F\| 'FNR==NR{arr[$3]=1;RS=";";next};$30 in...
awk -F\| 'FNR==NR{arr[$3]=1;RS=";";next};$30 in arr' output.txt assets.dat

But this still will never be executed in the second file.
2,797
Posted By criglerj
You said "record separator" but then you...
You said "record separator" but then you specified the field separator.
The manpage tells you that RS and FS are just predefined variables with default values. The setting from the command line...
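
For instance (the data and field number are invented), either of these assigns both separators before any input is read:

awk 'BEGIN { RS = ";"; FS = "|" } { print $3 }' data
awk -v RS=';' -v FS='|' '{ print $3 }' data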
Forum: BSD 03-30-2005
4,777
Posted By criglerj
Anyone using NetBSD pkgsrc on FreeBSD?
I confess, I'm getting a hair frustrated with the FreeBSD ports system. Is there a reasonably headache-free way of migrating all my installed apps to pkgsrc? Is it advisable or not? Or should I...
2,357
Posted By criglerj
Your syntax is fine: You're creating an array...
Your syntax is fine: You're creating an array called @var, indexed from 0 to 2, and printing the middle element. To do this from the command line:

perl -e 'my @var = ("abc","efg","hij"); print...
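
The truncated one-liner presumably finishes along these lines (the print argument is a guess):

perl -e 'my @var = ("abc","efg","hij"); print $var[1], "\n"'

which prints efg, the middle element.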
2,196
Posted By criglerj
Given that you're going through log files, the...
Given that you're going through log files, the $UserId and $xxx probably appear in the same relative places, e.g., "fred <stuff> date <stuff> query status". Just replace "<stuff>" with ".*" and wrap...
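
Something like this, assuming $UserId holds the user and $xxx the date (the log layout and file name here are invented):

grep "$UserId.*$xxx.*query status" logfile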
Forum: BSD 03-28-2005
10,504
Posted By criglerj
It will be easier to answer if we know what you...
It will be easier to answer if we know what you did do.
11,853
Posted By criglerj
I'm taking this to mean there are zero lines in...
I'm taking this to mean there are zero lines in the resulting output. (This is a different question than whether a given userid exists.)

Okay, this is where a temporary file begins to make sense:...
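
One way that test might look (the file names are placeholders); [ ! -s file ] is true when the file is empty or missing:

grep "$UserId" logs.tmp > matches.tmp
if [ ! -s matches.tmp ]; then
    echo "No log entries for $UserId"
fi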
11,853
Posted By criglerj
Do you mean you want to break out of the loop if...
Do you mean you want to break out of the loop if there is no log entry for a given user?
11,853
Posted By criglerj
cat-ting to a temporary file
Your 'cat * > tempfile' is a classic UUOC (useless use of cat). There's nothing wrong with making '*' (without the apostrophes so it gets expanded) the "argument" to the first command in your...
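
In other words (pattern invented for illustration), instead of

cat * > tempfile
grep "$UserId" tempfile

let the glob feed the command directly:

grep "$UserId" *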
11,853
Posted By criglerj
With all the following, YM,aa,MV. If I...
With all the following, YM,aa,MV.

If I understand your problem correctly, you are trying to find all the log entries on a given date for a given user, then send an email to someone (probably...
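
A rough sketch of that overall shape (every name below is a placeholder):

grep "$the_date" /var/log/app.log | grep "$UserId" |
    mail -s "log entries for $UserId" someone@example.com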
1,776
Posted By criglerj
Or even awk 'NF>colcount { colcount = NF} ...
Or even

awk 'NF>colcount { colcount = NF}
{for(i=1; i<=NF;i++) sum[i] += $i}
END {for(i=1; i<=colcount; i++) print "column", i, sum[i]}'
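
As a quick sanity check (input invented), feeding it the two rows "1 2" and "3 4 5":

printf '1 2\n3 4 5\n' | awk 'NF>colcount { colcount = NF}
{for(i=1; i<=NF;i++) sum[i] += $i}
END {for(i=1; i<=colcount; i++) print "column", i, sum[i]}'

prints column 1 4, column 2 6, and column 3 5.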
1,949
Posted By criglerj
This is true for every awk and cut I know about,...
This is true for every awk and cut I know about, not just the GNU versions.
57,614
Posted By criglerj
I'd modify Bab00shka's solution in one tiny...
I'd modify Bab00shka's solution in one tiny particular:

awk -v "Fred=$Bob" -v "Joe=$Sue" ........rest of awk statement

In my experience, you can assign strings that have embedded spaces in them...
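
For example (variable contents invented):

Bob="two words"
awk -v "Fred=$Bob" 'BEGIN { print Fred }'

prints "two words" with the embedded space intact.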
3,035
Posted By criglerj
One performance refinement to google's method...
One performance refinement to google's method when the number of files is large is to do something like this:

find . -type f | xargs egrep '1234567|1223574|5243673|2356464'

There are three...
1,814
Posted By criglerj
But if you have "hundreds of files," you might be...
But if you have "hundreds of files," you might be hurt by your shell expanding this to a command line longer than it can tolerate. Here, xargs comes to the rescue:

ls ab* | xargs grep -i <search...
1,942
Posted By criglerj
If you have a lot of files
If you anticipate a lot of files --- 20? 50? 100? 1000? --- you might consider (using Perderabo's post as a starting point)

find /pc62/exports \( ! -name exports -prune \) \
! -name "ABC*.Z"...
30,800
Posted By criglerj
But if your filenames might have spaces ...
If your filenames might have spaces, I think this is the best solution:

find "$DIR" -type f -atime +7 -print | while IFS= read -r x; do
    echo "$x"
    rm "$x"
done

If you're running this as a cron...
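
As an aside (not from the original post): on systems whose find and xargs support -print0 and -0, the null-delimited form also survives newlines in filenames:

find "$DIR" -type f -atime +7 -print0 | xargs -0 rm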