Search Results

Search: Posts Made By: ailnilanjan
2,802
Posted By Skrynesaver
Solaris 5_10 comes with Perl 5.8 by default which...
Solaris 5.10 comes with Perl 5.8 by default, which includes the Time::Local module.

This allows conversion to epoch time:

perl -MTime::Local -ne '...
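A minimal sketch of that conversion (untested, and assuming the YYYYMMDDHHMM timestamp format discussed in this thread rather than the poster's exact one-liner):

# Assumption: timestamps look like YYYYMMDDHHMM somewhere on each line of "file"
perl -MTime::Local -ne \
    'if (/(\d{4})(\d{2})(\d{2})(\d{2})(\d{2})/) { print timelocal(0, $5, $4, $3, $2 - 1, $1), "\n" }' file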
2,802
Posted By rbatte1
The best way would probably be to:- Convert...
The best way would probably be to:-

Convert this to seconds since the Epoch (197001010000 in the same format)
Add the required seconds to change the timezone
Convert back to your format....
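A rough sketch of those three steps as one command (untested; the YYYYMMDDHHMM format and the fixed shift of +3600 seconds are assumptions for illustration only):

# Assumptions: timestamps look like YYYYMMDDHHMM and the timezone shift is a fixed +3600s
perl -MTime::Local -MPOSIX=strftime -ne '
    if (/(\d{4})(\d{2})(\d{2})(\d{2})(\d{2})/) {
        my $epoch = timelocal(0, $5, $4, $3, $2 - 1, $1);         # 1. to seconds since the Epoch
        $epoch += 3600;                                           # 2. add the required seconds
        print strftime("%Y%m%d%H%M", localtime $epoch), "\n";     # 3. back to the original format
    }' file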
736
Posted By Don Cragun
You could also try something like: awk ' ...
You could also try something like:
awk '
BEGIN { FS = OFS = ";"
}
FNR == NR {
a[$1,$2] = $0
next
}
($1,$2) in a {
$0 = a[$1,$2]
}
1' file1 file2
which produces the output you...
736
Posted By RavinderSingh13
Hello, Following may help you in same. ...
Hello,

The following may help you with the same.

awk -F";" 'FNR==NR{X[$1 FS $2]=$3;next} ($1 FS $2){if(X[$1 FS $2]){print $1 FS $2 FS X[$1 FS $2]} else {print $1 FS $2}}' File1 File2


Output will...
1,799
Posted By RudiC
Tryawk 'NR==FNR {T[$12]=$28 ...
Try
awk 'NR==FNR {T[$12]=$28
next
}
$19 in T {sub (/tre/, "RSL", $29)
if ($29 != T[$19]) print...
1,799
Posted By RudiC
Quoting Don Cragun:
Quoting Don Cragun:
1,799
Posted By Skrynesaver
You can call Perl within a bash script, just as...
You can call Perl within a bash script, just as you can call sed or awk; the example above, with the same quotes, could be dropped into a script as a single command. (Though a few comments might be...
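A minimal illustration of that point (a made-up script, not a command from this thread): the one-liner goes into the script verbatim, single quotes and all, exactly like a sed or awk call would.

#!/bin/bash
# Hypothetical wrapper: run a Perl one-liner over the file named as the first argument.
infile=${1:?usage: $0 file}
perl -ne 'print if /ERROR/' "$infile"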
1,799
Posted By Skrynesaver
So something like the following WARNING UNTESTED...
So something like the following WARNING UNTESTED OFF THE TOP OF MY HEAD CODE FOLLOWS:

perl -ne 'BEGIN {open file1_fh, "<", "/path/to/file1";
while(<file1_fh>){@r=split/;/,$_;
push...
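A guess at the overall shape that snippet was heading towards (an untested sketch, not the poster's finished code): load file1 into a lookup table keyed on its first two ';'-separated fields inside BEGIN, then check each line of file2 against it.

perl -ne '
    BEGIN {
        # hypothetical path, as in the snippet above
        open my $fh, "<", "/path/to/file1" or die $!;
        while (<$fh>) { chomp; my @r = split /;/; $seen{ join ";", @r[0,1] } = $_ }
    }
    chomp;
    my $key = join ";", (split /;/)[0,1];
    if (exists $seen{$key}) { print $seen{$key}, "\n" } else { print $_, "\n" }
' file2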
1,589
Posted By RudiC
Don't use double quotes around the filenames...
Don't use double quotes around the filenames (i.e. file[12]) as they need to be expanded by the shell. Assuming a typo in your samples' first lines, I'd suggest "replacing field 1 in file 2 with...
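A quick illustration of the quoting point (assuming file1 and file2 exist in the current directory):

echo file[12]      # unquoted: the shell expands the pattern to   file1 file2
echo "file[12]"    # quoted: the literal string   file[12]   is passed through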
1,589
Posted By Don Cragun
You could try something like: awk ' BEGIN...
You could try something like:
awk '
BEGIN { FS = OFS = ";"
}
FNR == NR {
x[$1] = $2
next
}
$1 in x {
$2 = x[$1]
}
1' file[12]
which (with your sample files) produces the output:
cellRef...
2,521
Posted By hanson44
I read it as "n == 3". But you're right, it's not...
I read it as "n == 3". But you're right, it's not really clear.

If the intent is "n >= 3", then you either:
- Use Scrutinizer's shorter solution designed for "n >= 3", or
- Change == 3 to >=...
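As a sketch of that change (hanson44's awk from further down in these results, with the comparison loosened to three or more occurrences):

awk '{cnts[$1]++} $2 > fld2[$1] {fld2[$1]=$2; fld0[$1]=$0} END {for (key in cnts) if (cnts[key] >= 3) print fld0[key]}' input3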
2,521
Posted By Scrutinizer
That should work, what is your OS and version? ...
That should work; what is your OS and version?


---
Alternative for 3 or more occurrences, not exactly 3 (not sure what you require):
awk '$2>=M[$1]{M[$1]=$2} ++A[$1]==3{print $1,M[$1]}' file
...
2,521
Posted By hanson44
Here is a way to do it using awk. $ awk...
Here is a way to do it using awk.
$ awk '{cnts[$1]++} $2 > fld2[$1] {fld2[$1]=$2; fld0[$1]=$0} END {for (key in cnts) { if (cnts[key] == 3) print fld0[key] }}' input3
A 5
B 5
C 10
16,855
Posted By alister
printf '%s\n' 1,-500d w q | ed -s file...
printf '%s\n' 1,-500d w q | ed -s file 2>/dev/null

Regards,
alister
16,855
Posted By RudiC
$ tac file | awk 'NR<=500' | tac
$ tac file | awk 'NR<=500' | tac
16,855
Posted By MadeInGermany
For academic interest, I have bug-fixed my 2nd...
For academic interest, I have bug-fixed my 2nd sample.
It also makes use of NR instead of an extra variable.

Scrutinizer presented a fix for Jotne's 1st sample
awk '(s-NR)<500' s="$(wc -l...
16,855
Posted By Yoda
Another approach: [ $( wc -l < file ) -gt 500 ]...
Another approach:
[ $( wc -l < file ) -gt 500 ] && tac file | head -500 | tac
16,855
Posted By Scrutinizer
Not just academic interest. With some...
Not just academic interest. With some implementations of tail the internal buffer is so small that it may not be able to handle 500 lines...


--
Alternative circular buffer (on Solaris use...
16,855
Posted By MadeInGermany
Oh sorry, then it's simply tail -500 fileFor...
Oh sorry, then it's simply
tail -500 file
For academic interest, the circular buffer can still be used, but it must be printed at the very end:

awk '
{s[NR%n]=$0}
END {
for (i=NR+1;i<=NR+n;i++)...
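A guess at the complete shape of that idea (untested, not the post's exact code): fill a rolling n-line buffer while reading, then dump it oldest-first in END.

awk -v n=500 '
    { s[NR % n] = $0 }                        # keep only the most recent n lines
    END {
        for (i = NR + 1; i <= NR + n; i++)    # walk the window oldest-first
            if ((i % n) in s) print s[i % n]
    }' file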
16,855
Posted By Jotne
This removes the 500-1 last line. He like to...
This removes the last 500-1 lines. He wants to keep the last 500 lines, not delete them.
16,855
Posted By MadeInGermany
Most efficient is to read the file once and store...
The most efficient approach is to read the file once and store the lines in a circular buffer.
An attempt with awk:

awk '
{s[i++]=$0}
{i=i%500}
(i in s){print s[i]}
' file

Perl with its compact arrays...
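A guess at where that Perl remark was heading (a sketch, not the original post): let an array be the buffer and print each line as it falls out, which, like the awk above, suppresses roughly the last 500 lines.

perl -ne 'push @buf, $_; print shift @buf if @buf > 500' file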
16,855
Posted By Jotne
This is how to delete tail -n 500 oldfile >...
This is how to delete
tail -n 500 oldfile > newfile
newfile then contains what you want.
16,855
Posted By Jotne
cat tone two three four five six ...
cat t
one
two
three
four
five
six
seven
eight
nine
ten

tail -n 3 t
eight
nine
ten

awk '(s-NR)<3' s=$(cat t | wc -l) t
eight
nine
ten
16,855
Posted By RudiC
man tail
man tail
16,855
Posted By Jotne
Maybe some like this works awk '(s-NR)<500'...
Maybe something like this works
awk '(s-NR)<500' s=$(cat file | wc -l) file
Showing results 1 to 25 of 25

 