Search Results

Search: Posts Made By: acsg
4,608
Posted By acsg
Thanks for all your help yazu!! :b: Is it...
Thanks for all your help yazu!! :b:

Is it possible to do it the other way (keep track of the number of lines between repetitions and then make an avg)?

-----------------
I guess I only have to...
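
The thread is about a Perl script, but the bookkeeping being asked for here (lines between repetitions, averaged per distinct line) is easy to sketch in awk; whole lines are compared verbatim in this sketch, and inputfile is only a placeholder:

awk '
{
    if ($0 in last) {                  # this exact line has appeared before
        gapsum[$0] += NR - last[$0]    # lines since its previous occurrence
        gapcnt[$0]++
    }
    last[$0] = NR                      # remember where it was last seen
}
END {
    for (line in gapcnt)
        printf "%s: %d occurrences, on average every %.1f lines\n", line, gapcnt[line] + 1, gapsum[line] / gapcnt[line]
}' inputfile
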
4,608
Posted By acsg
There is another small problem I found. The...
There is another small problem I found. The record it keeps is static, meaning it should count the seconds since the last appearance, but what it's doing right now is counting the seconds since the...
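
A rough sketch of the fix being described, assuming (since the real format is only partly shown) that the first field is the epoch timestamp and the rest of the line is what is being tracked:

awk '
{
    ts  = $1
    key = $0
    sub(/^[^ \t]+[ \t]+/, "", key)   # strip the leading timestamp; key on the rest of the line
    if (key in last)
        print key, ts - last[key]    # seconds since the LAST appearance
    last[key] = ts                   # refresh on every occurrence, not only the first
}' inputfile
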
4,608
Posted By acsg
I just tried the algorithm and it works for the...
I just tried the algorithm and it works for the example input file but for my actual file, there are a couple of problems.

The input lines in my original file are of the form:
1301892853.870 ...
4,608
Posted By acsg
Thanks for your reply. Yeah what I want is...
Thanks for your reply.
Yeah what I want is something like what you said. So, for your example input file, the output would be:

a- 4 2
b- 2 3
c- 1 0
d- 2 1

the first field being the...
4,608
Posted By acsg
I'm sorry I think I wasn't clear enough. I'd...
I'm sorry I think I wasn't clear enough.

I'd like the average of "every how many lines a certain line is repeated". So say that the line

a b c d e

is repeated first every 2 lines, then the...
4,608
Posted By acsg
Perl- Finding average "frequency" of occurrence of duplicate lines
Hello,

I am working with a perl script that tries to find the average "frequency" in which lines are duplicated. So far I've only managed to find the way to count how many times the lines are...
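
For reference, the counting part that is said to be working already is typically a one-liner along these lines (an illustrative sketch, not the thread's actual script):

awk '{ count[$0]++ } END { for (line in count) if (count[line] > 1) print count[line], line }' inputfile
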
1,476
Posted By acsg
Perl- Nested 'for' order change
Hello,

I'm quite new to perl, so my question is rather basic, and I know there is probably a simple way around it, but I can't seem to find it.

I have a medium-length code and there is a part that...
4,378
Posted By acsg
Thank you!! This seems to do the trick but I...
Thank you!! This seems to do the trick, but I don't quite understand how it does it... could you please explain what the !(NR%7) is for, and why you used 'delete'?
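
In short: NR is awk's running line counter, so NR % 7 is zero exactly on every 7th line, making !(NR%7) true only at the end of each 7-line group, and delete empties the array holding the current group's bookkeeping before the next group starts. A tiny illustrative sketch (variable names made up here, not taken from the thread):

{ seen[$1] = 1 }                 # note each first field seen in the current 7-line group
!(NR % 7) {                      # true only when NR is a multiple of 7, i.e. on every 7th line
    n = 0
    for (k in seen) n++
    printf "lines %d-%d: %d distinct values in column 1\n", NR - 6, NR, n
    delete seen                  # empty the per-group array (a widely supported awk extension)
}
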
4,378
Posted By acsg
Hello, The desired output is for the whole...
Hello,

The desired output is for the whole input... meaning that I want to count, for example, the fact that channel 2513 appears in all 3 'packets' (groups of 7 lines).
4,378
Posted By acsg
awk- looping through groups of lines
Hello,

I'm working with a file that has three columns. The first one represents a certain channel and the third one a timestamp (second one is not important). Example input is as follows:

2513 ...
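
Putting the posts of this thread together, one possible sketch that counts, for each channel in column 1, how many 7-line packets it appears in (the packet size and column layout are taken from the posts; inputfile is a placeholder):

awk '
{ inpacket[$1] = 1 }                 # mark the channel as present in the current packet
!(NR % 7) {                          # every 7th line closes a packet
    for (c in inpacket) packets[c]++
    delete inpacket
}
END {
    for (c in inpacket) packets[c]++ # count a trailing partial packet, if any
    for (c in packets) print c, packets[c]
}' inputfile
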
13,592
Posted By acsg
It's perfect now, thanks Franklin52! :b:
It's perfect now, thanks Franklin52! :b:
13,592
Posted By acsg
Hmm yeah but the point is that the original...
Hmm yeah but the point is that the original output after the awk that got deleted

awk '{print $1, p += $2}'
was as follows:

0 18767
0.000999928 105950
0.00100017 144925
0.00199986 225315...
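
For context, p += $2 keeps a running total, so that one-liner prints column 1 unchanged and turns column 2 into a cumulative sum. For example:

printf '0 1\n1 2\n2 3\n' | awk '{print $1, p += $2}'
# prints:
# 0 1
# 1 3
# 2 6
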
13,592
Posted By acsg
I'm actually using quite a huge amount of...
I'm actually using quite a huge amount of information; the output looks something like this:

0 18767
0.000999928 87183
0.00100017 38975
0.00199986 80390
0.00200009 ...
13,592
Posted By acsg
Doesn't work like it used to with the other awk...
Doesn't work like it used to with the other awk before the pipe... it prints the first and second columns without any change.
13,592
Posted By acsg
I'm a bit confused... Is this only for the last...
I'm a bit confused... Is this only for the last awk, or does it include the previous one as well?

awk '{print $1, p += $2}'
13,592
Posted By acsg
I'm not too sure how to pass the file as...
I'm not too sure how to pass the file as "inputfile" because I'm using a pipe from a previous awk result as an input... so I'm basically doing a whole bunch of awks, and this would be the last one:
...
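
A pipe can only be read once, so the usual pass-the-file-twice trick does not apply here; one workaround is to buffer the piped lines and do the second pass in END. A sketch, assuming purely for illustration (the post is truncated) that the goal is to print column 1 unchanged and column 2 divided by the column-2 total:

... | awk '
{
    col1[NR] = $1; col2[NR] = $2   # buffer the piped input in memory
    total += $2                    # accumulate what the second pass will need
}
END {
    for (i = 1; i <= NR; i++)      # second pass over the buffered lines
        print col1[i], col2[i] / total
}'
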
13,592
Posted By acsg
awk- reading input file twice
Hello,

I've been trying to come up with a solution for the following problem; I have an input file with two columns and I want to print as an output the first column without any changes but for...
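
When the input is a regular file rather than a pipe, the standard awk idiom for reading it twice is to name the file twice and use NR == FNR to tell the two passes apart. A sketch under the same illustrative assumption as above (column 2 scaled by its total; the real transformation is truncated in the post):

awk '
NR == FNR { total += $2; next }   # first pass over the file: only accumulate
{ print $1, $2 / total }          # second pass: column 1 unchanged, column 2 scaled
' inputfile inputfile
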
2,765
Posted By acsg
It's working now, thank you so much for your help...
It's working now, thank you so much for your help and time mirni, |UVI| and summer_cherry :o
2,765
Posted By acsg
Hi mirni, Thanks for the reply. It works...
Hi mirni,

Thanks for the reply. It works well overall, but the problem is that it seems to detect every repeated number in the second field as a discontinuity as well, so say that this is the input (if the...
2,963
Posted By acsg
back with a little extra question... is it...
back with a little extra question...
is it possible to print the output so that it prints all the lines that contain the same first field together, then the next first field, etc. (without doing a...
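
One possible way to get that grouped output is to collect the lines per first field and print them key by key at the end. A rough sketch (it assumes the whole input fits in memory; inputfile is a placeholder):

awk '
{
    group[$1] = group[$1] $0 "\n"     # append each line to its first-field bucket
    if (!seen[$1]++) order[++n] = $1  # remember the order in which keys first appear
}
END {
    for (i = 1; i <= n; i++)          # one key's lines, then the next key's, and so on
        printf "%s", group[order[i]]
}' inputfile
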
2,765
Posted By acsg
I tried the new code with this input: 160 ...
I tried the new code with this input:

160 1
160 2
160 3
160 4
160 6 <-- **
160 7
160 8
160 9
160 10
160 10
160 11
160 12
160 13
160 14
160 ...
2,765
Posted By acsg
:D Thank you so much!! You were extremely helpful.
:D Thank you so much!! You were extremely helpful.
2,765
Posted By acsg
It doesn't seem to work, it doesn't print...
It doesn't seem to work; it doesn't print anything... Are the first couple of instructions supposed to be wrapped in a BEGIN statement?

---------- Post updated at 11:42 AM ---------- Previous...
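
On the BEGIN question: statements that should run once before any input is read (initialising variables, setting FS, and so on) do need to go in a BEGIN block; an action block without a pattern runs for every input line instead, and bare statements outside any block are a syntax error. A generic illustration (the names here are made up, not from the thread):

awk '
BEGIN { prev = "" }                   # runs once, before any input line is read
{                                     # runs for every input line
    if ($1 != prev) print "new first field:", $1
    prev = $1
}' inputfile
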
2,765
Posted By acsg
awk- comparing fields from the same column, finding discontinuities.
Hello,

I have a file with two fields. The first field repeats itself for quite a while but the second field changes. What I want to do is to go through the first column until its value changes...
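
Taking this problem statement together with the follow-up earlier on this page (repeated values in the second field should not count), one way to flag jumps within a run of identical first-field values could look like this; the field meanings are assumed from the posts and inputfile is a placeholder:

awk '
$1 == prev1 {                       # still inside the same run of the first field
    d = $2 - prev2
    if (d > 1)                      # a jump of more than 1 is reported as a discontinuity
        printf "discontinuity for %s: %s -> %s (line %d)\n", $1, prev2, $2, NR
                                    # d == 0 (a repeated value) is deliberately not flagged
}
{ prev1 = $1; prev2 = $2 }          # remember the current line for the next comparison
' inputfile
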
2,963
Posted By acsg
Thanks cgkmal! It works :) Now, just to be...
Thanks cgkmal! It works :)

Now, just to be sure, the outputs could be treated separately by another script, right? I printed them out to a file in a spreadsheet, just to see what they'd look like,...
Showing results 1 to 25 of 28

 