Filtering ArcServe Backup logs with awk


 
# 1  
Old 08-24-2009
Filtering ArcServe Backup logs with awk

Hey guys, how ya doin'?
I'm fighting with a filter for my ArcServe backup logs.
Every day I collect the logs from my servers, and that's where my problem starts.
I have several versions of ArcServe for Windows, and each one saves the backup log in a different way.
I just want to collect some fields that are important to me and export them to a file that I'll later make available through an internal web page.
I'm using awk, but so far no combination of parameters has solved my problem.
You can see that I attached 2 files: one is my problem itself, and the other is the layout I'm looking to export.
I appreciate any kind of help.

See ya!

# 2  
Old 08-24-2009
Please change the subject of your thread to English. This forum's language is English, ty.

Would you mind giving a hint as to which fields and patterns you are looking for? Searching and guessing from your output file is no fun and not accurate.
# 3  
Old 08-24-2009
zaxxon,
Thanks for your hint. The thread's subject has been changed.
I attached those 2 files thinking it would make my thoughts easier to explain, but no problem, I'll show what I'm looking for.
The main problem with the ArcServe log is that the fields vary in size and number. I mean, if you take one line of the log you will see, for example, 5 fields (thinking in awk terms); take another line and you will see 8 fields. Those are the problems I've found when manipulating ArcServe log files.
But check out an example:

**ArcServe
Code:
[08/23/2009-01:00:08 ,1,135,0,0,-1,2,18,0,0] Run Backup Job Scheduled for  8/23/09 at  1:00 AM.

[08/23/2009-11:46:40 ,1,0,0,0,-1,2,3,0,0] [JOBQUEUE]: Status changed from Active to Ready [Job No: 2] [Description: INTRAPROD01]


** What I'm looking for
Code:
23/08/2009	01:00:08	Run Backup Job Scheduled for  8/23/09 at  1:00 AM.
23/08/2009	11:46:40	[JOBQUEUE]: Status changed from Active to Ready [Job No: 2] [Description: INTRAPROD01]

Now, imagine that I have a single log file from ArcServe which grows after every backup. The problem is collecting only the information from the last job.

I hope I made myself clear. If anything is unclear, please ask me.
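
To give an idea of the "last job" part: I imagine something like a two-pass approach, where the first pass remembers where the last job starts and the second pass prints from there to the end. Just a rough sketch, untested; I'm assuming here that a "Run Backup Job" line marks the start of a job:
Code:
# pass 1 (NR==FNR): remember the line number of the last line that
# starts a job; pass 2: print from that line to the end of the file.
# If no marker is found, start stays 0 and the whole file is printed.
awk 'NR==FNR { if (/Run Backup Job/) start=FNR; next } FNR>=start' ArcServe.log ArcServe.log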

# 4  
Old 08-24-2009
Something to start with...

nawk -f metal.awk ArcServe.log

metal.awk:
Code:
BEGIN {
  OFS="\t"    # separate output fields with tabs
}
NF {          # skip empty lines
   # position of the "]" that closes the bracketed prefix
   pivot=index($0,"]")
   # grab the "MM/DD/YYYY-HH:MM:SS" stamp between the "[" and the first
   # blank, and split it on "-" into date (a[1]) and time (a[2])
   split(substr($0, 2,index($0," ")-1),a, "-")
   # print: date TAB time TAB everything after the "]"
   print a[1] OFS a[2] OFS substr($0,pivot+1)
}
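
As an aside, the desired layout in post #3 shows the date day-first (23/08/2009) while the log stores it month-first. If that ordering matters, a small variation of the same idea could reorder the date. A sketch, checked only against the two sample lines above:
Code:
BEGIN { OFS = "\t" }
NF {
   pivot = index($0, "]")
   # timestamp between the "[" and the blank that precedes the comma list
   split(substr($0, 2, index($0, " ") - 2), a, "-")
   split(a[1], d, "/")                 # d[1]=month, d[2]=day, d[3]=year
   print d[2] "/" d[1] "/" d[3], a[2], substr($0, pivot + 2)
}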

# 5  
Old 08-25-2009
Thanks vgersh99,
This script simply worked perfectly.

But now, let me ask you something...
Imagine that I have just one backup log per server. Let's call it ArcServe.log.
To run your script, what I need now is to separate the output per job. Is it possible to make a filter using some key words?
Code:
[08/29/2009-01:00:01 ,1,0,0,0,-1,2,3,0,0] [JOBQUEUE]: Status changed from Ready to Active [Job No: 2] [Description: INTRAPROD01]
[08/29/2009-01:00:04 ,1,147,0,0,-1,2,18,0,0] Run Backup Job Scheduled for  8/29/09 at  1:00 AM.
[08/29/2009-01:00:04 ,1,147,0,0,-1,2,18,0,0] Start Backup Operation. (QUEUE=1, JOB=2)
[08/29/2009-01:29:04 ,1,147,0,0,-1,2,18,0,0] Backup Operation Successful.
[08/29/2009-01:29:10 ,1,147,1,0,-1,2,18,0,0] [CAT] SATURDAY [ID:AA25,SESSION:1] is merged.(files=956)
[08/29/2009-01:29:52 ,1,147,2,0,-1,2,18,0,0] [CAT] SATURDAY [ID:AA25,SESSION:2] is merged.(files=2095)
[08/29/2009-01:29:54 ,1,147,0,0,-1,2,18,0,0] Run Command: C:\Program Files\CA\BrightStor ARCserve Backup\LOG\coletalog.bat.
[08/29/2009-01:29:54 ,1,147,0,0,-1,2,18,0,0] Reschedule Backup Job for  8/30/09 at  1:00 AM.
[08/29/2009-01:29:54 ,1,0,0,0,-1,2,3,0,0] [JOBQUEUE]: Rescheduled for  8/30/09 at  1:00 AM [Job No: 2] [Description: INTRAPROD01]
[08/29/2009-01:29:54 ,1,0,0,0,-1,2,3,0,0] [JOBQUEUE]: Status changed from Active to Ready [Job No: 2] [Description: INTRAPROD01]

Note that the code above starts with the beginning of a backup job's log, and the last line is the end of that job's log. There is a lot of information between those lines, but the beginning and the end always look roughly like that.

Code:
[08/29/2009-01:00:01 ,1,0,0,0,-1,2,3,0,0] [JOBQUEUE]: Status changed from Ready to Active [Job No: 2] [Description: INTRAPROD01]
...
[08/29/2009-01:29:54 ,1,0,0,0,-1,2,3,0,0] [JOBQUEUE]: Status changed from Active to Ready [Job No: 2] [Description: INTRAPROD01]

The important tags are "[JOBQUEUE]: Status changed from XXXXX to XXXXX ....."
Those tags mark the start and the end of the job. So, how do I extract the content of the job that sits between those tags?

Any idea for that output?
Thanks again for helping me.
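
To make the idea concrete, this is roughly the kind of filter I have in mind (a sketch only, untested on the full logs, assuming every job opens with a "Ready to Active" line and closes with an "Active to Ready" line, as in the sample above):
Code:
# awk range pattern: print every line from a "Ready to Active" marker
# up to and including the next "Active to Ready" marker
awk '/\[JOBQUEUE\]: Status changed from Ready to Active/,/\[JOBQUEUE\]: Status changed from Active to Ready/' ArcServe.log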
