UNIX for Dummies Questions & Answers: Trying to figure out a log dump command
Post 302409526 by MrEddy, Thursday, 1 April 2010, 12:25 PM
Trying to figure out a log dump command

OK, so I'm relatively new to UNIX and I'm trying to figure out how to build a log dump command.

My situation is a bit odd in that I'm always looking at customers' boxes, so I can't really do much to them. Everything I do in UNIX pretty much has to be a command I can type in by hand. I can't install any apps to do things for me, since that would mean modifying the customer's box. So I'm limited to the software already on the box. The systems run FreeBSD with Bash.

What I'm trying to do is build a command that pulls data for a specific date and time out of a bunch of log files in different places, then dumps all of that information into a single log file that I can save and look at. That would be a big time saver: instead of manually digging my way through 10 log files, I could run one command, get all the data I need in one file, and dig through that.

I'm sure I could just chain together a great big long string of: cat /folder/file | grep 'date' >> outputfile.

But that would require going through the entire command each time to adjust the date and time information, which is going to change constantly. So I was hoping I could find a way to use variables somehow: write the command to read the date and time from variables, so that all I have to do is adjust the variables and the command will work. I suspect awk can do this, but really I'm just trying to figure out what kind of ideas and options I have for doing something like this.

I would appreciate any suggestions on how I could accomplish this task.
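One possible approach, as a minimal sketch only: the log paths, date, and hour below are placeholder assumptions, not the actual paths on any customer box; it can be pasted by hand since it needs nothing installed.

```shell
#!/bin/sh
# Sketch: pull all lines for one date/hour out of several logs into one file.
# D, T, and LOGS are placeholders -- set them per incident.
D='Apr  1'          # note the two spaces: syslog pads single-digit days
T='12:'             # any minute within the 12:00 hour
LOGS='/var/log/messages /var/log/maillog /var/log/auth.log'
OUT="/tmp/logdump.$$"

for f in $LOGS; do                     # unquoted on purpose: split the list
    [ -r "$f" ] || continue            # skip logs missing on this box
    echo "==== $f ====" >> "$OUT"
    grep "^$D $T" "$f" >> "$OUT"       # quoting "$D" keeps both spaces intact
done
echo "Collected output in $OUT"
```

Changing the incident just means re-assigning D and T; the rest of the command line never changes.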

---------- Post updated at 11:25 AM ---------- Previous update was at 10:37 AM ----------

OK, so I can grep for variables, which is good, and I've found how to set them:
Code:
testvariable='May'; testvariable2='June'

This will allow me to easily set any number of variables, and then simply grep the specific log files for date and time. The only snag I'm running into is that I can't get multiple spaces into the date. In my logs the date is:

Apr  1

with two spaces, not one, and any time I use the variable it always comes out with one space. Any idea how I can get the two spaces to work?
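A likely cause, sketched below: the two spaces survive the assignment fine; they get collapsed by word splitting when the variable is expanded without quotes, so grep never sees them. Quoting the expansion fixes it.

```shell
#!/bin/sh
# Single quotes preserve both spaces in the assignment:
d='Apr  1'

echo "$d"    # -> Apr  1   (quoted: both spaces preserved)
echo $d      # -> Apr 1    (unquoted: word splitting collapses the spaces)

# Same rule applies to grep. Unquoted, the shell splits the pattern into
# two arguments, "Apr" and "1":
#   grep $d /var/log/messages     # wrong: pattern is just "Apr"
# Quoted, grep receives the whole string, spaces included:
#   grep "$d" /var/log/messages   # right: matches the padded syslog date
```

In short: always write "$variable" (with double quotes) wherever the variable is used, not just where it is set.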

 
