Okay, absolute newbie here...
I'm on a Mac trying to split an almost 2 Gig log file on a Unix box into manageable chunks for my web-based log analysis tool.
What do I need to do, what programs do I need to do it?
All and any help appreciated/needed :-)
Cheers (8 Replies)
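A common first answer here is the stock split utility, which ships on both macOS and other Unix systems. A minimal sketch, demonstrated on a tiny stand-in file (for the real 2 GB log you would point split at the actual file and raise the line count):

```shell
# Stand-in for the big log; "sample.log" and the chunk size are placeholders.
seq 1 10 > sample.log
# 3 lines per chunk here; something like -l 500000 suits a 2 GB log.
split -l 3 sample.log chunk_
ls chunk_*     # chunk_aa, chunk_ab, chunk_ac, chunk_ad
```

The chunks concatenate back to the original with `cat chunk_* > sample.log`, so nothing is lost by splitting.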
I have a very large (150 megs) IRC log file from 2000-2001 which I want to cut down to individual daily log files. I have a very basic knowledge of the cat, sed and grep commands. The log file is time stamped and each day in the large log file begins with a "Session Start" string like so:
... (11 Replies)
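One way to attack this, sketched with awk on a made-up two-day sample; the exact wording and position of the "Session Start" marker in the real log are the assumptions that matter:

```shell
# Tiny invented stand-in log; each "Session Start" line opens a new day.
cat > irc.log <<'EOF'
Session Start: Mon Jan 01 2001
[00:01] <nick> hello
Session Start: Tue Jan 02 2001
[00:02] <nick> hi again
EOF
# Start a new output file at every "Session Start" line, then copy every line into
# whichever file is currently open.
awk '/^Session Start/ { close(out); out = sprintf("day_%03d.log", ++n) }
     { print > out }' irc.log
```

This produces day_001.log, day_002.log, and so on, one per session; close() keeps awk from running out of file descriptors on a long log.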
Hi,
I'm trying to accomplish the following and would like some suggestions or possible bash script examples that may work
I have a directory containing log files that are periodically dumped by a crontab'd script and rotated through 4 generations. There will be a time stamp that is... (4 Replies)
Hi,
I've been searching for a quick way to do this with sed, but to no avail.
I have a file containing a long series of (windows) file paths that are separated by the pattern '@'. I would like to extract each file path so that I can later assign a variable to each path.
Here is the file:... (2 Replies)
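Since the file contents after "Here is the file:" are cut off, this is only a hedged sketch on an invented two-path sample: tr turns the '@' separators into newlines, and a bash read -a puts each path into its own array element for later use.

```shell
# Invented sample; the real paths and separator layout may differ.
printf 'C:\\logs\\one.txt@C:\\logs\\two.txt@D:\\data\\three.log' > paths.txt
tr '@' '\n' < paths.txt                # one Windows path per line
IFS='@' read -r -a paths < paths.txt   # bash array: paths[0], paths[1], ...
echo "${paths[1]}"                     # C:\logs\two.txt
```

sed is not really needed here; changing the delimiter is exactly what tr is for.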
Hi there,
I have a file named 'x20080613_x20100106.pwr1.gc' and I want to isolate the part 'x20080613_x20100106', but with the following line I isolate the part '.pwr1.gc' instead:
`awk '$0=substr($0, length($0)-7)' $temp`
How can I reverse that?
thank you! (3 Replies)
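Assuming the goal is simply the complement of the original substr call, either of these keeps the leading part; the filename is taken from the post:

```shell
f='x20080613_x20100106.pwr1.gc'
echo "${f%%.*}"                                            # shell: strip from the first '.'
echo "$f" | awk '{ print substr($0, 1, length($0)-8) }'    # awk: drop the last 8 chars
# both print: x20080613_x20100106
```

The `%%.*` form is the more robust of the two, since it does not depend on the suffix always being exactly 8 characters long.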
Can someone please help me find the URLs in the lines of a log file and write them all to a new file?
For e.g - Log file has similar entries,
39.155.67.5 - - "GET /abc/login?service=http://161.120.36.39/CORPHR/TMA2007/default.asp HTTP/1.1" 401 3218
54.155.63.9 - - "GET... (2 Replies)
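A hedged sketch with grep -o, which prints only the matching part of each line; the pattern below is a rough URL match, not a full RFC-grade one, and the sample line is taken from the post:

```shell
cat > access.log <<'EOF'
39.155.67.5 - - "GET /abc/login?service=http://161.120.36.39/CORPHR/TMA2007/default.asp HTTP/1.1" 401 3218
EOF
# Pull every http:// URL (up to the next space or quote) into urls.txt.
grep -o 'http://[^" ]*' access.log > urls.txt
cat urls.txt    # http://161.120.36.39/CORPHR/TMA2007/default.asp
```

Add `https://` as an alternation (or use `-E 'https?://[^" ]*'`) if the log mixes schemes.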
I have a file like the one below. The good pages must meet 3 conditions:
Pages containing only a page total must have 50 lines.
Pages containing only a customer total must have 53 lines.
The last page of the Customer Total should be the last page.
How can I accomplish separating good... (1 Reply)
Delete log file content older than 30 days and append the latest log file date to the respective logs
I want to write a shell script that deletes all log file content older than 30 days and appends the latest log file date to the respective logs.
This is my script
cd... (2 Replies)
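The script itself is cut off at "cd...", so this is only a hedged sketch of the usual find-based approach, assuming whole files (rather than lines within a file) should go; the directory and name pattern are placeholders:

```shell
logdir=/var/log/myapp     # placeholder path for the real log directory
# List *.log files last modified more than 30 days ago;
# swap -print for -delete once the listing looks right.
find "$logdir" -name '*.log' -type f -mtime +30 -print
```

Reviewing with -print before switching to -delete is cheap insurance against a mistyped path or pattern.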
Hi,
I need a suggestion for an issue in UNIX file.
I have a log file on my system that data is appended to every day, and as a consequence the file is growing heavily.
Now I need a way to split this file on a daily basis and remove files more than 15 days old.
Request you to... (3 Replies)
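One widely used shape for this, sketched with placeholder paths: snapshot the live log under a dated name each day (e.g. from cron), truncate it in place, then prune snapshots older than 15 days.

```shell
log=/var/log/app.log                          # placeholder for the real log
cp "$log" "$log.$(date +%F)" && : > "$log"    # dated snapshot, then empty the live file
find "$(dirname "$log")" -name "$(basename "$log").*" -mtime +15 -delete
```

Note that truncating (`: > "$log"`) rather than deleting lets the writing process keep its open file descriptor; if the writer reopens the file itself, logrotate's copytruncate mode does the same job with less script to maintain.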
I have a 2-part question on how to do this in Unix scripting using ksh or csh.
I have a file described below:
The 1st record has 2 fields on it;
every other record has 22 fields on it.
Example
ABC, email address
Detail 1
Detail 2
Detail 3
.
.
.
1st question is... (4 Replies)
Discussion started by: jclanc8
LEARN ABOUT DEBIAN
igawk
IGAWK(1)                        Utility Commands                        IGAWK(1)
NAME
igawk - gawk with include files
SYNOPSIS
igawk [ all gawk options ] -f program-file [ -- ] file ...
igawk [ all gawk options ] [ -- ] program-text file ...
DESCRIPTION
Igawk is a simple shell script that adds the ability to have ``include files'' to gawk(1).
AWK programs for igawk are the same as for gawk, except that, in addition, you may have lines like
@include getopt.awk
in your program to include the file getopt.awk from either the current directory or one of the other directories in the search path.
OPTIONS
See gawk(1) for a full description of the AWK language and the options that gawk supports.
EXAMPLES
cat << EOF > test.awk
@include getopt.awk
BEGIN {
    while (getopt(ARGC, ARGV, "am:q") != -1)
        ...
}
EOF
igawk -f test.awk
SEE ALSO gawk(1)
Effective AWK Programming, Edition 1.0, published by the Free Software Foundation, 1995.
AUTHOR
Arnold Robbins (arnold@skeeve.com).
Free Software Foundation Nov 3 1999 IGAWK(1)