Full Discussion: Large files
Post 68106 by sehgalniraj, in UNIX for Dummies Questions & Answers, on Thursday 31st of March 2005, 05:30:07 AM
Large files

I am trying to understand the web server log file for an error that occurred on my live web site.

The web server access log is very big, so it is not possible to open it with the vi editor. I know the approximate time the error occurred, so I am interested in the log entries just before and just after that time (each line in the log file contains a time stamp).


Is there an easy way to get the 50 lines before and the 50 lines after this time stamp (from the middle of the file) and redirect them to a temporary file, so that I can examine it with any Windows editor if required?

Any help will be appreciated...

Regards
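If GNU grep is available, its context options give exactly that slice. A sketch using a tiny stand-in log; the timestamp, file names, and context width are placeholders:

```shell
# Build a small sample log to stand in for the big access log
# (file names and timestamps here are invented).
printf '%s\n' \
  '[31/Mar/2005:05:29:58] GET /a' \
  '[31/Mar/2005:05:30:07] GET /broken' \
  '[31/Mar/2005:05:30:12] GET /b' > access.log

# -B 50 prints 50 lines before each match, -A 50 prints 50 after
# (grep -C 50 does both); redirect the slice to a small file that
# any editor can open.
grep -B 50 -A 50 '31/Mar/2005:05:30:07' access.log > slice.log

cat slice.log
```

On systems without GNU grep, `grep -n 'timestamp' file` gives the line number, and `sed -n '950,1050p' file > slice.log` then extracts the surrounding range.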
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Splitting large files

Hi Unix gurus, We have a masterfile which is to be split into smaller files named masterfile00, masterfile01, masterfile03, etc. I was able to split the file using the "split" command, but the output came out as masterfileaa, masterfileab, ... Is it possible to change the default suffix? Or is there any other... (2 Replies)
Discussion started by: Rvbs
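A likely answer, assuming GNU coreutils: `split -d` switches from alphabetic to numeric suffixes. A sketch with an invented input file:

```shell
# Sample "masterfile" (hypothetical name, as in the question)
seq 1 10 > masterfile

# GNU split: -d gives numeric suffixes (00, 01, ...),
# -l 4 puts 4 lines in each piece; the trailing argument is the
# output-name prefix.
split -d -l 4 masterfile masterfile

ls masterfile0*
```

On an older split without -d, the alphabetic pieces can be renamed afterwards in a small shell loop.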

2. UNIX for Dummies Questions & Answers

large files?

How do we check whether 'large files' support is enabled on a Unix box -- HP-UX B11.11 (2 Replies)
Discussion started by: ranj@chn

3. UNIX for Dummies Questions & Answers

gzipping large (2+ gb) files.

Is it possible? I am trying to do it with gzip 1.2.4 and it comes back saying the file type is too large. Any way to compress massive things? (2 Replies)
Discussion started by: LordJezo
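One workaround when a gzip build lacks large-file support: split the file into pieces below the 2 GB limit and compress each piece. A small-scale sketch (file names and sizes are stand-ins):

```shell
# Stand-in for a huge file (hypothetical name)
head -c 100000 /dev/zero > massive.dat

# Split into pieces below the size limit, then compress each piece.
# gzip streams are concatenable, so the pieces can be rejoined later.
split -b 40000 massive.dat massive.part.
gzip massive.part.*

ls massive.part.*
```

To restore: `cat massive.part.*.gz | gunzip > massive.dat` (gunzip decompresses concatenated gzip members in sequence). Upgrading to a gzip built with large-file support is the cleaner fix where possible.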

4. UNIX for Dummies Questions & Answers

tarring large no. of files

Dear all, I have a folder containing a huge number of files; some of them were created on AUG 16, AUG 17 and AUG 18, for example. All I want to do is tar all the files created on a certain date, say AUG 18, into one tar file, in only one command line. So, how to feed all the files created on a certain... (4 Replies)
Discussion started by: marwan
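A portable approach for the date-selection question: bracket the target day with two reference timestamps and let find pick files modified between them. A sketch with invented file names and backdated timestamps:

```shell
# Build a folder with files "created" on different days
# (touch -t backdates the modification time).
mkdir -p archive_demo
touch -t 200508161200 archive_demo/a.log
touch -t 200508171200 archive_demo/b.log
touch -t 200508181200 archive_demo/c.log archive_demo/d.log

# Two reference files bracketing AUG 18
touch -t 200508180000 stamp_start
touch -t 200508190000 stamp_end

# Select files newer than the start stamp but not newer than the
# end stamp, and hand the list to tar.
find archive_demo -type f -newer stamp_start ! -newer stamp_end | xargs tar cf aug18.tar

tar tf aug18.tar
```

GNU find also understands `-newermt '2005-08-18'` directly, which avoids the reference files on modern systems.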

5. UNIX for Dummies Questions & Answers

Need to find large files

I have found the following code on this forum: ls -lh | awk '{print $5,$9}' | sort -n Its purpose is to show a list of files in a directory sorted by file size. I need to make it recursive: ls -lhR | awk '{print $5,$9}' | sort -n The problem is that there are lots of files on the... (3 Replies)
Discussion started by: jadionne
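Parsing recursive ls output is fragile; find does this directly. A sketch (GNU find assumed for -printf; directory and file names are invented):

```shell
# Make a small tree with files of different sizes
mkdir -p sizedemo/sub
head -c 500  /dev/zero > sizedemo/small.bin
head -c 9000 /dev/zero > sizedemo/sub/big.bin

# Recursive size listing without parsing ls: GNU find prints
# "<bytes> <path>" per file; sort -n orders smallest to largest.
find sizedemo -type f -printf '%s %p\n' | sort -n

# Or select only files above a threshold (here: more than 8000 bytes;
# the 'c' suffix means bytes).
find sizedemo -type f -size +8000c
```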

6. Shell Programming and Scripting

Compare 2 Large files

Hi, I want to compare 2 large files, both around 300 MB. They are text files with only one column of numbers. I wish to get the values unique to file2. I tried using diff, but it gave a "memory exhausted" error. Both files are sorted, and I am running on a Core 2 Duo 2 GHz with 1 GB of RAM. Help!! Thanks... (4 Replies)
Discussion started by: bezudar
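Since both files are already sorted, comm is the natural tool: it streams line by line, so memory use stays constant regardless of file size. A sketch with tiny stand-in files:

```shell
# Two sorted one-column files, as described in the question
printf '%s\n' 1 2 3 5 > file1
printf '%s\n' 2 3 4 5 6 > file2

# -1 suppresses lines unique to file1, -3 suppresses lines common
# to both, leaving only the lines unique to file2.
comm -13 file1 file2
```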

7. UNIX for Dummies Questions & Answers

Renaming Large Files

When I have a file with the wrong name, I normally use the cp (unknown) {new filename} command, so I have the original as well as the correctly named file, just for backup purposes. Problem: I have a file that is 24 GB with a case-sensitive filename. The file was named with upper... (3 Replies)
Discussion started by: trek88
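For a rename (as opposed to a backup copy), mv is the usual answer: within one filesystem it only rewrites the directory entry, so it is instant even for a 24 GB file and needs no extra disk space. A sketch with invented names:

```shell
# A small stand-in for the 24 GB file (hypothetical names)
head -c 1000 /dev/zero > UPPERCASE.DAT

# mv renames in place on the same filesystem; unlike cp, no data
# is duplicated, so no second copy of 24 GB is needed.
mv UPPERCASE.DAT lowercase.dat

ls -l lowercase.dat
```

The trade-off: mv leaves no backup copy, so if the original must be preserved, cp (with enough free space) or a hard link is needed instead.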

8. Shell Programming and Scripting

Divide large data files into smaller files

Hello everyone! I have 2 types of files in the following format: 1) *.fa >1234 ...some text... >2345 ...some text... >3456 ...some text... . . . . 2) *.info >1234 (7 Replies)
Discussion started by: ad23
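For record-oriented files like the *.fa example, awk can start a new output file at each ">" header. A sketch (the input format is assumed from the snippet; output names are invented):

```shell
# Minimal *.fa-style input: ">" header lines followed by data lines
printf '%s\n' '>1234' 'acgt' '>2345' 'ttgg' > demo.fa

# Open a new output file each time a ">" header is seen;
# close() keeps the number of simultaneously open files small,
# which matters when the input holds many records.
awk '/^>/ { if (out) close(out); out = "rec_" ++n ".fa" }
     { print > out }' demo.fa

ls rec_*.fa
```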

9. Solaris

How to safely copy full filesystems with large files (10Gb files)

Hello everyone. Need some help copying a filesystem. The situation is this: I have an Oracle DB mounted on /u01 and need to copy it to /u02. /u01 is 500 GB and /u02 is 300 GB. The space used on /u01 is 187 GB. This is running on Solaris 9 and both filesystems are UFS. I have tried to do it using:... (14 Replies)
Discussion started by: dragonov7
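One classic technique for copying a whole directory tree while preserving permissions and handling large files is a tar pipe. A sketch with temp directories standing in for the real /u01 and /u02 mount points (file names invented):

```shell
# Stand-ins for the source and target filesystems
mkdir -p src/dbf dst
head -c 2000 /dev/zero > src/dbf/system01.dbf

# Stream the source tree straight into the target: tar creates an
# archive on stdout in src, and a second tar extracts it in dst.
(cd src && tar cf - .) | (cd dst && tar xf -)

ls -l dst/dbf
```

For a live Oracle database the instance should be shut down (or the tablespaces put in backup mode) before copying; on Solaris UFS, ufsdump piped into ufsrestore is the OS-native alternative.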

10. UNIX for Advanced & Expert Users

Iconv on large files

Hi All, I am using iconv to convert huge files. The process is getting killed. I tried the option in the link below: https://www.unix.com/shell-programming-and-scripting/258825-iconv-large-files.html i.e. iconv -f UCS-2 -t UTF-8 < inputfile.txt > outputfile.txt However, the process still gets... (4 Replies)
Discussion started by: tostay2003
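If the redirection form is still killed, one fallback is to convert the file in bounded chunks. For a 2-byte encoding the chunk size must be even so no character is split across chunks. A sketch using UCS-2LE (an explicit byte order avoids BOM surprises; file names and chunk sizes are invented, and real chunks would be hundreds of megabytes):

```shell
# Make a small UCS-2LE file to stand in for the huge input
printf 'alpha\nbeta\ngamma\n' | iconv -f UTF-8 -t UCS-2LE > big_ucs2.txt

# Chunk on an even byte count so no 2-byte unit is split, convert
# each chunk independently, then concatenate the UTF-8 results.
split -b 12 big_ucs2.txt chunk_
for f in chunk_*; do
  iconv -f UCS-2LE -t UTF-8 "$f" > "$f.utf8"
done
cat chunk_*.utf8 > big_utf8.txt

cat big_utf8.txt
```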
VLOGGER(1)						User Contributed Perl Documentation						VLOGGER(1)

NAME
       vlogger - flexible log rotation and usage tracking in perl

SYNOPSIS
       vlogger [OPTIONS]... [LOGDIR]

DESCRIPTION
       Vlogger is designed to make webserver log rotation simple and easy to manage. It deals with VirtualHost logs automatically, so only one directive is required to manage all hosts on a webserver. Vlogger takes piped output from Apache or another webserver, splits off the first field, and writes the logs to logfiles in subdirectories. It uses a filehandle cache to avoid resource limitations. It will start a new logfile at the beginning of a new day, and optionally start new files when a certain filesize is reached. It can maintain a symlink to the most recent log for easy access. Optionally, host parsing can be disabled for use in ErrorLog directives.

       To use vlogger, you need to add a "%v" to the first part of your LogFormat:

           LogFormat "%v %h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" combined

       Then call it from a CustomLog:

           CustomLog "| /usr/sbin/vlogger -s access.log -u www-logs -g www-logs /var/log/apache" combined

OPTIONS
       Options are given in short format on the command line.

       -a     Do not autoflush files. This may improve performance but may break logfile analyzers that depend on full entries in the logs.

       -e     ErrorLog mode. In this mode, the host parsing is disabled, and the file is written out using the template under the specified LOGDIR.

       -n     Disables rotation altogether.

       -f MAXFILES
              Maximum number of filehandles to keep open. Defaults to 100. Setting this value too high may result in the system running out of file descriptors. Setting it too low may affect performance.

       -u UID Change user to UID when running as root.

       -g GID Change group to GID when running as root.

       -t TEMPLATE
              Filename template using Date::Format codes. Default is "%m%d%Y-access.log", or "%m%d%Y-error.log". When using the -r option, the default becomes "%m%d%Y-%T-access.log" or "%m%d%Y-%T-error.log".

       -s SYMLINK
              Specifies the name of a symlink to the current file.

       -r SIZE
              Rotate files when they reach SIZE. SIZE is given in bytes.

       -d CONFIG
              Use the DBI usage tracker.

       -h     Displays help.

       -v     Prints version information.

DBI USAGE TRACKER
       Vlogger can automatically keep track of per-virtualhost usage statistics in a database. DBI and the relevant drivers (eg. DBD::mysql) need to be installed for this to work. Create a table in your database to hold the data. A "mysql_create.sql" script is provided for using this feature with MySQL. Configure the dsn, user, pass and dump values in the vlogger-dbi.conf file. The "dump" parameter controls how often vlogger will dump its stats into the database (the default is 30 seconds). Copy this file to somewhere convenient on your filesystem (like /etc/apache/vlogger-dbi.conf) and start vlogger with "-d /etc/apache/vlogger-dbi.conf". You might want to use this feature to easily bill customers on a daily/weekly/monthly basis for bandwidth usage.

SEE ALSO
       cronolog(1), httplog(1)

BUGS
       None, yet.

AUTHORS
       Steve J. Kondik <shade@chemlab.org>

       WWW: http://n0rp.chemlab.org/vlogger

perl v5.8.6                          2005-03-18                          VLOGGER(1)
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.