Large files


 
# 1  
Old 03-31-2005
Large files

I am trying to examine the webserver log file for an error which occurred on my live web site.

The webserver access log is too big to open in the vi editor. I know the approximate time the error occurred, so I am interested in the log entries just before and just after that time (each line in the log file contains a timestamp).


Is there an easy way to get the 50 lines before and 50 lines after this timestamp (from the middle of the file) and redirect them to a temporary file, so that I can examine it with any Windows editor if required?

Any help will be appreciated...

Regards
# 2  
Old 03-31-2005
This is still going to be slow, but nevertheless, if you're using GNU grep (please state if you're not; try "grep --version" — if this displays an error, you're not using GNU grep), do something like the following....

Code:
grep -C 50 "my_timestamp_here" logfile > matched_lines

As I say, if you're not using GNU grep let us know which OS/shell you're using and we'll assist further.

Cheers
ZB
# 3  
Old 03-31-2005
Thanks for the reply..

I am not using GNU grep. The server I am using: SunOS bart 5.8 Generic_108528-21 sun4u sparc SUNW,Ultra-4

Shell is : ksh

Regards
# 4  
Old 03-31-2005
There is no doubt a far quicker way of doing this (we're going through the file twice, so it'll take a while), but something like
Code:
#!/bin/ksh

# Collect the line numbers of every line containing the timestamp
matches=`grep -n "TIMESTAMP_HERE" logfile | cut -f1 -d:`
first_match=`echo "$matches" | awk 'NR==1{print $0}'`
last_match=`echo "$matches" | awk '{c=$0}END{print c}'`

# Clamp the start so sed never gets an address below line 1
start=$((first_match - 50))
[ "$start" -lt 1 ] && start=1

sed -n "${start},$((last_match + 50))p" logfile

Should do it - just redirect the output of this script to your temporary file.

Tested on SunOS sunbox 5.9 Generic_112234-07 i86pc i386 i86pc, ksh
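If scanning the big file twice turns out to be too slow, a one-pass awk alternative keeps a rolling buffer of the previous lines and prints the context around each match. This is only a sketch — the demo below uses a context width of 3 and a made-up "ERROR" marker so it's easy to verify by eye; for the real log you'd use N=50, your timestamp as ts, and your logfile as the input.

```shell
#!/bin/sh
# Build a tiny demo log: lines 1..20, with line 10 marked as the "match"
seq 1 20 | sed 's/^10$/10 ERROR/' > /tmp/demo.log

# Print N lines before and N lines after each line containing ts,
# in a single pass over the file
awk -v N=3 -v ts="ERROR" '
index($0, ts) {
    # flush up to N buffered lines of leading context, skipping any
    # lines already printed by an earlier, overlapping match
    for (i = NR - N; i < NR; i++)
        if (i > 0 && i > last) print buf[i % N]
    after = N + 1                # this line plus N trailing lines
}
after > 0 { print; last = NR; after-- }
{ buf[NR % N] = $0 }             # rolling buffer of the last N lines
' /tmp/demo.log
```

This prints lines 7 through 13 (the match plus three lines of context either side). It never holds more than N lines in memory, so it should be fine on a very large access log, and it uses only POSIX awk features, so it ought to work with Solaris /usr/xpg4/bin/awk or nawk as well.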

Cheers
ZB

Last edited by zazzybob; 03-31-2005 at 07:59 AM.. Reason: Removed debug code
# 5  
Old 03-31-2005
Looks tricky,
Thanks a lot for your help
 
