How to view a big file (143M big)
Post 25139 by google in UNIX for Dummies Questions & Answers, Thursday 25 July 2002, 08:45 AM
Alternative to head or tail:
If you don't like viewing 200 lines at a time with head or tail, you can download an editor for your Windows PC (EditPlus Pro at http://www.editplus.com/ ). Use its built-in FTP feature to grab the log file and view it there. EditPlus handles huge files like this well.
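If you would rather stay on the UNIX box instead of pulling the file down, the standard tools already slice a big file without loading all of it. A minimal sketch (the file name and line numbers are only placeholders):

    head -n 200 bigfile.log            # first 200 lines
    tail -n 200 bigfile.log            # last 200 lines
    sed -n '5000,5200p' bigfile.log    # an arbitrary slice from the middle
    less bigfile.log                   # page through interactively; /pattern searches

less reads only as much of a regular file as it needs, so even a 143M log opens right away.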
 

8 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

big file processeing

hi, i have a very big file that holding data, how could i pick line by line from this file. the following process can illustrate better: file ------------------- 123444444 | 122314567 |-----------data 146689000 | c=123444444 ---------- c is variable process c ... (3 Replies)
Discussion started by: omran
3 Replies
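The line-by-line processing asked about in that first thread is usually done with a read loop rather than repeated head/tail calls. A sketch, where datafile and process_c are placeholders for the real file and the real work:

    # read the file one line at a time; each line lands in the variable c
    while IFS= read -r c
    do
        process_c "$c"      # placeholder for whatever is done with each value
    done < datafile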

2. Solaris

wtmpx file is too big

Hi, I am using Sun Solaris 5.9 OS. I have found a file called wtmpx having a size of 5.0 GB. I want to clear this file using :>/var/adm/wtmpx. My query is, would it cause any problem to the running live system. Could anyone suggest the best method to clear the file without causing problem to... (6 Replies)
Discussion started by: Vijayakumarpc
6 Replies
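Truncating wtmpx in place, as that poster suggests, is generally considered safe on a running system because the file itself is never removed, so processes that have it open keep a valid descriptor. A sketch (take a copy first if the login history matters):

    cp /var/adm/wtmpx /var/adm/wtmpx.save   # optional backup
    : > /var/adm/wtmpx                      # truncate to zero length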

3. Filesystems, Disks and Memory

Write Speed into a big file (in Gb's)

If a file size increases in Linux/UNIX to say in GB's then will there be a decrease in write speed. I mean will it take more time to write to a large file then to a small one?? Please clarify? Thanks in advance (2 Replies)
Discussion started by: anilgurwara
2 Replies

4. UNIX for Dummies Questions & Answers

How big is too big a config.log file?

I have a 5000 line config.log file with several "maybe" errors. Any reccomendations on finding solvable problems? (2 Replies)
Discussion started by: NeedLotsofHelp
2 Replies
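For a 5000-line config.log, narrowing it down with grep before reading usually helps; configure records each test's exit status as it runs. The patterns below are only a starting point, not an official recipe:

    grep -n -i 'error' config.log     # candidate problem lines, with line numbers
    grep -n '\$? = 1' config.log      # tests that configure logged as exiting non-zero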

5. UNIX for Advanced & Expert Users

Split a big file into two others files

Hello, i have a very big file that has more then 80 MBytes (100MBytes). So with my CVS Application I cannot commit this file (too Big) because it must have < 80 MBytes. How can I split this file into two others files, i think the AIX Unix command : split -b can do that, buit how is the right... (2 Replies)
Discussion started by: steiner
2 Replies
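split -b does exactly that. A sketch with an arbitrary 50 MB piece size (adjust to stay under the 80 MB limit; the names are examples):

    split -b 50m bigfile bigfile.part.    # produces bigfile.part.aa, bigfile.part.ab, ...
    cat bigfile.part.* > bigfile          # reassembles the original later

If a given split does not accept the m suffix, the count can be given in plain bytes, e.g. split -b 52428800.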

6. Shell Programming and Scripting

Delete rows from big file

Hi all, I have a big file (about 6 millions rows) and I have to delete same occurrences, stored in a small file (about 9000 rews). I have tried this: while read line do grep -v $line big_file > ok_file.tmp mv ok_file.tmp big_file done < small_file It works, but is very slow. How... (2 Replies)
Discussion started by: Tibbeche
2 Replies
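The loop quoted there rescans the 6-million-row file once per entry in the small file; a single pass with grep's -f option is the usual fix. A sketch, assuming the 9000 entries are meant as fixed strings (hence -F):

    # drop every line of big_file that contains one of the strings listed in small_file
    grep -v -F -f small_file big_file > ok_file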

7. Emergency UNIX and Linux Support

Getting VALUE from Big XML File -- That's All

We got data that was supposed to be CSV, but was sent in a huge XML file. I've downloaded xmlstarlet, but I'm darned if I can get it to operate the "sel" feature to look down a path and get any sort of value. I see pieces of what should be paths, but they seem to have extraneous characters, and... (7 Replies)
Discussion started by: gmark99
7 Replies
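With xmlstarlet, sel builds a small XSLT query: -m matches a node set and -v prints a value under each match. This is a sketch only, since the element names here are invented and the real paths have to come from the actual file:

    xmlstarlet el data.xml                                          # list the element paths the file really uses
    xmlstarlet sel -t -m '/catalog/item' -v 'price' -n data.xml     # print the price under each /catalog/item

If the document declares a default namespace, the paths need a namespace prefix (see xmlstarlet's -N option) before they will match anything.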

8. UNIX for Beginners Questions & Answers

How to convert CR to LF in a big file?

Hello Friends, I have a big file that is transferred to my UNIX system and it seems it has CR as the line delimiter When I run file <filename> <filename>: ASCII text, with CR line terminators How do I convert the file to one with LF as terminators so that my code that runs on UNIX can... (3 Replies)
Discussion started by: mehimadri12
3 Replies
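Since file(1) reports "ASCII text, with CR line terminators" (old Mac-style line endings), translating every carriage return into a line feed is enough. A sketch with placeholder file names:

    tr '\r' '\n' < infile > outfile     # CR-only endings become LF

If the file had instead been DOS-style (CRLF), deleting the CRs would be the fix: tr -d '\r' < infile > outfile.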
tail(1)

NAME
    tail - deliver the last part of a file

SYNOPSIS
    /usr/bin/tail [+-s number [lbcr]] [file]
    /usr/bin/tail [-lbcr] [file]
    /usr/bin/tail [+-number [lbcf]] [file]
    /usr/bin/tail [-lbcf] [file]

    /usr/xpg4/bin/tail [-f | -r] [-c number | -n number] [file]
    /usr/xpg4/bin/tail [+-number [l | b | c] [f]] [file]
    /usr/xpg4/bin/tail [+-number [l] [f | r]] [file]

DESCRIPTION
    The tail utility copies the named file to the standard output beginning at a
    designated place. If no file is named, the standard input is used.

    Copying begins at a point in the file indicated by the -c number, -n number,
    or +-number options (if +number is specified, copying begins at distance
    number from the beginning of the input; if -number is specified, from the
    end; if number is null, the value 10 is assumed). number is counted in units
    of lines or bytes according to the -c or -n options, or in lines, blocks, or
    bytes according to the appended option l, b, or c. When no units are
    specified, counting is by lines.

OPTIONS
    The following options are supported for both /usr/bin/tail and
    /usr/xpg4/bin/tail. The -r and -f options are mutually exclusive; if both
    are specified on the command line, the -f option is ignored.

    -b      Units of blocks.

    -c      Units of bytes.

    -f      Follow. If the input file is not a pipe, the program does not
            terminate after the last line of the input file has been copied,
            but enters an endless loop in which it sleeps for a second and then
            attempts to read and copy further records from the input file. Thus
            it can be used to monitor the growth of a file that is being
            written by some other process.

    -l      Units of lines.

    -r      Reverse. Copies lines from the specified starting point in the file
            in reverse order. The default for -r is to print the entire file in
            reverse order.

  /usr/xpg4/bin/tail
    The following options are supported for /usr/xpg4/bin/tail only:

    -c number
            The number option-argument must be a decimal integer whose sign
            affects the location in the file, measured in bytes, at which
            copying begins:

            +     Copying starts relative to the beginning of the file.
            -     Copying starts relative to the end of the file.
            none  Copying starts relative to the end of the file.

            The origin for counting is 1; that is, -c +1 represents the first
            byte of the file, -c -1 the last.

    -n number
            Equivalent to -c number, except the starting location in the file
            is measured in lines instead of bytes. The origin for counting is
            1; that is, -n +1 represents the first line of the file, -n -1 the
            last.

OPERANDS
    The following operand is supported:

    file    A path name of an input file. If no file operands are specified,
            the standard input is used.

USAGE
    See largefile(5) for the description of the behavior of tail when
    encountering files greater than or equal to 2 Gbyte (2**31 bytes).

EXAMPLES
    Example 1: Using the tail command

    The following command prints the last ten lines of the file fred, followed
    by any lines that are appended to fred between the time tail is initiated
    and killed:

        example% tail -f fred

    The next command prints the last 15 bytes of the file fred, followed by any
    lines that are appended to fred between the time tail is initiated and
    killed:

        example% tail -15cf fred

ENVIRONMENT VARIABLES
    See environ(5) for descriptions of the following environment variables that
    affect the execution of tail: LANG, LC_ALL, LC_CTYPE, LC_MESSAGES, and
    NLSPATH.

EXIT STATUS
    The following exit values are returned:

    0       Successful completion.
    >0      An error occurred.

ATTRIBUTES
    See attributes(5) for descriptions of the following attributes:

    /usr/bin/tail

    +-----------------------------+-----------------------------+
    |       ATTRIBUTE TYPE        |       ATTRIBUTE VALUE       |
    +-----------------------------+-----------------------------+
    | Availability                | SUNWcsu                     |
    +-----------------------------+-----------------------------+
    | CSI                         | Enabled                     |
    +-----------------------------+-----------------------------+

    /usr/xpg4/bin/tail

    +-----------------------------+-----------------------------+
    |       ATTRIBUTE TYPE        |       ATTRIBUTE VALUE       |
    +-----------------------------+-----------------------------+
    | Availability                | SUNWxcu4                    |
    +-----------------------------+-----------------------------+
    | CSI                         | Enabled                     |
    +-----------------------------+-----------------------------+
    | Interface Stability         | Standard                    |
    +-----------------------------+-----------------------------+

SEE ALSO
    cat(1), head(1), more(1), pg(1), dd(1M), attributes(5), environ(5),
    largefile(5), standards(5)

NOTES
    Piped tails relative to the end of the file are stored in a buffer, and
    thus are limited in length. Various kinds of anomalous behavior can happen
    with character special files.

                                 13 Jul 2005                          tail(1)