Delete first 100 lines from a BIG File


 
# 8  
Old 06-17-2012
Hi, Scrutinizer.
Quote:
Originally Posted by Scrutinizer
It seems to me any solution should always make use of a temporary intermediate file, for safety reasons. If we read the whole file into memory and then write it back to the same file, we run the risk of losing the original in case of a power failure during the write-back phase.

With a temporary file there is only an mv involved, which is just a rename if the temporary file is in the same directory on the same file system, so a temporary file in the same directory (rather than /tmp, for example) may be preferable. If we use /tmp for the intermediate file, a temporary rename of the original to .bak until the move from /tmp completes may be required, and safest will probably be to keep the .bak until the user deletes it.
When was the last time you heard of that happening to anyone? Most important *nix boxes will be backed up by a UPS. I think a file is far more likely to fall prey to plain old user error.

However, I sometimes allow myself the luxury of following:
Quote:
"Unix was not designed to stop its users from doing stupid things, as that would also stop them from doing clever things." - Doug Gwyn
-- Unix philosophy - Wikipedia

Best wishes ... cheers, drl
# 9  
Old 06-17-2012
Well, that is probably because nobody does it that way.

Backed up by a UPS, eh? I have seen systems go down, or heard of systems going down, because...
  • Both of two power cords were plugged into the same phase of the UPS...
  • Two of three power cords were plugged into the same phase of the UPS, and when that group failed the remaining power supply could not handle the load
  • The second power supply had been broken, and so had the monitoring system, so when the other one failed...
  • Someone took out both power cords.
  • Someone knocked out both power cords.
  • Hard disks failed and not all logical volumes were mirrored.
  • Controllers failed and the other channel was on the same controller.
  • Memory died...
  • A power failure hit during maintenance of the UPS.
  • The UPS ran out and the diesel generator ran out of diesel.
  • The UPS ran out and the power relay that was supposed to switch the generator on did not function, because the battery of the tiny UPS that served only that relay was long overdue and had gone dead...
  • An uneven current draw on one phase brought down the UPS, and it took everything with it...
  • A high-availability cluster failed over, for whatever reason...
  • Systems with officially no SPOFs had SPOFs.
  • etcetera
  • and so forth
Indirectly, most of this is human error of course, for example because manuals or procedures were ignored, but the reality is... that that is reality, since most teams do not consist entirely of the cream of the crop, and Murphy is having a ball in data centers...

Last edited by Scrutinizer; 06-17-2012 at 08:01 PM..
# 10  
Old 06-18-2012
@unohu
Please explain why you do not want to use a workfile, and the context in which this logfile is created.

If the log file is held open by a process, there is no method to shorten the file without using a workfile (and even that method risks losing new log entries). However, it is possible to retain the same inode (is that your issue?).

Check whether the file is open with the fuser command.
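For example (the path here is hypothetical):
Code:
# List the PIDs of every process that has the file open
fuser /path/logfile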
# 11  
Old 06-18-2012
If I create a new file with the same name, it remains a blank file.
# 12  
Old 06-19-2012
@unoho
Sounds like your logfile is held open by an application.
The usual technique when a logfile is open (as with some of the system logs) is to copy the file and then null it. This retains the file permissions and usually does not upset the application, but be aware that there is a small time window in which you might lose a message.

Code:
# Copy the logfile
cp -p /path/logfile /path/newname
# Null the logfile
> /path/logfile

This technique does not work for data files.
It is, however, much better to do logfile maintenance while the application is shut down, but in the case of some system processes this may not be possible.

If your system has the logrotate command, check it out.
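A minimal sketch of a logrotate configuration for this case (the path is hypothetical); the copytruncate directive implements exactly the copy-then-null technique above:
Code:
/path/logfile {
    # rotate once a day, keeping seven old logs, compressed
    daily
    rotate 7
    compress
    # copy the log, then truncate it in place: the inode is preserved,
    # but the same small message-loss window applies
    copytruncate
}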
# 13  
Old 06-19-2012
Hi, methyl.

I wonder if unoho meant that he did something like:
Code:
sed 'mumble' infile > infile

which would zero infile before sed ever read it, since the shell truncates the output file first. Hard to guess, though, if he's not forthcoming and specific about what he did and what error message or condition he got ... cheers, drl
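A quick demonstration of that pitfall, using a throwaway file:
Code:
printf 'one\ntwo\nthree\n' > infile
# the shell truncates infile before sed reads a single byte...
sed '1d' infile > infile
# ...so the result is an empty file:
wc -c infile       # prints: 0 infile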
# 14  
Old 06-20-2012
Hi methyl
Yes, the log file is held open by another application. If I change the inode of the log file while taking a backup, the application cannot detect the new inode, and as a result I get a blank file. The easiest solution may be to execute a script (as you suggested) when the application is down.

The logrotate command is not available.

@drl I tried two or three possibilities, all more or less the same as the one given below.
My logfile is test.log, which has (say) 100 lines, and the backup file is back.log. The problem is that test.log remains blank even while the application is active and should be writing log statements.
Code:
cp test.log back.log               # keep a backup copy
sed '1,100d' test.log > temp.log   # drop the first 100 lines
mv temp.log test.log               # new inode: the application keeps writing to the old one
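A variation that keeps the original inode, so the application's open file handle stays valid, is to truncate and rewrite test.log in place through a redirect rather than replacing it with mv. This is only a sketch, and it carries the same small message-loss window methyl described:
Code:
cp -p test.log back.log            # backup copy; sed will read from this
sed '1,100d' back.log > test.log   # the redirect truncates test.log in place: same inode
Any lines the application writes between the cp and the redirect are lost, so this is still best done at a quiet moment.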
