On my Solaris 10 box, there are two log files which are being written to by some application. They are huge files of 3 GB and 5 GB. I tried to nullify them, but the size shows as zero and then after a few seconds it shows the original size again. "ls -ltr" shows me a big size, but "du -sh file" shows it in KB. We do not want to stop the application to clear these files. Is there any way to clear them without stopping the application? The outputs below should describe my problem more clearly.
The commands:
deallocate all blocks allocated to those files at that time, but they don't close the file descriptors and do not reset the file offset that determines the position in the file where the next data written will be placed by the processes that are writing to those files.
The next time the process writes something to one of those files, it will write it to the spot in the file just after the last place it wrote into that file. That will not allocate any disk blocks for the bytes in the file you previously deallocated, so what you end up with is known as a holey (sparse) file, which contains unallocated blocks that have never been written. If you try to read data from those blocks (such as by running cat soa.out), those unallocated blocks will appear as though null bytes had been written there.
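This offset behaviour can be reproduced from any POSIX shell. A minimal sketch, using a made-up path for the demo:

```shell
# A writer holding a plain O_WRONLY descriptor keeps its own file offset,
# so truncating the file out from under it leaves a hole at the start.
exec 3> /tmp/hole_demo.log     # open without O_APPEND; this is our "application"
printf 'AAAAAAAAAA' >&3        # the writer's offset is now 10
: > /tmp/hole_demo.log         # truncate the file in place, as with >logfile
printf 'B' >&3                 # this write lands at offset 10, not 0
exec 3>&-
ls -l /tmp/hole_demo.log       # logical size: 11 bytes
du -k /tmp/hole_demo.log       # blocks actually allocated: may be less
rm -f /tmp/hole_demo.log
```

Reading the file back with cat would show ten null bytes before the 'B', which is exactly the "hole" described above.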
If you change the program(s) that are writing those log files to add the O_APPEND flag to the oflag argument to the call to open() that opens the log files, it will reset the position in the log file where it writes data to the current end of file every time it writes to the log file. So, if you clear the log file using >logfile, the next write to the log file will be at the start of the file instead of leaving a huge hole at the start of the file.
Thanks Don for explaining it so well. It will be a long road for me to get this change made in the program that is writing this file.
Is there something I can do from the OS side now?
Is there any better way from the OS side which I should adopt going forward? Because this will fill up again, and then the application guys will come to the System Admin again.
You have cleared the files' content. The size reported by ls is irrelevant except for processes reading the files. As far as disk usage is concerned, the space has been recovered and the disk will only fill up with the new data written.
jlliagre, when I do "cat soa.out", it was taking a very long time to give output, so I did Ctrl+C and came out. This made me think that the data is still so big here, even after nullifying it. Am I wrong here?
In future, what will be the best way to nullify it again without taking application downtime?
If you're going to use cat filename on a huge file, cat will read every byte of it. If you look at du filename, the number of disk blocks used to store the contents of the file may be small (as I explained before).
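The ls vs. du difference is easy to see with a file that is mostly hole. A sketch, using dd to create a sparse file at a made-up path:

```shell
# Create a 1 MiB file containing a single real byte at the very end;
# everything before it is an unallocated hole.
dd if=/dev/zero of=/tmp/sparse_demo bs=1 count=1 seek=1048575 2>/dev/null
ls -l /tmp/sparse_demo    # logical size: 1048576 bytes
du -k /tmp/sparse_demo    # KB actually allocated on disk: far fewer
rm -f /tmp/sparse_demo
```

cat on such a file still reads the full logical size, null bytes included, which is why it takes so long even though almost no disk space is in use.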
You haven't told us anything about what this application does nor why it is writing gigabytes into a file that the people using the application don't want to see. If you have a process that your users say you have to run continuously and it writes gigabytes of logs that no one wants to see, you can choose one of several options (including, but not limited to):
Get the people who wrote the application to change it.
Restart it regularly (and rotate or delete the log files while it is stopped).
Buy bigger disks to hold all of the data that nobody wants.
Patch the kernel to set the O_APPEND flag in the kernel table entry for the file descriptor involved and truncate it the way you're doing it now.
Refuse to run an application that fills up your filesystems with huge amounts of unwanted data until your users provide a version of the application that works correctly.
Try changing the log file to be a symlink pointing to /dev/null the next time you reboot your system before you start the application. (If the application removes or rotates log files when it restarts, this won't work.)
Also, many start/stop scripts use #!/bin/sh, which on Solaris 10 is the classic Bourne shell. If those log files are created by redirecting stdout with the >> operator, then normally the file can be truncated properly, but not when using the classic Bourne shell (its >> does not open the file with O_APPEND). Try running the startup script with #!/usr/xpg4/bin/sh instead...
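A quick way to see the difference: under a POSIX shell (such as /usr/xpg4/bin/sh on Solaris 10, or any modern sh), >> opens the file with O_APPEND, so truncating the log in place behaves as desired. A sketch with a throwaway background writer; the path and timings are illustrative only:

```shell
# A background "application" appending to its log via >> under a POSIX shell.
( while :; do echo "log line"; sleep 1; done ) >> /tmp/append_demo.log &
writer=$!
sleep 2
: > /tmp/append_demo.log   # truncate in place while the writer keeps running
sleep 2
ls -l /tmp/append_demo.log # stays small: O_APPEND writes land at the new EOF
kill "$writer"
rm -f /tmp/append_demo.log
```

Under the classic Bourne shell, the same truncation would leave the writer's offset where it was, and the next write would re-create a hole at the start of the file.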