Redirecting log files to null writing junk into log files


 
# 1  
Old 11-28-2017

I have log files created by the command below:

Code:
 exec <processname> >$logfile

When the file reaches a certain size, I truncate it while the process is still running, like this:
Code:
 >$logfile

I do this manually, but afterwards the process writes some junk into the log file for the first few lines. How can I avoid that?
# 2  
Old 11-28-2017
I'm not sure I quite understand. Can you show the code block this is in, plus some sample input and the output you get?

When you exec a process, it replaces the current shell, so (depending on your OS and version) the redirect you have might be ignored. Can you post the output from uname -a to show your OS, and the output from ps to show your shell?



Thanks, in advance,
Robin
# 3  
Old 11-28-2017
Output of uname -a:
Code:
HP-UX phxlevht B.11.31 U ia64 4230347391 unlimited-user license

Output of ps:
Code:
   PID TTY       TIME COMMAND
  1402 pts/11    0:00 ps
   347 pts/11    0:00 ksh

My script is:
Code:
exec sedec >logfilepath

The process is still running, but the log has grown quite large, so I manually truncated it as below:

Code:
>logfilepath

After that, the process prints some garbage and then continues with proper text. I don't think it is the process itself writing the garbage.
# 4  
Old 11-28-2017
Oh, now I see. Sorry, I was confused by your phrase "redirecting to null". You have truncated the file separately. The thing to bear in mind is that the file is still open in your running process, which can lead to odd effects even though the file appears to become zero bytes.

What happens to the standard error from exec sedec >$logfile? By default, that is sent to the screen, or to wherever standard output pointed when the process began, before your redirect takes effect. Could this be the output of your process complaining that the file has been reset and/or replaced?


What else do you have running from the same session? I would expect that you can issue >otherfile without a problem.
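To illustrate the standard-error point, here is a minimal sketch. A stand-in function (not the real sedec, which I don't have) writes to both streams; with '>' alone only standard output is captured, and standard error still escapes to the session:

```shell
#!/bin/sh
# Sketch: with '>' alone only stdout is captured; stderr escapes.

log=/tmp/stderr_demo.$$.log

demo() {
  echo "normal output"                # goes to stdout
  echo "complaint on stderr" >&2      # goes to stderr
}

demo >"$log"          # the complaint appears on the terminal, not in the log
demo >"$log" 2>&1     # both streams captured in the log

grep -c "complaint" "$log"    # prints 1 after the second call
rm -f "$log"
```

So if sedec ever complains about its output file, redirecting only stdout means those complaints land in the session, not the log.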




I hope that this helps,
Robin
# 5  
Old 11-28-2017
WHAT is the garbage? And what is the correct text that follows it?
If I understand you correctly, the truncation works, and the old log file contents are lost.
Does the process keep the old file open? Check with e.g. lsof. Try something like echo "Marker 1" > $logfile.
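The checks above can be sketched like this. It runs against a throwaway file so it is safe to try anywhere; substitute the real log path in practice (lsof may need installing on HP-UX, fuser is usually there):

```shell
#!/bin/sh
# Sketch of the suggested checks, run against a throwaway file.

logfile=/tmp/marker_demo.$$.log

# simulate a process that holds the log open while we poke at it
( exec >"$logfile"; sleep 2 ) &
holder=$!

# which processes still hold the log open? (lsof, if installed, works too)
fuser "$logfile" 2>/dev/null

# truncate, write a marker, and inspect the first bytes of the file
: >"$logfile"
echo "Marker 1" >"$logfile"
od -c "$logfile" | head -2     # NUL bytes before the marker would show as \0

wait "$holder"
rm -f "$logfile"
```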
# 6  
Old 11-28-2017
The garbage is junk characters like
Code:
 @@@@@@@@@@@@@

By "correct" I mean that after the junk it starts printing the logs from the process normally again.

And yes, I think the process keeps running, so the file is still open.
# 7  
Old 11-28-2017
Quote:
Originally Posted by greenworld123
the process prints some garbage and then with proper text. it is not the process writing garbage i think.
I think you have already gotten very good advice about how to solve it, but it might help you to understand what is going on:

When a process "opens" a file it calls an OS function (open(), or the fopen() library routine built on top of it), and part of this "opening" is that the OS sets up an environment through which the process can access the file. Part of this is finding out how big (= how many bytes) the file is. The process also gets a "place" where it "stands" right now - its file offset. This "place" can be moved forward, backwards, etc.

Say, a program opens a file and is told that the file is 10 bytes long. Right now it "stands" on byte 1 and can read it, which moves the place it stands forward to byte 2, etc. It can also do things like "go forward 3 bytes and then read (or write) 2 bytes from there". It can also add to the file, which increases the size so that it can now position its place to byte 11. It can even position its place beyond the current end of the file: reading from there just signals end-of-file, but writing there is allowed - the OS fills the gap between the old end and the write position with zero (NUL) bytes, creating a so-called "sparse" file. Keep this in mind for the next paragraph.
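The "place" described above can be demonstrated from the shell; dd can position it explicitly (a throwaway 10-byte file with made-up contents):

```shell
#!/bin/sh
# Sketch of the file offset ("place") described above, using dd to
# position it explicitly in a throwaway 10-byte file.

f=/tmp/offset_demo.$$
printf '0123456789' >"$f"                      # a 10-byte file

# skip the first 3 bytes, then read 2 bytes from that place
dd if="$f" bs=1 skip=3 count=2 2>/dev/null     # prints "34"
echo

rm -f "$f"
```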

All this works well as long as one process accesses a file. But in your case a process opened a file and wrote lots of bytes into it, so its stored "place" (its file offset) is some big number. Now a second process (your shell command) truncated the file, but the first program's offset is unchanged: it still points far beyond the new (zero) end of the file. When the program writes its next log line, the write lands at that old offset, and the OS fills the gap up to it with NUL bytes, as described above. Those NUL bytes are what your viewer displays as junk (many tools render them as @ or similar), before the proper text that was written after the truncation.

Log-writing processes should therefore either open their log in append mode (O_APPEND - in the shell, exec process >>$logfile), so that every write goes to the current end of the file no matter what happened to it in between, or open and close the log for every single write action.
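The whole effect can be reproduced with a small sketch; a background writer loop stands in for the real process. Note that opening the log with '>>' (append mode) avoids the problem, because appended writes always go to the current end of the file:

```shell
#!/bin/sh
# Reproduction of the effect described above. With a plain '>' descriptor
# the writer's offset survives an external truncation, so its next write
# leaves a hole of NUL bytes at the start of the file.

log=/tmp/trunc_demo.$$.log

( exec >"$log"                 # like 'exec process >$logfile'
  echo "line 1: before truncation"
  sleep 2                      # the file is truncated externally here
  echo "line 2: after truncation"
) &
writer=$!

sleep 1
: >"$log"                      # external truncation, as in the thread
wait "$writer"

# the file now begins with NUL bytes (od shows them as \0) up to the
# writer's old offset; a log viewer may render them as @ or other junk
od -c "$log" | head -3

# with append mode instead, every write goes to the current end of file,
# so the same truncation would leave no hole:
#   exec process >>"$log"

rm -f "$log"
```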

I hope this helps.

bakunin