I have two log files that keep getting appended to every second... I want to extract a certain field from each file (using awk to extract the data) and compare them in real time...
ex:
log1
122
234
567
log2
234
567
log3
122
I need a log3 that keeps getting appended with the data found in log1 but not in log2.
log1, which keeps getting appended to every second:
234,abc
678,def
345,fgh
awk -F"," '{print $1}' log1
will give me
234
678
345
log2, which keeps getting appended to every second:
345, ghi
678, jkl
awk -F"," '{print $1}' log2
will give me
345
678
Now I want a log3 that compares the output of the two commands above and keeps appending... For example, I should get 234 in log3, since it is in log1 but not in log2.
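For a single pass, one awk command can do both the field extraction and the set difference at once, with no intermediate files. This is a sketch; the sample data lines are taken from the example above, and the file names log1/log2/log3 are assumed:

```shell
# Hypothetical sample data matching the example above.
printf '234,abc\n678,def\n345,fgh\n' > log1
printf '345, ghi\n678, jkl\n' > log2

# NR==FNR is true only while reading the first file listed (log2):
# record its first fields as keys.  For log1, print any key not seen in log2.
awk -F',' 'NR==FNR { seen[$1]; next } !($1 in seen) { print $1 }' log2 log1 > log3

cat log3    # prints 234
```

Note that log2 is deliberately listed first so its keys are loaded before log1 is scanned.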
All the differences are captured in the file 'update'.
This is a continuously running script; run it as a background process.
The refresh rate can be altered through the sleep value in the loop.
Minimal time is taken in comparing and generating what is present in one file but missing from the other.
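The loop described above might be sketched roughly as follows. The file name 'update', the sample data, and the three-cycle bound are assumptions for the sketch; in real use the loop would run forever (`while :; do ... done`) and the script would be launched with `&`:

```shell
#!/bin/sh
# Polling sketch: each cycle, redo the extraction on both files and append
# to 'update' any key that is in log1 but not in log2 and not yet recorded.
# Bounded to 3 cycles here only so the sketch terminates.

# Hypothetical sample data matching the example above.
printf '234,abc\n678,def\n345,fgh\n' > log1
printf '345, ghi\n678, jkl\n' > log2

touch update
n=0
while [ "$n" -lt 3 ]; do
    # Keys present in log1 but absent from log2.
    awk -F',' 'NR==FNR { seen[$1]; next } !($1 in seen) { print $1 }' log2 log1 > diff.tmp
    if [ -s update ]; then
        grep -Fxv -f update diff.tmp >> update   # skip keys already recorded
    else
        cat diff.tmp >> update                   # nothing recorded yet
    fi
    rm -f diff.tmp
    n=$((n + 1))
    sleep 1                                      # refresh rate: tune this value
done

cat update    # prints 234 once, not three times
```

The `grep -Fxv -f update` step is what keeps 'update' from accumulating the same key every cycle.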
Thanks... but this doesn't give me a real-time log. By the time I cat the files and compare them, I would have lost many entries in the meantime (the log gets appended to every second). I want a log that keeps comparing in real time and gets appended to every time it finds a difference in the files, like tail -f.
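One way to get tail -f behaviour is to follow both logs with a single `tail -f` and keep the comparison state inside awk, so log3 is appended to the moment a key shows up in log1 that has not been seen in log2. This sketch assumes GNU tail (for the "==> file <==" headers it prints when following multiple files) and GNU coreutils `timeout`, which is used here only to end the demo; in real use you would drop `timeout` and background the pipeline instead:

```shell
#!/bin/sh
: > log1; : > log2; : > log3

# Demo writer: feed the logs while the follower runs.
( sleep 1; echo '678, jkl' >> log2            # 678 arrives in log2 first
  sleep 1; echo '678,def'  >> log1            # already seen in log2 -> suppressed
           echo '234,abc'  >> log1 ) &        # only in log1 -> goes to log3

timeout 4 tail -n +1 -f log1 log2 | awk -F',' '
    /^==> .*log2/ { src = 2; next }           # tail header: lines now from log2
    /^==> /       { src = 1; next }           # tail header: lines now from log1
    /^$/          { next }                    # blank lines between headers
    src == 2      { seen[$1]; next }          # record keys seen in log2
    !($1 in seen) && !($1 in done) {          # key from log1, unseen in log2
        done[$1]
        print $1 >> "log3"
        fflush()                              # make log3 update immediately
    }'

cat log3    # prints 234
```

One inherent caveat of streaming: if a key arrives in log2 only after it has already been printed from log1, it stays in log3. Whether that matters depends on how closely the two logs are synchronized.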