Shell script: multiple processes report lock file successfully created
Hi All,
I have written the following code
But when I run 30 instances of the script (at the same time via cron), multiple instances report "Successfully lck file created" for the same lock file name.
Please help
Moderator's Comments:
Please use code tags for your data and code, thanks
We are missing some code to understand what's happening... e.g., how is the loop written?
Also, how do you run 30 instances? Do you call the script 30 times, or is it the loop?
I run 30 instances of the script at the same time with crontab or TWS.
The loop picks up information from a database, which returns a set of parameters,
and each instance of the script has to use one set once.
The two-step test-and-create lock file method you're using:
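A minimal sketch of the two-step pattern being described (the lock file name and message are assumptions for illustration):

```shell
#!/bin/sh
# Racy two-step pattern: test for the lock, then create it.
LOCKFILE=/tmp/myjob.lck

if [ ! -f "$LOCKFILE" ]; then      # step 1: test -- other instances can pass this check too
    touch "$LOCKFILE"              # step 2: create -- touch succeeds even if the file now exists
    echo "Successfully lck file created"
fi
```

Between the `[ ! -f ... ]` test and the `touch`, any number of other instances can run the same test and also find no lock file, so several of them "win" at once.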
has an obvious race condition when you have 30 scripts trying to create a lock file in the same namespace. All 30 scripts can test and find that the lock file does not exist before any of them reaches the touch command that creates it. And touch will not report an error if the file already exists when it runs.
If you're using cron to create 30 jobs doing the same thing and then having them fight each other to create separate lock files, why not have each of the 30 cron jobs pass a parameter saying which lock file that job should use? Then there won't be any contention and there won't be any race condition.
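A sketch of that per-job-parameter approach (the script name, paths, and crontab entries below are assumptions, not from the thread):

```shell
#!/bin/sh
# job.sh -- each crontab entry passes a distinct instance number, so every
# job gets its own lock file and there is nothing to race over, e.g.:
#   0 * * * * /path/to/job.sh 1
#   0 * * * * /path/to/job.sh 2
#   ...
#   0 * * * * /path/to/job.sh 30
INSTANCE=${1:-1}                      # instance number from cron (default 1 for testing)
LOCKFILE="/tmp/job.$INSTANCE.lck"     # unique per instance -> no contention

touch "$LOCKFILE"
echo "Instance $INSTANCE using lock file $LOCKFILE"
# ... fetch this instance's parameter set from the database and do the work ...
rm -f "$LOCKFILE"
```

Because each instance's lock file name is derived from its own argument, no two jobs ever touch the same file.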
If that can't be done and your scripts really have to fight each other to create separate locks, use the shell's set -C command to turn on file-redirection no-clobber mode, and use a single redirection both to test for the existence of the lock file and to create it atomically if it doesn't already exist. For example:
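A sketch of this set -C approach (the lock file name is an assumption):

```shell
#!/bin/sh
LOCKFILE=/tmp/myjob.lck

# The subshell turns on no-clobber mode and tries to create the lock file with
# a plain ">" redirection, which fails if the file already exists.  The
# "2>| /dev/null" overrides no-clobber for the stderr redirection, so the
# shell's "file exists" diagnostic is silently discarded.
if ( set -C; : > "$LOCKFILE" ) 2>| /dev/null; then
    echo "Successfully created lock file $LOCKFILE"
    # ... work that must run in only one instance goes here ...
    rm -f "$LOCKFILE"
else
    echo "Lock file $LOCKFILE already exists; another instance is running" >&2
    exit 1
fi
```

Because the redirection either creates the file or fails in a single operation, only one of the 30 instances can win; the others take the else branch and exit.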
Although this was written and tested with a Korn shell, it will work with any shell that supports these basic set options and output redirections. (Note that the 2>| /dev/null in the subshell overrides no-clobber mode for the redirection of stderr, so you won't get a diagnostic from the shell if the lock file has already been created by another process.)