Full Discussion: keeping a process alive?
Operating Systems > Solaris
Post 302474536 by jim mcnamara, 11-24-2010, 01:17 PM
Yes. Use crontab to schedule a 'check_it_is_running' script every x minutes. The check script restarts the other script if it is no longer running.

Every 5 minutes:
Code:
0,5,10,15,20,25,30,35,40,45,50,55 * * * * /path/to/myscript/check.sh >> /path/to/logs/check.log 2>&1

(Solaris cron does not understand the */5 step syntax, so the minutes are listed out explicitly; 2>&1 comes after the file redirection so stderr follows stdout into the log.)

Note: your script has to recreate your environment fully by sourcing /etc/profile and the user's .profile, because cron starts jobs with only a minimal environment. Bartus's solution is definitely preferred. A rough sketch of the check script follows.
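A minimal sketch of what such a check.sh could look like. The target path /path/to/myscript/myscript.sh, the pgrep -f match and the restart command are illustrative placeholders, not taken from the original thread:

Code:
#!/usr/bin/ksh
# check.sh - restart the monitored script if it is no longer running.
# Sketch only: the target path and restart command are placeholders.

# cron starts jobs with a minimal environment, so rebuild it as noted above
. /etc/profile
. $HOME/.profile

TARGET=/path/to/myscript/myscript.sh

# pgrep -f matches against the full command line; non-zero exit means not running
/usr/bin/pgrep -f "$TARGET" > /dev/null 2>&1
if [ $? -ne 0 ]; then
    echo "`date`: $TARGET not running - restarting"
    nohup "$TARGET" > /dev/null 2>&1 &
fi

pgrep -f is used here because the monitored program is a shell script; matching only the interpreter name with plain pgrep would give false positives.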
 

8 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

how to kill process while keeping the submitted job running

I have a process which was scheduled to run from 8am - 6pm daily. The scripts will search for and run the commands that were predefined in the database. Usually there are about 6-7 jobs for each submission, each job takes about 1-2 hrs, and they run one by one. And, I have a cron job... (3 Replies)
Discussion started by: hk_newbie
3 Replies
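One common way to keep a submitted job running after its parent is killed is to detach it with nohup; a minimal sketch with illustrative paths:

Code:
# Run the job immune to hangups and detached from the terminal; it keeps
# running even after the submitting script or session is killed.
nohup /path/to/long_job.sh > /path/to/logs/long_job.log 2>&1 &
echo "started long_job.sh as PID $!"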

2. UNIX for Dummies Questions & Answers

Keeping cron jobs alive...?

Hi, I'm very new to Unix so please bear with me... :) Here is my requirement: I need to create a cron job to run two different scripts at 1 a.m. every day. Here's what I did: I used the "crontab -e" command and created a crontab file using the vi editor. When I exit the editor using... (3 Replies)
Discussion started by: yogiB
3 Replies
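For reference, scheduling two scripts at 1 a.m. every day takes two lines in crontab -e; a minimal sketch with illustrative paths:

Code:
# minute hour day-of-month month day-of-week  command
0 1 * * * /path/to/script_one.sh >> /path/to/logs/script_one.log 2>&1
0 1 * * * /path/to/script_two.sh >> /path/to/logs/script_two.log 2>&1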

3. UNIX for Dummies Questions & Answers

Check whether a process is alive or not

Hi everybody, I have a small requirement that needs to be implemented in a shell script. Currently I have a shell script which invokes a java process, say "Process A", which runs in the background. If someone tries to invoke the same shell script again, then there should be some mechanism inside the... (23 Replies)
Discussion started by: appleforme1415
23 Replies
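A common pattern for that requirement is a PID-file check at the top of the wrapper script; a rough sketch, assuming the wrapper itself starts the Java process (the PID file path and java command line are placeholders):

Code:
#!/bin/sh
# Refuse to start a second copy of "Process A" if one is already running.
# Sketch only: the PID file path and the java command line are placeholders.
PIDFILE=/var/tmp/process_a.pid

if [ -f "$PIDFILE" ] && kill -0 "`cat $PIDFILE`" 2>/dev/null; then
    echo "Process A is already running (PID `cat $PIDFILE`) - exiting."
    exit 1
fi

# start the Java process in the background and record its PID
java -jar /path/to/process_a.jar &
echo $! > "$PIDFILE"

kill -0 sends no signal; it only tests whether the recorded PID still refers to a live process, so a stale PID file does not block a restart.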

4. Shell Programming and Scripting

keeping 10 process running at the same time

Hi guys, I need to run sqlldr to load about 50,000 files every day into my DWH, so I need to make a script to keep about 100 processes of sqlldr running at the same time. So, the issue is that I've been trying for a few days to make a script which can keep that number of processes running, so... (2 Replies)
Discussion started by: razziel
2 Replies
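One simple way to cap the number of concurrent loaders from a shell script is to count the running sqlldr processes and sleep until a slot frees up before launching the next one; a rough sketch, with the limit, paths and userid as placeholders:

Code:
#!/bin/sh
# Feed data files to sqlldr without ever running more than MAX loaders at once.
# Sketch only: paths, the userid and the limit are placeholders.
MAX=10

for f in /path/to/incoming/*.dat; do
    # throttle: while MAX loaders are already running, sleep and re-check
    while [ `ps -ef | grep -c "[s]qlldr"` -ge $MAX ]; do
        sleep 5
    done
    sqlldr userid=user/passwd control=/path/to/load.ctl data="$f" log="$f.log" &
done
wait    # block until the last batch of loaders finishes

The "[s]qlldr" pattern keeps grep from counting its own process.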

5. Shell Programming and Scripting

Keeping std & err outputs alive and feed one log file in a one-shot way

Hello, I have a large script with a lot of print statements and misc commands. Standard and error outputs are redirected as in the following code: #!/usr/bin/ksh LOG=/<some dir>/log > $LOG exec >>${LOG} 2>>${LOG} print aaaaa print bbbbb print ccccc ... some_cmd That way,... (5 Replies)
Discussion started by: Fundix
5 Replies
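The exec line quoted in that post is the whole trick: after it, every command in the script inherits the redirected file descriptors. A minimal standalone sketch, using 2>&1 so the log file is opened once instead of twice (the log path is illustrative):

Code:
#!/usr/bin/ksh
# Send everything this script prints, stdout and stderr alike, into one log.
# Sketch only: the log path is a placeholder.
LOG=/var/tmp/myjob.log

> $LOG                 # truncate the log at the start of each run
exec >> $LOG 2>&1      # from here on, both streams append to $LOG

print "this line goes to the log"
ls /no/such/dir        # the error message lands in the same log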

6. Tips and Tutorials

My "Bread and Butter" Process Keep Alive Perl Script....

For the newbies, I should have posted this years ago.... Here is the standard (tiny) "bread and butter" perl script (on Linux) I use in my crontab files to ensure key processes are alive (just in case!), like httpd, named, sshd, etc. The example below is for named...... ... (1 Reply)
Discussion started by: Neo
1 Replies

7. What is on Your Mind?

Is this forum alive?

Odd thing. I posted in the Solaris forum, new user, just asking for a bit of advice. Nothing too complicated. As of this post there have been 140 views and zero replies. So that got me thinking, is this normal? I had a look around, and I see the same thing on many other threads, and in other... (2 Replies)
Discussion started by: _JonB_
2 Replies

8. Programming

Socket Keep-Alive

Hi, I'm adding HTTP 1.1 GET to my project and trying to use “Keep-Alive” HTTP connections to the host. The problem is that when I recv() the first page, it succeeds; however, the 2nd consecutive recv() receives zero bytes, and I really have no idea why. As per HTTP 1.1 I have Connection: ... (6 Replies)
Discussion started by: Projecteer
6 Replies