Full Discussion: Script not running from cron
Operating Systems > Solaris > Script not running from cron
Post 302497845 by Cvg on Friday 18th of February 2011 07:23:36 AM
Here it is:
Code:
#!/usr/bin/ksh

typeset -i quant
flag=n

# Collect the names of the BO environment variables, skipping the
# longmsumu1 and stream instances.
DBOS=`env | grep BO | egrep -v "longmsumu1|stream" | awk -F'=' '{ print $1 }' | xargs`

for env in $DBOS
do
        # Run the fail-queue report for this instance; it writes /tmp/result.txt.
        /apps/sum_glob/gbo_live/sparse/bin/dmg_cronlaunch -ENVI $env -EXE /home/gbouausr/bin/failq.ksh

        # First field of each LDR line is a loader name, second field is its count.
        LOADERS=`grep LDR /tmp/result.txt | awk '{ print $1 }' | xargs`
        for ldr in $LOADERS
        do
                quant=`grep "$ldr" /tmp/result.txt | awk '{ print $2 }'`
                if [ "$quant" -gt 100 ]
                then
                        flag=y
                        echo "$env: $ldr  $quant" >> /tmp/extr_over10k.txt
                fi
        done
        rm -f /tmp/result.txt
done

# Mail the summary only if at least one loader exceeded the threshold.
if [ "$flag" = "y" ]
then
        cat /tmp/extr_over10k.txt | mailx -s "Instances with more than 100 trades in the Fail Queue" -r <mail_address>
fi

rm -f /tmp/extr_over10k.txt
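
Since the thread is about the script failing only under cron, it is worth remembering that cron starts jobs with a minimal environment (typically just HOME, LOGNAME, SHELL and a short PATH), so the BO* variables the script picks out of `env` will not exist unless the crontab entry loads them first. A minimal sketch of a crontab line that sources the user's profile before calling the script; the profile path, script name and schedule are placeholders, not taken from this thread:
Code:
# Load the user's environment first, since cron does not read .profile on its own.
# The script name and schedule below are illustrative only.
0 7 * * 1-5 . $HOME/.profile; /home/gbouausr/bin/failq_check.ksh > /tmp/failq_check.log 2>&1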

Moderator's Comments:
Please use code tags when posting data and code samples!

Last edited by vgersh99; 02-18-2011 at 10:59 AM. Reason: code tags, please!
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Running script using cron

I am running a script by scheduling it using the cron. The line in the cron file is - 10 * * * * ksh -v /apps/gofis/svam/cos_automation/cos_automation.sh vpqa > /apps/gofis/svam/cos_automation/cron.log 2>&1 But after the job is executed, the cron.log contains some part from... (4 Replies)
Discussion started by: ankurgupta

2. UNIX for Dummies Questions & Answers

Running a script in cron question

There is this script I'd like to put into cron, but it asks for date verification. It'll prompt you to press enter to continue. Usually, 100% of the time the dates are ok, so is there a way to run this script in cron and bypass the "enter" prompt? (3 Replies)
Discussion started by: NycUnxer
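
If the script simply does a read from standard input for that confirmation, one common way to run it unattended is to feed it an empty line (or redirect stdin from /dev/null, if the script copes with EOF). A minimal sketch of a crontab entry; the script path and schedule are placeholders:
Code:
# Feed a newline to the confirmation prompt so the script does not block under cron.
0 6 * * * echo "" | /path/to/date_check.sh > /tmp/date_check.log 2>&1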

3. Shell Programming and Scripting

Running a script with cron

I have the following script (trapsize) that checks a file size on my syslog server, and if the file is gt 6g, it will mail an alert to the admin for inspection. The following works like a champ when I execute ./trapsize logged in as root user using bash shell. FILESIZE=$(ls -l /opt2/fwsm/fwsm... (3 Replies)
Discussion started by: altamaha
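
When a script behaves under an interactive bash session as root but not from cron, the usual suspects are that cron runs the command through /bin/sh with a minimal PATH and none of the login environment. A common first step (paths and schedule are illustrative, not from the thread) is to invoke the interpreter and the script by absolute path and capture any error output:
Code:
# Run the size check under bash explicitly and keep stderr for inspection.
0 * * * * /bin/bash /usr/local/bin/trapsize > /tmp/trapsize.log 2>&1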

4. Shell Programming and Scripting

ftp script is not running from CRON

Hi, I have an FTP script which ftp's files from one unix box to another. It works from the command line when I issue: ksh ftp.ksh. When I schedule it in cron, it is not executed automatically. Any thoughts please? Thanks, Ravi. (4 Replies)
Discussion started by: ravi.balley

5. Shell Programming and Scripting

Problems running script in cron...

Hi all, I have a script running on a Solaris 8 box and the first thing it does it check which user is executing it; if ; then echo "This script must be run as testuser" 1>&2 exit 1 fi This works fine when manually running the script however when adding into that users' crontab it... (1 Reply)
Discussion started by: JayC89
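
The forum rendering has stripped the test condition out of the "if ; then" above. One plausible reason such a check passes interactively but fails under cron is that it compares $USER, which cron on Solaris does not set. A hedged reconstruction that relies on LOGNAME (which cron does set) instead; the exact original condition is unknown:
Code:
# Hypothetical shape of the user check; LOGNAME is set by cron on Solaris,
# whereas USER generally is not.
if [ "$LOGNAME" != "testuser" ]; then
        echo "This script must be run as testuser" 1>&2
        exit 1
fi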

6. Shell Programming and Scripting

Script running using cron

Hello All, I am running the below script. When I run it from a shell or terminal it runs fine, but when it runs via cron it does not work. ################################ b36376 27 % cat make_nis_account_ankit.sh #!/bin/ksh ... (2 Replies)
Discussion started by: ajaincv

7. Shell Programming and Scripting

Running shell script via cron

Hi Guys, I have a shell script that I scheduled to run via cron, but the script doesn't run. When I run the script manually it runs perfectly... What might be the problem? Thanks. (1 Reply)
Discussion started by: Phuti

8. UNIX for Dummies Questions & Answers

Script not running through cron on solaris 5.8

Hi All, I am running a script through cron which is given below. This script is not doing its defined job through cron, files are still in unzipped state. But when I run this script as ./script.sh it gets executed fine and does all that is required. Also when I run this script as sh... (2 Replies)
Discussion started by: Jcpratap
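
Without the script itself one can only guess, but "works as ./script.sh, does nothing from cron" on Solaris very often comes down to PATH: cron supplies a very short default PATH, so tools such as gzip or unzip living in /usr/local/bin are simply not found. A sketch of the usual defensive preamble at the top of the script; the directories are assumptions, not taken from the thread:
Code:
#!/usr/bin/ksh
# Pin down PATH so cron finds the same tools an interactive shell does.
PATH=/usr/bin:/usr/sbin:/usr/local/bin
export PATH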

9. Shell Programming and Scripting

Running same script through cron gives different output

Hi All, I am running the below shell script through cron and surprisingly it gives different output $uname -a Linux 2.6.18-194.3.1.7.3.el5xen #1 SMP Fri Jul 30 00:08:45 EDT 2010 x86_64 x86_64 x86_64 GNU/Linux $ echo $SHELL /bin/bash shell script: cat sar_cpu.sh #!/bin/bash ... (10 Replies)
Discussion started by: a1_win
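
Different output from the same script usually means a different environment rather than a different sar: cron supplies its own much shorter PATH and no locale settings, so it can pick up another binary or another output format. A small sketch of how one might pin these down at the top of the script; the values are typical, not taken from the thread:
Code:
#!/bin/bash
# Force the same PATH and locale for cron and interactive runs.
export PATH=/usr/bin:/usr/sbin:/bin
export LC_ALL=C
/usr/bin/sar -u 1 3    # absolute path to the sysstat sar binary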

10. Shell Programming and Scripting

Script not running in cron

Hi All, I have a script which runs fine when triggered manually, however when I place it in crontab it throws an error. #!/usr/bin/ksh set -vx lc=1 st_date=$(`date "+%Y%m%d"`) LOGFILE=/home/transfer.log.$st_date file="/home/OM_WF.log.$st_date" Manual run - lc=1 + lc=1... (4 Replies)
Discussion started by: nag_sathi
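
One thing that stands out in the quoted snippet is st_date=$(`date "+%Y%m%d"`): the inner backticks run date, and the outer $() then tries to execute the resulting date string as a command, which fails and leaves st_date empty. If that is what the error is about, a single level of command substitution is enough:
Code:
# One level of command substitution; the file names are copied from the snippet.
st_date=$(date "+%Y%m%d")
LOGFILE=/home/transfer.log.$st_date
file="/home/OM_WF.log.$st_date"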