Full Discussion: running sql in crontab
UNIX for Dummies Questions & Answers: Post 26675 by dorilevy on Thursday 22nd of August 2002, 06:14:33 AM
The script:

Code:
#!/bin/csh
# Update delay statistics in Oracle from the latest Rep4 report file.

# Oracle connect string: user/password@instance
set c_str = "rating_info/rating@bscsprod"

# Pick the most recently modified report file
set file = `ls -rt /var/tmp/Rep4* | tail -1`
set workfile = "/tmp/Rep4.txt"

# Clean the report in one pass: single quotes -> dots, ".." -> "-",
# commas -> spaces (the -e scripts are applied in order)
sed -e "s/'/./g" -e 's/\.\./-/g' -e 's/,/  /g' $file > $workfile

foreach stam ( "00-04" "04-08" "08-12" "12-16" "16-20" "20-24" "> 25" )
    # Fields 2-6 of the row for this time bucket
    grep "$stam" $workfile | cut -f2-6 > /tmp/sss.txt
    set s_str = ""
    foreach num ( `cat /tmp/sss.txt` )
        # Prefix a leading zero so values like ".5" parse as numbers
        if ( `echo $num | cut -c1` == "." ) then
            set s_num = "0$num"
        else
            set s_num = "$num"
        endif
        set s_str = "$s_str $s_num"
    end
    # $s_str is left unquoted on purpose: each value becomes its own argument
    sqlplus $c_str @/bill/app01/bscsprod/scripts/dori/update_delay.sql $s_str "'$stam'"
    rm -f /tmp/sss.txt
end

set delay = `grep "Delay" $workfile | cut -f2`
sqlplus $c_str @/bill/app01/bscsprod/scripts/dori/update_unpr.sql "$delay" 'delay'

set udate = `head -1 $workfile`
sqlplus $c_str @/bill/app01/bscsprod/scripts/dori/update_time.sql "'$udate'" 'update'

added code tags for readability --oombera

Last edited by oombera; 02-18-2004 at 06:45 PM..
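To schedule the script from cron (the thread's topic), an entry along these lines would do. The script file name and log path below are assumptions; note that cron supplies only a minimal environment, so any Oracle variables sqlplus needs (ORACLE_HOME and friends) must be set inside the script itself.

Code:
# run nightly at 00:30; capture stdout/stderr, since sqlplus errors
# otherwise vanish silently under cron
30 0 * * * /bill/app01/bscsprod/scripts/dori/update_delay.csh >> /var/tmp/update_delay.log 2>&1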
 

10 More Discussions You Might Find Interesting

1. UNIX for Advanced & Expert Users

SQL vs Crontab

I've been trying to execute a script which calls some SQL queries for a midnight report using crontab, and I got the following message: Message file sp1<lang>.msb not found Error 6 initializing SQL*Plus Message file sp1<lang>.msb not found Error 6 initializing SQL*Plus If I use the "at" command it... (4 Replies)
Discussion started by: alex blanco
4 Replies
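The sp1<lang>.msb file is SQL*Plus's message catalog, which lives under $ORACLE_HOME/sqlplus/mesg, so this error almost always means ORACLE_HOME is not set in cron's minimal environment ("at" jobs inherit the submitting shell's environment, which is why they work). A minimal wrapper sketch, with the Oracle paths, SID, and credentials as assumptions:

Code:
#!/bin/sh
# Hypothetical cron wrapper: export the Oracle environment explicitly,
# because cron does not read your .profile.
ORACLE_HOME=/u01/app/oracle/product/8.1.7; export ORACLE_HOME   # assumed path
ORACLE_SID=mydb; export ORACLE_SID                              # assumed SID
PATH=$ORACLE_HOME/bin:/usr/bin:/bin; export PATH
sqlplus -s user/password @/path/to/midnight_report.sql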

2. UNIX for Dummies Questions & Answers

running .sh, .sql .etc how to?

hi all, yes, I am a beginner in Unix (Sun Solaris 8)... what is the command to run files... single and batch jobs... any help would be great. Cheers. PS: I know that for running files in SQL*Plus from the Unix prompt I can use the '@' sign... but how does this work? Cheers, E (2 Replies)
Discussion started by: etravels
2 Replies
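Both kinds of file are run from the shell prompt; only the .sql one needs sqlplus. A quick hedged example (file names invented for illustration):

Code:
# Shell script: make it executable once, then run it by path
chmod +x myjob.sh
./myjob.sh              # or: sh myjob.sh

# SQL script: '@' makes sqlplus read commands from the named file,
# exactly as it does at the SQL> prompt
sqlplus user/password @myreport.sql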

3. Shell Programming and Scripting

crontab and shell script that executes a sql.

Problem: I have a crontab entry, and when it kicks off, the xxx.sh shell script is called, which makes a nohup sqlplus session call. The problem is that the SQL does not get executed and the text file is not created; only an empty log file is created. Are there any constraints for crontab to open a sql... (6 Replies)
Discussion started by: radhika
6 Replies
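A common way to pin this down is to drop the nohup for a test run and capture everything sqlplus says in one log file. A hypothetical sketch (paths, credentials, and the oracle_env.sh file are all assumptions):

Code:
#!/bin/sh
# Source the Oracle environment explicitly; cron will not provide it.
. /path/to/oracle_env.sh        # assumed to export ORACLE_HOME, ORACLE_SID, PATH
sqlplus -s user/password > /tmp/job.log 2>&1 <<EOF
whenever sqlerror exit failure
spool /tmp/job_output.txt
select sysdate from dual;
spool off
exit
EOF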

4. Shell Programming and Scripting

Access SQL with crontab

Hi, I wrote a shell script whose function is to remove directories depending on the result of a query on an Oracle database. When I execute my shell script directly from Unix, it works fine. But when I put it in a crontab line, it no longer works, because of the sql... (1 Reply)
Discussion started by: tbeghain
1 Replies
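One hedged sketch of the pattern (the table, paths, and credentials are invented for illustration); the usual gotcha is that cron's environment lacks ORACLE_HOME, so the SQL part fails only under cron:

Code:
#!/bin/sh
ORACLE_HOME=/u01/app/oracle/product/9.2.0; export ORACLE_HOME   # assumed path
PATH=$ORACLE_HOME/bin:/usr/bin:/bin; export PATH

# Ask Oracle which directories to purge; -s plus the SET options strip
# headers and feedback so the spool file contains only the data.
sqlplus -s user/password <<EOF
set heading off feedback off pagesize 0
spool /tmp/purge_dirs.txt
select dir_name from purge_list;
spool off
exit
EOF

for d in `cat /tmp/purge_dirs.txt`; do
    rm -rf "/archive/$d"        # purge_list and /archive are assumptions
done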

5. UNIX for Advanced & Expert Users

Crontab is not running!!!

Hi experts, I need your help. After editing the crontab, while saving the file it says: "/tmp/crontabRlaauT" 1 line, 77 characters cron may not be running - call your system administrator And I checked after a certain time: the script in cron is not running. I got a mail in user saying... (1 Reply)
Discussion started by: thepurple
1 Replies
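That warning usually means exactly what it says: the cron daemon itself is not up. A quick check (daemon location varies slightly by platform):

Code:
# Is the daemon there at all?
ps -ef | grep -v grep | grep cron

# If not, start it as root (pick whichever exists on your system):
/usr/sbin/cron              # many System V flavours
/etc/init.d/cron start      # init-script variant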

6. UNIX for Dummies Questions & Answers

Issue while running from crontab

Hi all, here is my command, which I've scheduled to be run from crontab, but it's giving an error message: rah: rahhost executable needed but not in PATH My cmd is: 36 10 * * * /opt/IBM/dwe/db2/V9.5/bin/rah "df -m" >> /db2home/bculinux/Files/log/db.out 2>&1 Though I've added path... (5 Replies)
Discussion started by: NARESH1302
5 Replies
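Giving rah a full path is not enough: rah itself then looks for rahhost on $PATH, and cron's default PATH is minimal. A hedged sketch, assuming rahhost lives in the same V9.5/bin directory:

Code:
# If this cron supports top-of-file variable assignments (Vixie cron does;
# some System V crons do not):
PATH=/opt/IBM/dwe/db2/V9.5/bin:/usr/bin:/bin
36 10 * * * rah "df -m" >> /db2home/bculinux/Files/log/db.out 2>&1

# Otherwise, set PATH inside the command field itself:
36 10 * * * PATH=/opt/IBM/dwe/db2/V9.5/bin:/usr/bin:/bin; export PATH; rah "df -m" >> /db2home/bculinux/Files/log/db.out 2>&1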

7. UNIX for Dummies Questions & Answers

Crontab + Script + .sql

Hi guys, today I bring you a new problem that I need to solve. What I need to do is create a script that: connects to some database, logs on, runs a .sql script, logs off and closes the connection; after that, put this script in the crontab. Setting up the crontab is OK for me, I think... (3 Replies)
Discussion started by: Newer
3 Replies
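A minimal sketch of the whole chain, with every name invented for illustration:

Code:
#!/bin/sh
# run_job.sh: log on, run one .sql file, log off.
ORACLE_HOME=/u01/app/oracle/product/10.2.0; export ORACLE_HOME  # assumed path
PATH=$ORACLE_HOME/bin:/usr/bin:/bin; export PATH

# sqlplus logs on with the connect string, runs the file named after '@',
# and disconnects when the file ends with an EXIT statement.
sqlplus -s user/password@mydb @/path/to/job.sql

# and a crontab entry to fire it nightly at 02:00:
# 0 2 * * * /path/to/run_job.sh >> /tmp/run_job.log 2>&1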

8. UNIX for Dummies Questions & Answers

crontab not running script

Hi all, I have the below entry to be run from crontab, but it doesn't run: 1 * * * * /home/cobr_ext/test.sh > /home/cobr_ext/temp.txt When I run it manually it runs without any issues. Could you please help me understand why it doesn't run the script? :( (7 Replies)
Discussion started by: abhi_123
7 Replies
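Note first that "1 * * * *" fires once per hour (at minute 1), not once per minute. Beyond that, when a script runs by hand but not from cron, the usual suspects can be checked in a minute (paths below are from the post; the log location varies by system):

Code:
crontab -l                        # is the entry really installed for this user?
ls -l /home/cobr_ext/test.sh      # is the script executable (chmod +x)?
head -1 /home/cobr_ext/test.sh    # does it start with #!/bin/sh? cron won't use your login shell
# Many systems log every cron execution; the file is often one of:
#   /var/cron/log (Solaris), /var/adm/cron/log (AIX), /var/log/cron (Linux)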

9. Solaris

Crontab is not loading/running

hi, I have a crontab where I have put my DB backup scripts, but crontab does not run them at the specified time. I have checked the scripts, and when I run the script as ./mydbbkp.sh it successfully runs and does the job, but from cron it is not running, which means cron is not running. I have an oracle user under... (3 Replies)
Discussion started by: janakors
3 Replies
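On Solaris the cron daemon itself is worth checking before the job (which commands apply depends on the release):

Code:
# Solaris 10 and later manage cron through SMF:
svcs -l svc:/system/cron:default        # should report "enabled" and "online"
svcadm enable svc:/system/cron:default  # (re)start it if not

# Older releases:
/etc/init.d/cron start

# Either way, each attempted job is recorded in /var/cron/log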

10. Shell Programming and Scripting

Storing multiple sql queries output into variable by running sql command only once

Hi all, I want to run multiple SQL queries and store the data in variables, but I want to invoke the sql command only once. Is there a way to do this without running the sql command twice? Please advise. E.g.: Select 'Query 1 output' from dual; Select 'Query 2 output' from dual; I want to... (3 Replies)
Discussion started by: Rokkesh
3 Replies
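This can be done with a single sqlplus session: run both queries in one here-document and spool each result to its own file, paying the connect cost only once. A hypothetical Bourne shell sketch (credentials and file names are assumptions):

Code:
#!/bin/sh
# One connection, two queries; -s plus the SET options suppress everything
# except the query results.
sqlplus -s user/password <<EOF
set heading off feedback off pagesize 0
spool /tmp/q1.out
select 'Query 1 output' from dual;
spool off
spool /tmp/q2.out
select 'Query 2 output' from dual;
spool off
exit
EOF

val1=`cat /tmp/q1.out`      # first query's output
val2=`cat /tmp/q2.out`      # second query's output
echo "first: $val1  second: $val2"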