Crontab issue

# 1  
Old 06-28-2013
Crontab issue

Hello,

I have a bash script that finds files older than 31 days and deletes them. I have this file loaded into crontab to run every day. It ran fine the first time I loaded it in, but now when I try to run it manually (bash file.sh) I get errors.

Here is the script
Code:
TIME=" -maxdepth 1 -mtime +31"
DIR="/media/ExternalArchive/"

FIND=$(find ${DIR}${TIME})

rm -r ${FIND}

This is what I am doing to add it to crontab.
"crontab -e"
(load using nano)
"0 0 * * * bash /dir/file"

This should run it every day. The error I get when I try bash file.sh is as follows.
": command not found1:
find: missing argument to `-mtime'
rm: cannot remove `\r\r': No such file or directory"

With the "command not found" error appearing multiple times. Does running it with crontab do something to the file that makes it so I cannot run it normally? Crontab is not working either; the loaded file will not execute.

Thanks
# 2  
Old 06-28-2013
you should never run rm from cron without a specific starting directory ...

you may actually have clobbered your operating system by accident ... try running find on the command line by itself and check for errors ... confirm with a simple ls -l on a known directory ...
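for example, something like this ... a sketch using a throwaway scratch directory instead of the real archive (GNU touch -d is assumed, to fake an old timestamp) ...
Code:
```shell
#!/bin/sh
# Scratch directory so nothing real is touched; the real path in the
# script above is /media/ExternalArchive/.
dir=$(mktemp -d)

# GNU touch -d backdates the mtime so one file looks 40 days old.
touch -d "40 days ago" "$dir/old.log"
touch "$dir/new.log"

# Dry run: see exactly what find selects before any rm is involved.
find "$dir" -maxdepth 1 -mtime +31

# Confirm with a plain ls -l that the directory is still intact.
ls -l "$dir"

rm -rf "$dir"
```
the dry run should print only the old file ... if it prints the directory itself or everything in it, do not feed it to rm ...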
# 3  
Old 06-28-2013
Quote:
Originally Posted by jrymer
rm: cannot remove `\r\r': No such file or directory"
Errors about \r mean "stop editing your scripts in Microsoft Notepad". Editing your scripts in Windows has filled them with carriage returns.

Code:
tr -d '\r' < wintext > unixtext
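To see the stray carriage returns for yourself before converting, something like this works (cat -A is a GNU coreutils option that prints each carriage return as ^M; the file names here are just placeholders):
Code:
```shell
#!/bin/sh
# Fabricate a two-line script saved with Windows (CRLF) line endings.
printf 'echo hello\r\necho world\r\n' > wintext

# cat -A makes the problem visible: each carriage return shows up
# as ^M just before the end-of-line $.
cat -A wintext

# Strip every \r, as in the tr command above.
tr -d '\r' < wintext > unixtext
cat -A unixtext

rm -f wintext unixtext
```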

# 4  
Old 06-28-2013
Find with the proper options gives me the correct list of files, confirmed by ls -l. Is there another way to rm the files without cd'ing to a starting directory first?
# 5  
Old 06-28-2013
see modified script below ... run in cron as /dir/file > /dev/null 2>&1 ... no need to call bash anymore as the script does it automatically ... confirm script runs correctly in a test directory before putting in cron ...
Code:
#! /bin/bash

TIME=" -maxdepth 1 -mtime +31"
DIR="/media/ExternalArchive/"
LOG=/dir/log

if [ -d "$DIR" ]
then
      cd "$DIR" || exit 1
      find . -mindepth 1 ${TIME} -exec rm -r {} \;
else
      echo "$DIR not found. $0 exiting."
fi > "$LOG" 2>&1

exit 0


Last edited by Just Ice; 06-28-2013 at 02:37 PM..
# 6  
Old 06-30-2013
Give it a try without cron. Trigger the script manually on a known directory and post the results.
# 7  
Old 06-30-2013
Quote:
Originally Posted by Just Ice
see modified script below ... run in cron as /dir/file > /dev/null 2>&1
I think this will not work, because no attempt is made to set up any environment, namely no "PATH" variable. This looks like running into Cron Problem Number One.
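For reference, a sketch of a crontab that sets PATH explicitly (the PATH value below is an assumption; adjust it to where your commands actually live):
Code:
```shell
# crontab -e -- cron's default environment is minimal (often just
# PATH=/usr/bin:/bin), so declare PATH above the job line:
PATH=/usr/local/bin:/usr/bin:/bin

# Run the cleanup script at midnight every day.
0 0 * * * /dir/file > /dev/null 2>&1
```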

I hope this helps.

bakunin
 

Unix & Linux Forums Content Copyright 1993-2018. All Rights Reserved.