I have created a shell script to gzip the public_html files on our website. I tested this script on another directory on our site and it worked, but when I replaced that directory with the public_html directory, it failed.
I am executing this script via a cron job.
This is the script that works.
This is the script that does not work.
I ran the cron job twice. This is the error the cron job gives me:
Am I missing something in the formatting? Would simply changing the directories cause the error? I have tried asking our host if there is something special about the public_html directory that would prevent this, but alas... no help there!
Thanks.
Last edited by Franklin52; 02-07-2012 at 08:56 AM..
Reason: Please use code tags for code and data samples, thank you
This is set up in our cPanel cron jobs. The file I have it in is named zip.sh.
The script ran successfully from this cron job against the first directory, but not against the public_html directory.
Should I try running the command directly from the crontab instead of from the shell file?
---------- Post updated 02-01-12 at 01:01 PM ---------- Previous update was 01-31-12 at 05:35 PM ----------
Thanks for help. My host told me that it was an issue with the size of our site (1.5GB) vs. a timeout script they have running on the shared server.
Last edited by Franklin52; 02-07-2012 at 08:57 AM..
Reason: Please use code tags for code and data samples, thank you
Full script is posted above. It is a simple one-line script.
My host has told me that it is working, but because we are on a shared server, it takes too long to execute and their (server's) kill/time-out scripts are causing mine to fail. I had the same problem with our database dump script.
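Since the shared host's watchdog kills long-running jobs, one possible workaround (a sketch only; the paths and the throwaway demo directory are made up, since the original script is not shown in full) is to run the backup at the lowest CPU priority with nice so it competes less with other tenants:

```shell
#!/bin/sh
# Hypothetical sketch: paths are examples, not from the original script.
src=$(mktemp -d)                  # stands in for /home/myuser/public_html
echo demo > "$src/index.html"
out=$(mktemp -d)
# nice -n 19 gives the job the lowest CPU priority
nice -n 19 tar -czf "$out/backup.tar.gz" -C "$src" .
```

Splitting the job into smaller pieces (for example, one tar per subdirectory) is another way to keep each run under the host's time limit.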
Hello folks
I hope everyone is fine. I need to ask one question.
I have a directory
/xx/abcd/data/
Inside that data directory there are files like
11.txt
23.txt
12.txt
*.txt
I want to compress each .txt file inside that directory, /xx/abcd/data/.
But it will not gzip the data... (1 Reply)
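For reference, a minimal sketch of gzipping each .txt file in that directory individually; the demo uses a throwaway directory standing in for /xx/abcd/data/:

```shell
#!/bin/sh
# Demo directory stands in for /xx/abcd/data/ from the post.
data=$(mktemp -d)
echo a > "$data/11.txt"
echo b > "$data/23.txt"
# gzip each .txt file individually; each FILE becomes FILE.gz
for f in "$data"/*.txt; do
    [ -f "$f" ] || continue   # skip if no .txt files matched
    gzip -- "$f"
done
```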
Hello dear Community,
I have a task to write a script which will gzip the not-yet-zipped files in a directory and its subdirectories. I succeeded in gzipping the directory but not the subdirectories:
#!/bin/bash
#go to the directory where to zip
cd "$1"
#Zip unzipped files
for i in `ls | xargs... (2 Replies)
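Since ls on its own does not descend into subdirectories, find is the usual tool here. A sketch that gzips every not-yet-compressed file under a tree (the demo tree stands in for the directory passed as $1):

```shell
#!/bin/sh
# Demo tree stands in for the directory passed as $1 in the post.
top=$(mktemp -d)
mkdir -p "$top/sub"
echo x > "$top/a.log"
echo y > "$top/sub/b.log"
# find descends into subdirectories; skip files already ending in .gz
find "$top" -type f ! -name '*.gz' -exec gzip -- {} +
```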
Hi All,
I would like to loop over all the files with extension .out in my directory; these files should be sorted alphabetically.
Then I need to assign to each of them a progressive ID of three digits (maybe in HEX format?).
First I tried to list the files manually, as
ARRAY=(
A-001.out ... (5 Replies)
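A glob already expands in sorted order, so the manual array may not be needed. A sketch that pairs each .out file with a zero-padded progressive ID (the file names below are invented for the demo):

```shell
#!/bin/sh
# Demo files are invented; a glob expands in sorted order already.
work=$(mktemp -d)
touch "$work/B-002.out" "$work/A-001.out"
id=0
for f in "$work"/*.out; do
    [ -e "$f" ] || continue
    id=$((id + 1))
    # %03d zero-pads to three digits; use %03X for hexadecimal IDs
    printf '%03d %s\n' "$id" "$(basename "$f")"
done > "$work/ids.txt"
```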
Hello there,
I want to create a file that counts how many files I have in the directory.
For this I use the command
find . -type f | wc -l > 1In1.myfile
The problem with this command is that it does not update after I add a new file to the directory.
Anyone got any ideas how I can... (5 Replies)
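The count file only reflects the moment the command ran, so the usual fix is to re-run the command on a schedule (for example from cron) rather than expecting the file to update itself. A sketch, with the count file excluded from its own count (the demo directory is a stand-in):

```shell
#!/bin/sh
# Demo directory; in real use this would be the directory in question.
work=$(mktemp -d)
touch "$work/a" "$work/b"
cd "$work" || exit 1
# exclude the count file itself so it is not counted once it exists
find . -type f ! -name 1In1.myfile | wc -l > 1In1.myfile
# a crontab entry to refresh it every minute might look like:
# * * * * * cd /path/to/dir && find . -type f ! -name 1In1.myfile | wc -l > 1In1.myfile
```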
Here's what I have thus far:
cp -r /home/mydom/public_html/products/Widget/ /home/mydom/public_html/
This works fine but suppose the folder in public_html has a different name (Main_Widget). The cron above needs to copy the files within the folder (Widget) instead of the folder itself. How... (1 Reply)
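One way to copy a folder's contents rather than the folder itself is a trailing "/." on the source path. A sketch using a sandbox that mirrors the layout described in the post:

```shell
#!/bin/sh
# Sandbox mirroring the post's layout; real paths would replace $home.
home=$(mktemp -d)
mkdir -p "$home/public_html/products/Main_Widget"
echo page > "$home/public_html/products/Main_Widget/index.html"
# trailing "/." copies the directory's contents, not the directory
cp -r "$home/public_html/products/Main_Widget/." "$home/public_html/"
```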
I have a list of files named file_username_051208_025233.log, where 051208 is the date and 025233 is the time. We run thousands of files daily. I want to put all the files into a date directory based on the date they were run. For example, if we run files today they should go into 05:Dec:08... (3 Replies)
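A sketch of one way to do this: split the file name on underscores to pull out the date field, then move each file into a directory named after it. This keeps the raw 051208 form (producing the 05:Dec:08 style would need an extra month lookup) and assumes the username part contains no underscore:

```shell
#!/bin/sh
# Demo file; the pattern assumes "file_<user>_<date>_<time>.log"
# with no underscores inside <user>.
work=$(mktemp -d)
touch "$work/file_username_051208_025233.log"
cd "$work" || exit 1
for f in file_*_*_*.log; do
    [ -e "$f" ] || continue
    d=${f#file_*_}   # strip the leading "file_<user>_"
    d=${d%%_*}       # keep only the date field, e.g. 051208
    mkdir -p "$d"
    mv -- "$f" "$d/"
done
```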
Hi,
There are multiple files in a directory with different names. How can they be gzipped such that the timestamp of the files is not changed? (2 Replies)
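GNU gzip already carries the original file's modification time over to the .gz it produces; to make that explicit (and cover other gzip variants), the timestamp can be saved and restored with touch -r. A sketch, demonstrated in a throwaway directory:

```shell
#!/bin/sh
# Sketch: GNU gzip already copies the mtime onto the .gz;
# touch -r makes that explicit and portable to other gzip variants.
gzip_keep_mtime() {
    for f in "$1"/*; do
        [ -f "$f" ] || continue
        case $f in *.gz) continue ;; esac
        stamp=$(mktemp)
        touch -r "$f" "$stamp"       # remember the original mtime
        gzip -- "$f"
        touch -r "$stamp" "$f.gz"    # put it back on the compressed file
        rm -f "$stamp"
    done
}

# demo in a throwaway directory
demo=$(mktemp -d)
echo hello > "$demo/report.txt"
touch -t 202001010000 "$demo/report.txt"   # give it an old timestamp
gzip_keep_mtime "$demo"
```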
I received a tar file of a directory with 50,000 files in it. Is it possible to extract the files in the tar file without first creating the directory?
i.e., doing tar -xvf filename.tar extracts as follows:
x directory/file1.txt
x directory/file2.txt
.
.
.
I would like to avoid... (4 Replies)
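With GNU tar (bsdtar supports it too), --strip-components=1 drops the leading "directory/" from every member name, so the files land directly in the extraction directory. A sketch that rebuilds a small archive shaped like the one described and extracts it flat:

```shell
#!/bin/sh
# Rebuild a small archive shaped like the one in the post.
work=$(mktemp -d)
mkdir -p "$work/directory"
echo one > "$work/directory/file1.txt"
tar -cf "$work/filename.tar" -C "$work" directory
# extract without recreating the top-level "directory/"
out=$(mktemp -d)
tar -xf "$work/filename.tar" -C "$out" --strip-components=1
```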
Hello,
I have a makefile in which I specify the option for creating the object files from the source files.
The option I am using is this:
gcc -c main.c first.c
By default these object files are created in the same directory the makefile is in.
What option... (1 Reply)
I'd like to delete ALL files on a daily basis within a directory that are over a day old. Anyone know how I can automate this through Cron as I have 146 websites to administer.
I've tried...
30 02 * * * /home/myspace/tmp/webalizer -atime + 1\! -type d -exec rm -f {} \;
but all i get is an... (1 Reply)
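The crontab line above is missing the find command itself, and the -atime syntax is off. A sketch of a working form that deletes regular files older than one day (the path is a placeholder), demonstrated in a sandbox:

```shell
#!/bin/sh
# Sandbox demo; in cron the line would be something like:
# 30 02 * * * find /path/to/tmp -type f -mtime +1 -exec rm -f {} \;
work=$(mktemp -d)
touch "$work/new.txt"
touch -t 202001010000 "$work/old.txt"   # pretend this file is old
find "$work" -type f -mtime +1 -exec rm -f {} \;
```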