01-31-2010
Download image every 24hrs + put it in php site
Hi everyone, I am completely new to this, so I might not phrase this question correctly.
I have access to cPanel with a crontab scheduler, but I know nothing about scripting or writing PHP.
I have hosting with FTP access, used mainly for my own email address, some storage, and a ready-made forum :P
I have some knowledge and I learn fast (I think).
Right, what I would like to do is download an image from a website every 24 hours, name it with the date and time, and then list it in a PHP page so I can see, say, 100 or 200 images next to each other.
I've read something about cron, but I think I started from the middle.
I would be grateful for your help, guys. I believe it is an easy task, but not for me :/
Last edited by zYx; 04-05-2010 at 03:06 PM..
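Not a full solution, but a minimal sketch of the download half, assuming wget is available on the host; the image URL and save directory are placeholders to substitute:

```shell
#!/bin/sh
# Sketch: save one remote image under a date-and-time-stamped name.
# IMAGE_URL and SAVE_DIR are placeholders -- substitute your own.
IMAGE_URL="${IMAGE_URL:-http://www.example.com/webcam.jpg}"
SAVE_DIR="${SAVE_DIR:-$HOME/public_html/cam}"

# Build a name such as img-2010-01-31_0300.jpg from the current date/time.
stamp_name() {
    printf 'img-%s.jpg' "$(date '+%Y-%m-%d_%H%M')"
}

mkdir -p "$SAVE_DIR"
# The actual download; -q is quiet, -O names the output file:
# wget -q -O "$SAVE_DIR/$(stamp_name)" "$IMAGE_URL"
echo "would save to: $SAVE_DIR/$(stamp_name)"
```

With the wget line uncommented, a crontab entry such as `0 3 * * * /bin/sh /home/you/fetchcam.sh` would run it daily at 03:00; a PHP page could then list the saved files with something like `glob('cam/img-*.jpg')` in a loop of `<img>` tags.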
9 More Discussions You Might Find Interesting
1. Shell Programming and Scripting
I'm new to scripting. I'm trying to write a script to download files from an ftp site; below are the script and the message I get after running it. No files were downloaded :(
Thanks in advance!
script:
#!/usr/bin/ksh
DAY=`date --date="-1 days" +%y%m%d`
ftp -v -n "ftp.address" <<... (5 Replies)
Discussion started by: tiff-matt
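For what it's worth, the usual shape for an unattended ftp session is a here-document fed to ftp's standard input; a hedged sketch (server, login, and file pattern are placeholders, and with -n the login must be sent explicitly with `user`):

```shell
#!/bin/sh
# Sketch of the heredoc pattern for non-interactive ftp.
# Note: date --date="-1 days" is GNU-specific; plain today's stamp used here.
DAY=$(date '+%y%m%d')

# Emit the command script that will be fed to ftp's standard input.
build_ftp_commands() {
    cat <<EOF
user myname mypassword
binary
prompt
cd /outgoing
mget data_${DAY}*.dat
bye
EOF
}

# Real use:  build_ftp_commands | ftp -v -n ftp.example.com
build_ftp_commands
```

The `prompt` command matters for `mget`: it toggles off interactive per-file confirmation, which would otherwise stall an unattended transfer.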
2. Linux
Hey everyone, my wife has purchased a bundle package of a bunch of images from a site, and now has to download each one of them manually. There are about 500 downloads, and it's quite a hassle to browse to each page and download them all individually.
I would like to write a shell script to... (2 Replies)
Discussion started by: paqman
3. Shell Programming and Scripting
Hi,
I have to connect to an online site and download some data from it, but sometimes the site will be busy
or not working; in such a case I have to retry at least 3 times to download the data.
I tried like this:
use CGI::Carp... (2 Replies)
Discussion started by: vanitham
4. Shell Programming and Scripting
Ok, this is quite weird.
wget -r mysite.com/mylink/
should get all the files recursively from the 'mylink' folder.
The problem is that wget saves an index.html file!
When I open this index.html with my browser I realize that it shows all the files in the current folder (plus an option to move... (3 Replies)
Discussion started by: hakermania
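A workaround often suggested for the listing-page problem above, using standard GNU wget flags (the URL is the poster's placeholder):

```shell
# Placeholder URL from the post above; flags are standard GNU wget:
#   -r                recurse into mylink/
#   -np               never ascend to the parent directory
#   -R "index.html*"  wget must still fetch the listing pages to find the
#                     links in them, but this deletes them once the crawl
#                     is done, so no index.html files are left behind
# wget -r -np -R "index.html*" http://mysite.com/mylink/
```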
5. Shell Programming and Scripting
Hi there, I'm new to shell scripting and need some help if possible.
I need to create a shell script (.sh) to run as a cron job on an Ubuntu Linux server, connect to an external sftp site's directory using credentials (which I have), download to our internal ftp server, and then copy... (3 Replies)
Discussion started by: ghath
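A hedged sketch of the batch-file half of such a cron job (host, user, and paths are placeholders; `sftp -b` cannot answer a password prompt, so key-based login must already be set up):

```shell
#!/bin/sh
# Sketch: build a batch file for an unattended sftp transfer.
BATCH=$(mktemp)

# Remote and local paths below are placeholders -- substitute your own.
cat > "$BATCH" <<EOF
cd /remote/outgoing
lcd /var/ftp/incoming
get *.csv
bye
EOF

# Real use:  sftp -b "$BATCH" user@sftp.example.com
cat "$BATCH"
```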
6. Shell Programming and Scripting
A buddy of mine was telling me last night that you can write a bash script that will download an entire site in gedit?? Is this true??? I think I am going to fall in love with bash :D Any good tutorials?? (15 Replies)
Discussion started by: graphicsman
7. Shell Programming and Scripting
I've an HTML page where a pie chart is generated with Google chart code from the required input values in UNIX.
The HTML page is generated in UNIX, and when it loads in a browser the code is interpreted over the internet and the pie chart is generated. This is done by the code in the... (4 Replies)
Discussion started by: Amutha
8. UNIX for Dummies Questions & Answers
Hello,
is there any way to download a file from a website protected by an image captcha? The download link is not static; it is session-based and generated per visit.
I can also do it via a web browser, but I trust the command line more; maybe I'm wrong (1 Reply)
Discussion started by: postcd
9. Shell Programming and Scripting
I need to go to "vpnbook" vpnbook.com on the web (can't put in the name yet) and open the "OpenVPN" tab.
On that page I need to get the username: vpnbook, and on the next line the password: ????????
I need to put those two on lines one and two of a file, "pwfile".
When I have those, I need to open OpenVPN with the... (1 Reply)
Discussion started by: tytower
LEARN ABOUT OPENDARWIN
queuedefs
queuedefs(4) File Formats queuedefs(4)
NAME
queuedefs - queue description file for at, batch, and cron
SYNOPSIS
/etc/cron.d/queuedefs
DESCRIPTION
The queuedefs file describes the characteristics of the queues managed by cron(1M). Each non-comment line in this file describes one queue.
The format of each line is as follows:
q.[njobj][nicen][nwaitw]
The fields in this line are:
q      The name of the queue. a is the default queue for jobs started by at(1); b is the default queue for jobs started by batch (see at(1)); c is the default queue for jobs run from a crontab(1) file.
njob   The maximum number of jobs that can be run simultaneously in that queue; if more than njob jobs are ready to run, only the first njob jobs will be run, and the others will be run as jobs that are currently running terminate. The default value is 100.
nice   The nice(1) value to give to all jobs in that queue that are not run with a user ID of super-user. The default value is 2.
nwait  The number of seconds to wait before rescheduling a job that was deferred because more than njob jobs were running in that job's queue, or because the system-wide limit of jobs executing has been reached. The default value is 60.
Lines beginning with # are comments, and are ignored.
EXAMPLES
Example 1: A sample file.
#
#
a.4j1n
b.2j2n90w
This file specifies that the a queue, for at jobs, can have up to 4 jobs running simultaneously; those jobs will be run with a nice value of 1. As no nwait value was given, if a job cannot be run because too many other jobs are running, cron will wait 60 seconds before trying again to run it.
The b queue, for batch(1) jobs, can have up to 2 jobs running simultaneously; those jobs will be run with a nice(1) value of 2. If a job cannot be run because too many other jobs are running, cron(1M) will wait 90 seconds before trying again to run it. All other queues can have up to 100 jobs running simultaneously; they will be run with a nice value of 2, and if a job cannot be run because too many other jobs are running, cron will wait 60 seconds before trying again to run it.
FILES
/etc/cron.d/queuedefs queue description file for at, batch, and cron.
SEE ALSO
at(1), crontab(1), nice(1), cron(1M)
SunOS 5.10 1 Mar 1994 queuedefs(4)