07-08-2010
Shell script to automatically download files
I am new to shell scripting and need to write a script that downloads the files linked from a specific URL. I want every linked file copied into a specific directory, keeping the same file name it has on the web page. That is the first part. The second part of the script should delete files in that directory that are older than 3 days. Normally I run "wget filename" from the specified directory and then remove the old files manually. How would I set up the script to run automatically, twice daily? Thanks.
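A minimal sketch of how this could work, assuming GNU wget, find, and cron are available; the URL and target directory below are placeholders, not values from the post:

```shell
#!/bin/sh
# fetchfiles.sh -- download everything linked from a page, then prune old files.

fetch_links() {   # $1 = page URL, $2 = target directory
    mkdir -p "$2"
    # -r -l1 : follow the links on the page, one level deep
    # -nd    : no directory tree -- keep the file names as posted
    # -N     : only re-download files that changed on the server
    wget -q -r -l1 -nd -N -P "$2" "$1"
}

prune_old() {     # $1 = directory; delete regular files older than 3 days
    find "$1" -type f -mtime +3 -exec rm -f {} \;
}

fetch_links "http://example.com/downloads/" "$HOME/downloads"
prune_old "$HOME/downloads"
```

To run it twice daily, add a crontab entry (`crontab -e`), for example `0 6,18 * * * /path/to/fetchfiles.sh` for 06:00 and 18:00.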
8 More Discussions You Might Find Interesting
1. Shell Programming and Scripting
Good day,
I'm new to the Linux environment... Are there any scripts available for checking ports (let's say ports 80 and 21) from the shell with a single command line?
Any response is very much appreciated.
thanks (4 Replies)
Discussion started by: arsonist
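For reference, one way to do this in a single loop, assuming the nc (netcat) utility is installed; the host and port list are examples:

```shell
#!/bin/sh
# Probe a list of ports on a host; nc -z only scans (no data sent),
# and -w 2 gives a 2-second connect timeout.
host=127.0.0.1
for p in 80 21; do
    if nc -z -w 2 "$host" "$p" 2>/dev/null; then
        echo "port $p open"
    else
        echo "port $p closed"
    fi
done
```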
2. Shell Programming and Scripting
Hi,
I am trying to run a bash script automatically, but I don't know how to do this. Can anybody tell me how I can auto-run a shell script when I log on?
thanks (9 Replies)
Discussion started by: tahir23
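For reference: the usual approach is to call the script from the shell's login file. A sketch assuming bash and a script at ~/bin/myscript.sh (a placeholder path):

```shell
# Added to ~/.bash_profile (or ~/.profile for other login shells):
# run the script at every login, if it exists and is executable
if [ -x "$HOME/bin/myscript.sh" ]; then
    "$HOME/bin/myscript.sh"
fi
```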
3. Linux
Hey everyone, my wife has purchased a bundle package of a bunch of images from a site, and now has to download each one of them manually. There are about 500 downloads, and it's quite a hassle to browse to each page and download them all individually.
I would like to write a shell script to... (2 Replies)
Discussion started by: paqman
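For reference, when the downloads follow a predictable numbered URL pattern, a simple loop over wget covers it; the base URL and naming scheme here are invented placeholders:

```shell
#!/bin/sh
# Print (and optionally fetch) the URLs image1.jpg .. image500.jpg.
base="http://example.com/images"
for i in $(seq 1 500); do
    url="$base/image$i.jpg"
    echo "$url"
    # wget -q -nc "$url"   # uncomment to download; -nc skips files already fetched
done
```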
4. Shell Programming and Scripting
I have a script which has 2 options:
a
b
Option a has 6 sub-options.
I want to write a script which will call the parent script and supply the options automatically.
Example:
linasplg11:/opt/ss/kk/01.00/bin # startup.sh
/opt/ss/rdm/01.00
Please select the component to... (2 Replies)
Discussion started by: Aditya.Gurgaon
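For reference, if the parent script reads its menu answers from standard input, the answers can simply be piped in. The stand-in script below and the answers "a" and "3" are invented for illustration:

```shell
#!/bin/sh
# Create a stand-in for the interactive startup.sh so the technique can be shown.
cat > /tmp/ask.sh <<'EOS'
#!/bin/sh
printf 'Please select the component: '
read opt
printf 'Please select the sub option: '
read sub
echo "starting $opt/$sub"
EOS
chmod +x /tmp/ask.sh

# Feed the answers in the order the prompts appear:
printf 'a\n3\n' | /tmp/ask.sh
```

If the script reads from /dev/tty rather than stdin, piping will not work and a tool such as expect is the usual fallback.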
5. Shell Programming and Scripting
Hello all
I am working on creating a shell script to download files daily.
Example: file12_10_2009.txt.gz
The next day the file will be file13_10_2009.txt.gz,
and so on.
I need help working out how to download these incrementally dated files every day.
regards (1 Reply)
Discussion started by: mogabr
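For reference, the only moving part is the date in the name, which `date` can generate; the download URL is a placeholder:

```shell
#!/bin/sh
# Build today's file name in the fileDD_MM_YYYY.txt.gz form from the example.
fname="file$(date +%d_%m_%Y).txt.gz"
echo "today's file: $fname"
# wget -q "http://example.com/path/$fname"   # placeholder URL; run daily from cron
```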
6. Shell Programming and Scripting
Hi,
please help me out here.
I want to use the curl command in a shell script to test web pages.
What I have is an opening page; when I click a button on it, the next page comes up. Then I have to upload a file, click another button to submit, and then comes the output page,... (2 Replies)
Discussion started by: Olivia
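For reference, a multi-step page flow like that usually comes down to keeping cookies between curl calls and POSTing the form fields; every URL and field name below is an invented placeholder:

```shell
#!/bin/sh
# Sketch of a two-step curl session with a file upload.
jar=$(mktemp)                       # cookie jar to carry the session

# Step 1: open the first page, saving any session cookies (-c writes them)
curl -s -c "$jar" -o /dev/null "http://example.com/start"

# Step 2: the button is normally a form POST; -F uploads the file as
# multipart/form-data, and -b sends the saved cookies back
curl -s -b "$jar" -c "$jar" \
     -F "upload=@report.txt" \
     -F "action=Submit" \
     -o result.html "http://example.com/upload"
```

The real field names can be read from the form's HTML (the name= attributes) or captured with the browser's developer tools.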
7. Shell Programming and Scripting
Hello friends,
I have hundreds of files in hand and need to extract some data from the logs and read it into an input file.
Here I will explain in detail using the two attached files: read some data from the .log file and write it into the .in file.
**explanation is given inside two... (9 Replies)
Discussion started by: liuzhencc
8. UNIX for Advanced & Expert Users
How can I automatically download bulk compressed archive files (zip, 7z, bzip, xz, etc.) from a repository using wget? (3 Replies)
Discussion started by: abdulbadii
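For reference, wget's recursive mode with an accept list restricted to the archive extensions does this; the repository URL is a placeholder:

```shell
#!/bin/sh
# Pull only the archive files linked from a repository index page.
#   -r -l1 : follow links one level deep
#   -nd    : keep the posted file names, no directory tree
#   -A     : accept only these extensions; everything else is discarded
wget -q -r -l1 -nd -N \
     -A 'zip,7z,bz2,xz,gz' \
     -P ./archives \
     "http://example.com/repo/"
```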
LEARN ABOUT BSD
crosspost
CROSSPOST(8) System Manager's Manual CROSSPOST(8)
NAME
crosspost - create the links for cross posted articles
SYNOPSIS
crosspost [ -D dir ] [ -s ] [ file... ]
DESCRIPTION
Crosspost reads group and article number data from files or standard input if none are specified. (A single dash in the file list means to
read standard input.) It uses this information to create the hard, or symbolic, links for cross posted articles. Crosspost is designed to
be used by InterNetNews to create the links as the articles come in. Normally innd creates the links but by having crosspost create the
links innd spends less time waiting for disk IO. In this mode one would start innd(8) using the ``-L'' flag.
Crosspost expects input in the form:
group.name/123 group2.name/456 group3.name/789
with one line per article. Any dots in the input are translated into "/" to translate the news group into a pathname. The first field is
assumed to be the name of an existing copy of the article. Crosspost will attempt to link all the subsequent entries to the first using
hard links if possible or symbolic links if that fails.
By default, crosspost processes its input as an INN channel feed written as a ``WR'' entry in the newsfeeds(5) file, for example:
crosspost:*:Tc,Ap,WR:/usr/lib/news/bin/crosspost
To process the history file and re-create all the links for all articles use:
awk <history -F' ' '(NF > 2){print $3}' | crosspost
(where the -F is followed by a tab character.)
The ``-D'' flag can be used to specify where the article spool is stored. The default directory is /var/spool/news.
By default crosspost will fsync(2) each article after updating the links. The ``-s'' flag can be used to prevent this.
HISTORY
Written by Jerry Aguirre <jerry@ATC.Olivetti.Com>.
SEE ALSO
newsfeeds(5), innd(8).
CROSSPOST(8)