Thanks for explaining further; it's quite clear what you are aiming for. Even your original explanation was pretty clear. Try the following:
What I don't like is that it starts 20 background processes. When you eventually want to kill them, it might be a little problematic. But I don't see any way around having a bunch of background processes, and it seems the original script was designed to have multiple background processes, so you are OK with that.
Hi
I need a shell script that will download a text file every second from an HTTP server using wget.
Can anyone provide me any pointers or sample scripts that will help me go about this task?
regards
techie (1 Reply)
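One way to approach this is a loop that calls wget and sleeps one second between fetches. This is only a sketch; the URL and output filename below are placeholders, not real endpoints.

```shell
#!/bin/sh
# Sketch: fetch a file once per second with wget.
# FILE_URL and OUTFILE are placeholders -- substitute your own.
FILE_URL="http://yourserver/file.txt"
OUTFILE="file.txt"

fetch_loop() {
    # $1 = number of iterations; switch to "while true" (and kill the
    # script to stop it) if you want it to run forever.
    i=0
    while [ "$i" -lt "$1" ]; do
        wget -q -O "$OUTFILE" "$FILE_URL"   # -q: quiet, -O: output file
        i=$((i + 1))
        sleep 1
    done
}
```

Running `fetch_loop 10 &` would do ten fetches in the background; a plain `kill` on the script's PID stops it.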
Dear all,
I am calling a Korn shell script (a CGI script) from a web page. This shell script does some checking in a UNIX file and returns true or false. Now, within the same script, if it returns true, I want to redirect to another web page stored in the htdocs directory.
Example: Login page sends a... (3 Replies)
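A CGI script redirects the browser by printing a `Location:` header before any other output. The sketch below assumes a flag file and target URLs that are not in the post; adjust both for your site (the syntax is ksh-compatible).

```shell
#!/bin/sh
# Sketch: CGI-style redirect from a shell script.
# FLAGFILE and the two URLs are assumptions -- replace with your own.
FLAGFILE="/path/to/flagfile"

check_user() {
    # true (exit 0) if the name in $1 appears on its own line in the flag file
    grep -q "^$1\$" "$FLAGFILE"
}

redirect_to() {
    # Emitting a Location header (followed by a blank line) makes the
    # web server send an HTTP redirect.
    printf 'Location: %s\r\n\r\n' "$1"
}

handle_login() {
    if check_user "$1"; then
        redirect_to "http://yourserver/htdocs/welcome.html"
    else
        redirect_to "http://yourserver/htdocs/login_failed.html"
    fi
}
```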
Hi All,
Sorry to ask this question, and I am not sure whether it is possible; please reply to my question. Thanks in advance.
I need a Perl script (or any Linux-compatible script) to copy the graphical contents of a web page to WordPad.
Say for example, i have a documentation site... (10 Replies)
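One possible starting point: wget can fetch a page together with its images so the result can be opened and copied from locally. This is a sketch under that assumption; the URL and destination are placeholders.

```shell
# Sketch: grab a page plus its images with wget.
download_page() {
    # $1 = page URL, $2 = destination directory
    # -p: also fetch page requisites (images, CSS)
    # -k: rewrite links so the saved copy works locally
    wget -p -k -P "$2" "$1"
}

# download_page http://example.com/doc.html ./saved
```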
Hi,
Say there is a web page that contains just text only - that is, even the source code is just the text itself, nothing more. An example would be "http://mynasadata.larc.nasa.gov/docs/ocean_percent.txt"
Is there a UNIX command that would allow me to download this text and store it in a... (1 Reply)
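Either wget or curl can store a plain-text page in a local file; a minimal sketch, wrapped as a function:

```shell
# Sketch: save a plain-text page to a local file.
fetch_text() {
    # $1 = URL, $2 = local filename
    wget -q -O "$2" "$1"
    # equivalently: curl -s -o "$2" "$1"
}

# fetch_text "http://mynasadata.larc.nasa.gov/docs/ocean_percent.txt" ocean_percent.txt
```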
First, let me state that I am completely out of my realm with this.
I have a server running HP-UX. I'm not even sure this can be considered a UNIX question, and for that let me apologize in advance.
I need to create a web page where a client can input 2 variables (i.e. date and phone number)... (0 Replies)
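On the server side, a CGI script receives GET form fields through the QUERY_STRING environment variable. The field names (`date`, `phone`) come from the post; everything else in this sketch is an assumption.

```shell
# Sketch: extract one field's value from a CGI query string.
parse_query() {
    # $1 = query string such as "date=2024-01-01&phone=5551234"
    # $2 = field name; prints that field's value
    echo "$1" | tr '&' '\n' | sed -n "s/^$2=//p"
}

# In a real CGI script: date=$(parse_query "$QUERY_STRING" date)
```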
Hello,
Does anyone know of a way to list all files related to a single web page and then download, say, 4 files at a time simultaneously until the entire web page has been downloaded successfully? I'm essentially trying to mimic a browser.
Thanks. (2 Replies)
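Once you have a list of the page's asset URLs (e.g. from parsing the source, or from `wget -p`, which fetches a page's requisites itself), `xargs -P` can run several fetches at once. A sketch, with the fetch command as a parameter so it can be swapped out:

```shell
# Sketch: fetch URLs N at a time with xargs -P (a GNU/BSD extension).
download_parallel() {
    # $1 = file with one URL per line
    # $2 = max simultaneous fetches
    # $3 = fetch command (defaults to "wget -q")
    xargs -n 1 -P "$2" ${3:-wget -q} < "$1"
}

# download_parallel urls.txt 4
```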
Hi,
I need to download a zip file from the US government link below.
https://www.sam.gov/SAMPortal/extractfiledownload?role=WW&version=SAM&filename=SAM_PUBLIC_MONTHLY_20160207.ZIP
I only have the wget utility installed on the server.
When I use the command below, I get a 403 error... (2 Replies)
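A 403 from wget often means the server rejects wget's default User-Agent; sending a browser-like one, and quoting the URL (it contains `&`, which the shell would otherwise interpret), is worth trying. Whether sam.gov accepts this is not guaranteed; this is only a sketch.

```shell
# Sketch: retry a download with a browser-like User-Agent.
fetch_with_agent() {
    # $1 = URL (quote it if it contains & or ?), $2 = output file
    wget --user-agent="Mozilla/5.0" -O "$2" "$1"
}

# fetch_with_agent 'https://www.sam.gov/SAMPortal/extractfiledownload?role=WW&version=SAM&filename=SAM_PUBLIC_MONTHLY_20160207.ZIP' sam.zip
```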
hello,
I am trying to refresh my web page, which is created by a bash script.
I have an HTML page that, when a button is pressed, calls a bash script. This bash script creates the same page with dynamic data.
When the button is pressed, I call a function that sets a timeout of 7 seconds, and after... (1 Reply)
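The bash CGI can also make the browser re-request the page itself by emitting a meta-refresh tag; the 7-second interval below is taken from the post, the rest is a sketch.

```shell
# Sketch: CGI output that tells the browser to reload every N seconds.
render_page() {
    # $1 = refresh interval in seconds, $2 = dynamic body content
    printf 'Content-Type: text/html\r\n\r\n'
    printf '<html><head><meta http-equiv="refresh" content="%s"></head>\n' "$1"
    printf '<body>%s</body></html>\n' "$2"
}

# render_page 7 "$(date)"
```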
Hello,
I'm new to the forum and really a beginner; also, sorry for my bad English.
I use Linux and want to create a little program to automatically download some PDFs (invoices) and put them in a folder on my computer. I learned how to program with Ubuntu, but the program will be implemented... (1 Reply)
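If the invoices are linked from one page, wget can recurse one level and keep only the PDFs. A sketch; the URL and destination folder are placeholders for your invoice site.

```shell
# Sketch: grab every PDF linked from a page into a local folder.
download_pdfs() {
    # $1 = page URL, $2 = destination folder
    # -r -l 1 : follow links, one level deep
    # -A '*.pdf' : accept (keep) only files matching *.pdf
    # -nd : do not recreate the site's directory tree
    # -P : directory to save into
    wget -r -l 1 -A '*.pdf' -nd -P "$2" "$1"
}

# download_pdfs http://example.com/invoices/ "$HOME/invoices"
```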
LEARN ABOUT DEBIAN
mirrortool
MIRRORTOOL(1)                 OMT documentation                  MIRRORTOOL(1)

NAME
mirrortool.pl - OpaL Mirror Tool (OMT)
DESCRIPTION
Creates a mirror of a webpage. It has a number of features such as link rewriting and more. (See the options below).
USAGE
mirrortool.pl [options] [url] [options] [url] [...]
OPTIONS
--images : Include <img src=xxx>:s in the download. (default)
--noimages : Do not include <img src=xxx>:s in the download.
--depth n : Maximum recursion depth. (default 1)
--store "regexp" : Files matching regexp are actually stored locally.
: It is possible to | separate (with or).
--rewrite "from=>to" : URLs are rewritten using these rules.
: It is possible to | separate (with or).
: Do not rewrite the dir, because it will affect
: later lookups. Have to fix this sometime.
--what "regexp" : Files matching regexp are downloaded and traversed.
: It is possible to | separate (with or).
--dir basedir : Where to store local files.
--nohostcheck : Do not check if url points to other host.
--notreecheck : Do not check if url points to other dirtree.
--force : Overwrite all files.
--debug : Print debug-messages.
--retry n : Number of times a URL will be retried. (default 1)
--auth user:pass : Use Basic Authentication.
--proxy url : Use a proxy server (like http://u:p@localhost/).
--help : Print this text.
AUTHOR
Ola Lundqvist <opal@lysator.liu.se>
SEE ALSO
mirrortool.pl(1)

perl v5.8.8                       2002-04-15                     MIRRORTOOL(1)