Random web page download wget script
Post 302796773 by hanson44 on Saturday 20th of April 2013 07:20:44 PM
Thanks for explaining more. It's quite clear what you are aiming for. Even your original explanation was pretty clear. Try the following:
Code:
#!/bin/bash
url_list=( http://www.bbc.co.uk http://www.cnn.com http://www.msn.com )

# Number of users to mimic simultaneously
users=20

one_user () {
  local user=$1
  while true; do
    # Wait a random 30-149 seconds between page fetches
    local wait=$(( RANDOM % 120 + 30 ))
    # Pick a random index into url_list
    local n=$(( RANDOM % ${#url_list[@]} ))
    local url=${url_list[$n]}
    wget -p "$url"
    # echo "user=$user wait=$wait url=$url"
    sleep "$wait"
  done
}

for (( user = 1; user <= $users; user++ )); do
  one_user $user &
done

What I don't like is that it starts 20 background processes. When you eventually want to kill them, that could be a little problematic. But I don't see any way around having a bunch of background processes, and the original script seems to have been designed that way, so you are presumably OK with that. One way to make the eventual cleanup easier is sketched below.
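A minimal sketch of one way to handle the cleanup: record each worker's PID as it is launched, then kill the whole set from a signal trap. The pids array, the trap, and the final wait are my additions, not part of the script above.
Code:
pids=()
for (( user = 1; user <= users; user++ )); do
  one_user "$user" &
  pids+=( "$!" )   # $! is the PID of the job just backgrounded
done

# On Ctrl-C or kill, terminate every recorded worker, then exit.
trap 'kill "${pids[@]}" 2>/dev/null; exit' INT TERM

wait   # block until the workers exit (or a signal arrives)

Note that killing a worker subshell does not necessarily stop a wget already in flight; if that matters, kill -- -$$ (terminating the script's whole process group, assuming the script was started normally from an interactive shell and so leads its own group) is a blunter alternative.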
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Running shell script through web page

...passing variables via a list... here's the HTML code extract: <form method=post action=http://servername/cgi-bin/cgi-comptage_diff.ksh> <table border...> ... </table> <table bgcolor=#FFFFFF width="980"> ... (6 Replies)
Discussion started by: Nicol

2. Shell Programming and Scripting

Script to download file using wget

Hi, I need a shell script that will download a text file every second from an HTTP server using wget. Can anyone provide pointers or sample scripts that will help me go about this task? Regards, techie (1 Reply)
Discussion started by: techie82
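A minimal sketch of that polling loop; the URL and output filename are placeholders, not from the post:
Code:
#!/bin/bash
# Fetch the file once per second, overwriting the local copy each time.
while true; do
  wget -q -O data.txt "http://example.com/file.txt"
  sleep 1
done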

3. Shell Programming and Scripting

How to redirect to a web page from a shell script

Dear all, I am calling a Korn shell script (CGI script) from a web page. This shell script does some checking in a Unix file and returns true or false. Now within the same script, if it returns true, I want to redirect to another web page stored in the htdocs directory. Example: Login page sends a... (3 Replies)
Discussion started by: ravi18s
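In standard CGI, a redirect is just a Location header followed by a blank line; a minimal sketch (the target URL is a placeholder):
Code:
#!/bin/ksh
# The web server turns the Location header into an HTTP redirect.
echo "Location: http://servername/htdocs/success.html"
echo ""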

4. Shell Programming and Scripting

Perl script to copy contents of a web page

Hi all, sorry to ask this question, and I am not sure whether it is possible; please reply. Thanks in advance. I need a Perl script (or any Linux-compatible script) to copy the graphical contents of a webpage to a word pad. Say for example, I have a documentation site... (10 Replies)
Discussion started by: anand.linux1984

5. UNIX for Dummies Questions & Answers

Possible to download web page's text to a file?

Hi, say there is a web page that contains just text only; that is, even the source code is just the text itself, nothing more. An example would be "http://mynasadata.larc.nasa.gov/docs/ocean_percent.txt". Is there a UNIX command that would allow me to download this text and store it in a... (1 Reply)
Discussion started by: Breanne
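Either wget or curl handles this directly; a quick sketch using the URL from that post (the local filename is my choice):
Code:
# Save the page body straight to a local file.
wget -q -O ocean_percent.txt "http://mynasadata.larc.nasa.gov/docs/ocean_percent.txt"

# Equivalent with curl:
curl -s -o ocean_percent.txt "http://mynasadata.larc.nasa.gov/docs/ocean_percent.txt"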

6. HP-UX

Help running a unix script from a web page

First, let me state that I am completely out of my realm with this. I have a server running HPUX. I'm not even sure if this can be considered a UNIX question and for that let me apologize in advance. I need to create a web page where a client can input 2 variables (i.e. date and phone number).... (0 Replies)
Discussion started by: grinds

7. UNIX for Dummies Questions & Answers

List and download web page files

Hello, does anyone know of a way to list all files related to a single web page and then to download, say, 4 files at a time simultaneously until the entire web page has been downloaded successfully? I'm essentially trying to mimic a browser. Thanks. (2 Replies)
Discussion started by: shadyuk
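One common approach, sketched under the assumption that the page's asset URLs are already listed in a file (here urls.txt) and that GNU xargs is available:
Code:
# Download up to 4 URLs in parallel; -P is a GNU xargs extension.
xargs -n 1 -P 4 wget -q < urls.txt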

8. Shell Programming and Scripting

Wget: working in browser but cannot download with wget

Hi, I need to download a zip file from the below US govt link. https://www.sam.gov/SAMPortal/extractfiledownload?role=WW&version=SAM&filename=SAM_PUBLIC_MONTHLY_20160207.ZIP I only have the wget utility installed on the server. When I use the below command, I am getting error 403... (2 Replies)
Discussion started by: Prasannag87
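A 403 from a site that works in a browser is often (not always) the server rejecting wget's default User-Agent; a sketch of the usual first thing to try, with no guarantee it helps on this particular site:
Code:
# Present a browser-like User-Agent; quote the URL because of the &s.
wget --user-agent="Mozilla/5.0" \
  "https://www.sam.gov/SAMPortal/extractfiledownload?role=WW&version=SAM&filename=SAM_PUBLIC_MONTHLY_20160207.ZIP"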

9. Shell Programming and Scripting

Refresh web page in bash script

Hello, I am trying to refresh my web page, which is created by a bash script. I have an HTML page which, when a button is pressed, calls a bash script. This bash script creates the same page with dynamic data. When pressing the button, I am calling a function that sets a timeout of 7 seconds and after... (1 Reply)
Discussion started by: SH78
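If the page itself is generated by the script, one simple option is to emit a meta refresh tag so the browser reloads on its own. A sketch; the 7-second interval comes from the post, everything else is assumed:
Code:
#!/bin/bash
# CGI-style output: header, blank line, then HTML with a meta refresh.
echo "Content-type: text/html"
echo
echo '<html><head><meta http-equiv="refresh" content="7"></head>'
echo "<body>Generated at $(date)</body></html>"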

10. UNIX for Beginners Questions & Answers

How to use cURL to download a web page with authentication (form)?

Hello, I'm new to the forum and really a beginner, and sorry for my bad English. I use Linux and want to create a little program to download some PDFs (invoices) automatically and put them in a folder on my computer. I am learning how to program with Ubuntu, but the program will be implemented... (1 Reply)
Discussion started by: MarcelOrMittal
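The usual curl pattern for a form login is to POST the credentials once, save the session cookie, and reuse it for the download; a sketch where the URLs, field names, and credentials are all placeholders:
Code:
# Log in and store the session cookie in cookies.txt.
curl -s -c cookies.txt -d "username=USER" -d "password=PASS" \
  "https://example.com/login"

# Reuse the cookie to fetch an invoice into a local folder.
mkdir -p invoices
curl -s -b cookies.txt -o invoices/invoice-123.pdf \
  "https://example.com/invoices/123.pdf"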
MIRRORTOOL(1)                      OMT documentation                      MIRRORTOOL(1)

NAME
       mirrortool.pl - OpaL Mirror Tool (OMT)

DESCRIPTION
       Creates a mirror of a webpage. It has a number of features such as link
       rewriting and more. (See the options below.)

USAGE
       mirrortool.pl [options] [url] [options] [url] [...]

OPTIONS
       --images              : Include <img src=xxx>:s in the download. (default)
       --noimages            : Do not include <img src=xxx>:s in the download.
       --depth n             : Maximum recursion depth. (default 1)
       --store "regexp"      : Files matching regexp are actually stored locally.
                             : It is possible to | separate (with or).
       --rewrite "from=>to"  : URLs are rewritten using these rules.
                             : It is possible to | separate (with or).
                             : Do not rewrite the dir, because it will affect
                             : later lookup. Have to fix this sometime.
       --what "regexp"       : Files matching regexp are downloaded and traversed.
                             : It is possible to | separate (with or).
       --dir basedir         : Where to store local files.
       --nohostcheck         : Do not check if the url points to another host.
       --notreecheck         : Do not check if the url points to another dirtree.
       --force               : Overwrite all files.
       --debug               : Print debug messages.
       --retry n             : Number of times a url will be retried. (default 1)
       --auth user:pass      : Use Basic Authentication.
       --proxy url           : Use a proxy server (like http://u:p@localhost/).
       --help                : Print this text.

AUTHOR
       Ola Lundqvist <opal@lysator.liu.se>

SEE ALSO
       mirrortool.pl(1)

perl v5.8.8                           2002-04-15                          MIRRORTOOL(1)
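An example invocation assembled from the options above (the URL and target directory are placeholders, not from the manual):
Code:
# Mirror a page two levels deep, images included, into ./mirror,
# retrying each URL twice on failure.
mirrortool.pl --images --depth 2 --dir ./mirror --retry 2 http://example.com/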