Random web page download wget script
Post 302796773 by hanson44, 04-20-2013
Thanks for explaining more. It's quite clear what you are aiming for. Even your original explanation was pretty clear. Try the following:
Code:
#!/bin/bash
url_list=( http://www.bbc.co.uk http://www.cnn.com http://www.msn.com )

# Number of users to mimic simultaneously
users=20

function one_user () {
  local user=$1
  while true; do
    local wait=$(( RANDOM % 120 + 30 ))       # 30 to 149 seconds
    local n=$(( RANDOM % ${#url_list[@]} ))   # random index into url_list
    local url=${url_list[$n]}
    wget -p "$url"
    # echo user = $user wait = $wait url = $url
    sleep $wait
  done
}

for (( user = 1; user <= $users; user++ )); do
  one_user $user &
done

What I don't like is that it starts 20 background processes. When you eventually want to kill them, that could be a little problematic. But I don't see any way around having a bunch of background processes, and the original script seems to have been designed that way, so presumably you are OK with it. If the cleanup ever becomes a concern, one approach is sketched below.
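The sketch: record each worker's PID and kill them all from a signal trap. It assumes the one_user function and url_list above; a wget already in flight when the signal arrives may still finish on its own.

Code:
#!/bin/bash
# Cleanup sketch: assumes one_user and url_list are defined as above.
users=20
pids=()

cleanup () {
  echo "stopping ${#pids[@]} workers"
  kill "${pids[@]}" 2>/dev/null
  exit 0
}
trap cleanup INT TERM      # Ctrl-C or kill stops every worker

for (( user = 1; user <= users; user++ )); do
  one_user "$user" &
  pids+=( "$!" )           # remember each worker's PID
done

wait                       # parent idles until signalled

Another option is simply kill 0 from the parent, which signals the whole process group, parent included.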
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

running shell script thru WEB page ....

....passing variables via a list... here's the HTML code extract: <form method=post action=http://servername/cgi-bin/cgi-comptage_diff.ksh> <table border...........> .............. </table> <table bgcolor=#FFFFFF width="980"> ... (6 Replies)
Discussion started by: Nicol
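For a setup like that, the receiving side could look like this minimal sketch. Only the cgi-comptage_diff.ksh action path comes from the extract above; the field names date and phone are assumptions.

Code:
#!/bin/ksh
# Sketch of a CGI script receiving the form's POST data.
# Field names "date" and "phone" are assumptions.

echo "Content-type: text/html"
echo ""

IFS= read -r body                  # URL-encoded body, e.g. date=x&phone=y
date_val=${body#*date=};   date_val=${date_val%%&*}
phone_val=${body#*phone=}; phone_val=${phone_val%%&*}

echo "<html><body>"
echo "<p>Received: date=$date_val, phone=$phone_val</p>"
echo "</body></html>"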

2. Shell Programming and Scripting

Script to download file using wget

Hi, I need a shell script that will download a text file every second from an HTTP server using wget. Can anyone provide any pointers or sample scripts that will help me go about this task? Regards, techie (1 Reply)
Discussion started by: techie82
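A minimal sketch for that request; the URL and output naming are placeholders, and polling a server once per second should only be done against something you control.

Code:
#!/bin/bash
# Fetch the same text file once per second. URL is a placeholder.
url="http://example.com/status.txt"

while true; do
  wget -q -O "status.$(date +%s).txt" "$url"   # timestamped copy
  sleep 1
done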

3. Shell Programming and Scripting

how to redirect to a web-page by shell script

Dear all, I am calling a Korn shell script (CGI script) from a web page. This shell script does some checking in a Unix file and returns true or false. Now, within the same script, if it returns true, I want to redirect to another web page stored in the htdocs directory. Example: Login page sends a... (3 Replies)
Discussion started by: ravi18s
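The standard CGI way to do that is to emit a Location header instead of a body. A sketch, where check_file and both page names are assumptions:

Code:
#!/bin/ksh
# CGI redirect sketch: send a Location header and no body.
# check_file and both page names are assumptions.

if check_file /path/to/unixfile; then
  echo "Location: /welcome.html"       # page under htdocs
else
  echo "Location: /login_failed.html"
fi
echo ""                                # blank line ends the CGI headers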

4. Shell Programming and Scripting

Perl script to copy contents of a web page

Hi all, sorry to ask this question; I am not sure whether it is possible. Please reply to my question. Thanks in advance. I need a Perl script (or any Linux-compatible script) to copy the graphical contents of a web page to WordPad. Say, for example, I have a documentation site... (10 Replies)
Discussion started by: anand.linux1984
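That one asks for Perl, but in keeping with the shell examples elsewhere in this thread, a sketch of the grabbing part (pulling a page with its images for local viewing) could be as short as this; the URL is a placeholder, and getting the result into a word processor is a separate manual step.

Code:
# Mirror one page with its images, rewriting links for local viewing.
# -p fetches page requisites (images, CSS); -k converts links.
wget -p -k "http://example.com/docs/page.html"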

5. UNIX for Dummies Questions & Answers

Possible to download web page's text to a file?

Hi, Say there is a web page that contains only text - that is, even the source code is just the text itself, nothing more. An example would be "http://mynasadata.larc.nasa.gov/docs/ocean_percent.txt" Is there a UNIX command that would allow me to download this text and store it in a... (1 Reply)
Discussion started by: Breanne
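For a plain-text URL like the one quoted, either standard tool writes it straight to a file; the output filename is the only assumption here.

Code:
# Save the page's text to a local file (filename is an assumption).
wget -q -O ocean_percent.txt "http://mynasadata.larc.nasa.gov/docs/ocean_percent.txt"

# Equivalent with curl:
curl -s -o ocean_percent.txt "http://mynasadata.larc.nasa.gov/docs/ocean_percent.txt"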

6. HP-UX

Help running a unix script from a web page

First, let me state that I am completely out of my realm with this. I have a server running HP-UX. I'm not even sure if this can be considered a UNIX question, and for that let me apologize in advance. I need to create a web page where a client can input 2 variables (i.e. date and phone number).... (0 Replies)
Discussion started by: grinds
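The input side of such a page can itself be emitted by a small CGI script. A sketch, where the action path and field names are assumptions (only the date and phone number fields come from the question):

Code:
#!/bin/bash
# CGI that prints the input form. Action path and field names assumed.
echo "Content-type: text/html"
echo ""
cat <<'EOF'
<html><body>
<form method="post" action="/cgi-bin/lookup.sh">
  Date:         <input type="text" name="date">
  Phone number: <input type="text" name="phone">
  <input type="submit" value="Submit">
</form>
</body></html>
EOF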

7. UNIX for Dummies Questions & Answers

List and download web page files

Hello, Does anyone know of a way to list all files related to a single web page and then to download, say, 4 files at a time simultaneously until the entire web page has been downloaded successfully? I'm essentially trying to mimic a browser. Thanks. (2 Replies)
Discussion started by: shadyuk
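wget -p already grabs a page with its requisites, but it fetches them one at a time. To get four parallel downloads, one sketch is to extract the asset URLs first and hand them to GNU xargs with -P; the URL and the crude grep pattern are assumptions.

Code:
#!/bin/bash
# List a page's assets, then fetch 4 at a time. URL is a placeholder.
url="http://example.com/"

wget -q -O page.html "$url"
grep -Eo '(src|href)="http[^"]+"' page.html |   # crude asset extraction
  cut -d'"' -f2 | sort -u > assets.txt

xargs -n 1 -P 4 wget -q < assets.txt            # 4 parallel downloads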

8. Shell Programming and Scripting

Wget - working in browser but cannot download from wget

Hi, I need to download a zip file from the US government link below. https://www.sam.gov/SAMPortal/extractfiledownload?role=WW&version=SAM&filename=SAM_PUBLIC_MONTHLY_20160207.ZIP I only have the wget utility installed on the server. When I use the command below, I am getting error 403... (2 Replies)
Discussion started by: Prasannag87
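When a URL works in a browser but wget gets a 403, the server is often rejecting wget's default User-Agent; a first thing to try (the UA string here is just an example):

Code:
# Present a browser-like User-Agent string (this one is an example).
wget --user-agent="Mozilla/5.0 (X11; Linux x86_64)" \
  "https://www.sam.gov/SAMPortal/extractfiledownload?role=WW&version=SAM&filename=SAM_PUBLIC_MONTHLY_20160207.ZIP"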

9. Shell Programming and Scripting

Refresh web page in bash script

Hello, I am trying to refresh my web page, which is created by a bash script. I have an HTML page which, when a button is pressed, calls a bash script. This bash script creates the same page with dynamic data. When pressing the button, I am calling a function that sets a timeout of 7 seconds, and after... (1 Reply)
Discussion started by: SH78
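If a bash CGI generates the page, one sketch is to have the page reload itself with a meta-refresh tag. The 7-second interval comes from the question; everything else is assumed.

Code:
#!/bin/bash
# Bash CGI whose output reloads itself every 7 seconds.
echo "Content-type: text/html"
echo ""
cat <<EOF
<html>
<head><meta http-equiv="refresh" content="7"></head>
<body><p>Generated at: $(date)</p></body>
</html>
EOF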

10. UNIX for Beginners Questions & Answers

How to use cURL to download web page with authentification (form)?

Hello, I'm new to the forum and a real beginner, and sorry for my bad English. I use Linux and want to create a little program to automatically download some PDFs (invoices) and put them in a folder on my computer. I am learning how to program with Ubuntu, but the program will be implemented... (1 Reply)
Discussion started by: MarcelOrMittal
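The usual curl pattern for a form login is to POST the credentials once, keep the session cookie in a jar, and reuse it for the download. Every URL, field name, and filename in this sketch is an assumption.

Code:
#!/bin/bash
# 1. Log in: POST the form fields, save the session cookie.
curl -s -c cookies.txt \
     -d "username=me" -d "password=secret" \
     "https://example.com/login"

# 2. Reuse the cookie to fetch a protected PDF.
curl -s -b cookies.txt -o invoice.pdf \
     "https://example.com/invoices/latest.pdf"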