Random web page download wget script


 
# 1  
Old 04-20-2013
Random web page download wget script

Hi,

I've been attempting to create a script that downloads web pages at random intervals to mimic typical user activity. However, I'm struggling to make $url pick up a value from the URL list, so wget complains of a missing URL. Any ideas?

Thanks

Code:
#!/bin/bash
# URL list
url1="http://www.bbc.co.uk"
url2="http://www.cnn.com"
url3="http://www.msn.com"

# Number of users to mimic simultaneously
users=20

for (( c=1; c<=$users; c++ ))
do
(
  wait=`echo $(( RANDOM % 120 + 30 ))`
  rand=`echo $(( RANDOM % 3 + 1 ))`
  url=url"$rand"   # problem line: assigns the literal string "url1"/"url2"/"url3"
  wget -p $url
  sleep $wait
)&
done
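
The failing line is url=url"$rand": it assigns the literal string "url1"/"url2"/"url3" to url, not the URL that variable holds. Bash can dereference a variable by name with indirect expansion, ${!name}; a minimal sketch of that approach:

Code:
#!/bin/bash
url1="http://www.bbc.co.uk"
url2="http://www.cnn.com"
url3="http://www.msn.com"

rand=$(( RANDOM % 3 + 1 ))   # pick 1, 2 or 3
name="url$rand"              # the NAME of a variable, e.g. "url2"
url=${!name}                 # indirect expansion: the VALUE of $url2
wget -p "$url"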

# 2  
Old 04-20-2013
Code:
#!/bin/bash
url_list=( http://www.bbc.co.uk http://www.cnn.com http://www.msn.com )

# Number of users to mimic simultaneously
users=20

for (( c = 1; c <= $users; c++ )); do
  wait=`expr $RANDOM % 120 + 30`
  n=`expr $RANDOM % 3`
  url=${url_list[$n]}
  wget -p $url
  sleep $wait
done
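
A side note on the hardcoded 3: bash can take the modulus from the array's own length, so URLs can later be added to the list without touching the arithmetic. A sketch:

Code:
n=$(( RANDOM % ${#url_list[@]} ))   # valid index 0..length-1 for any list size
url=${url_list[$n]}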

# 3  
Old 04-20-2013
There's no need to use expr or backticks.

i.e.
Code:
wait=$(( RANDOM % 120 + 30 ))

would work just fine.
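Likewise for the random index:
Code:
n=$(( RANDOM % 3 ))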
# 4  
Old 04-20-2013
Works great; however, I included the ampersand to mimic x users ($users) simultaneously. The working example waits for wget to finish before moving on to the next URL, and it ends after x URLs ($users) have been downloaded.

So, the script should forever loop with each "user" downloading a URL from the list at random intervals.

Thanks again.
# 5  
Old 04-20-2013
If the script should loop forever, then I would think it would not use the for loop, right? The for loop runs 20 times, and then the script is finished. Is that correct, or am I missing something? Should we just use an infinite loop like while [ 1 -eq 1 ] to make it go forever?
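
For reference, while [ 1 -eq 1 ] works, though the more conventional bash spellings are while true or while : (":" is the shell's built-in no-op). A sketch:

Code:
while true; do     # or: while :; do
  echo "still running..."
  sleep 5
done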
# 6  
Old 04-20-2013
I'm not sure what the best approach is here. Please bear with me while I attempt to explain in a rudimentary fashion. What I need it to do is this:


user 1
download url
wait random
loop

&

user 2
download url
wait random
loop

&

user n
download url
wait random
loop

What I'm trying to avoid is running the following code in 20 different sessions (i.e., via setsid); a single script should do the same job.

Code:
#!/bin/bash
url_list=( http://www.bbc.co.uk http://www.cnn.com http://www.msn.com )

for (( ; ; )); do
  wait=`expr $RANDOM % 120 + 30`
  n=`expr $RANDOM % 3`
  url=${url_list[$n]}
  wget -p $url
  sleep $wait
done

I hope that makes sense.
# 7  
Old 04-20-2013
Thanks for explaining more. It's quite clear what you are aiming for. Even your original explanation was pretty clear. Try the following:
Code:
#!/bin/bash
url_list=( http://www.bbc.co.uk http://www.cnn.com http://www.msn.com )

# Number of users to mimic simultaneously
users=20

function one_user () {
  local user=$1
  while [ 1 -eq 1 ]; do
    local wait=`expr $RANDOM % 120 + 30`
    local n=`expr $RANDOM % 3`
    local url=${url_list[$n]}
    wget -p "$url"
    # echo user = $user wait = $wait url = $url
    sleep $wait
  done
}

for (( user = 1; user <= $users; user++ )); do
  one_user $user &
done

What I don't like is that it starts 20 background processes. When you eventually want to kill them, it might be a little problematic. But I don't see any way around having a bunch of background processes, and since the original script was designed to have them, you are presumably OK with that.
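
One way to tame the cleanup (a sketch layered on the script above, assuming the same one_user function and users count): record each worker's PID as it is spawned, then kill them all from a trap, so stopping the parent stops every user:

Code:
pids=()
for (( user = 1; user <= $users; user++ )); do
  one_user $user &
  pids+=($!)                                   # remember this worker's PID
done

trap 'kill "${pids[@]}" 2>/dev/null' EXIT INT TERM
wait                                           # park the parent; Ctrl-C or kill fires the trap

An in-flight wget may still finish its current download after the trap fires; the kill only stops the loops.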