Top Forums Shell Programming and Scripting Random web page download wget script Post 302796803 by shadyuk on Saturday 20th of April 2013 09:01:15 PM
Thanks Hanson, much of it was your work.

I'm using it to differentiate between two mobile systems although it could be used for stress testing.

Another useful addition would be to include some file downloads, say 10MB. However, I just tested this and the awk parsing in the script wouldn't work when I included the file URLs. In the example below, I would need the info "5.7s". Any ideas?

Code:
[root@scripts]# wget http://download.thinkbroadband.com/5MB.zip
--2013-04-21 01:32:47--  http://download.thinkbroadband.com/5MB.zip
Resolving download.thinkbroadband.com... 80.249.99.148, 2a02:68:1:7::1
Connecting to download.thinkbroadband.com|80.249.99.148|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 5242880 (5.0M) [application/zip]
Saving to: `5MB.zip'

100%[==============================================================================================================================>] 5,242,880   1.34M/s   in 5.7s

2013-04-21 01:32:54 (900 KB/s) - `5MB.zip' saved [5242880/5242880]
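A fallback, if the output parsing can't be made to work, would be to time the fetch directly rather than scrape wget's summary line. Rough sketch (the wget invocation in the comment is a placeholder; the helper times any command):

```shell
#!/bin/bash
# Measure elapsed wall-clock time of a command ourselves, in whole seconds,
# instead of parsing wget's progress output.
time_fetch () {
  local start
  start=$(date +%s)
  "$@" > /dev/null 2>&1
  echo "$(( $(date +%s) - start ))s"
}

# Placeholder usage:
#   elapsed=$(time_fetch wget -q --delete-after http://download.thinkbroadband.com/5MB.zip)
```

Resolution is one second, which is coarse but enough for a 5MB-plus download.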

---------- Post updated at 08:01 PM ---------- Previous update was at 07:38 PM ----------

Ah, figured it out. I used the last occurrence of "in" as the target, as that's common across wget web page and file downloads.

Code:
#!/bin/bash
# URLs to pick from at random
url_list=( http://www.bbc.co.uk http://www.cnn.com http://www.msn.com )

# Number of users to mimic simultaneously
users=20

# wget arguments
args="-E -H -T 30 -k -K -p --delete-after --no-cache -e robots=off"

one_user () {
  local user=$1
  while true; do
    local wait=$(( RANDOM % 120 + 30 ))          # 30-149s pause between fetches
    local n=$(( RANDOM % ${#url_list[@]} ))      # random index into url_list
    local url=${url_list[$n]}
    local time date elapsed
    time=$(date +"%T")
    date=$(date +"%m-%d-%y")
    # Keep the last line containing "in" (common to page and file downloads),
    # then print what follows the final "in" -- the elapsed time, e.g. "5.7s"
    elapsed=$(wget $args "$url" 2>&1 | awk '/in/{a=$0} END{print a}' | awk -F "in" '{print $2}')
    elapsed=${elapsed# }                         # drop the leading space left by the split
    echo "$date,$time,client$user,$url,$elapsed"
    sleep "$wait"
  done
}

for (( user = 1; user <= users; user++ )); do
  one_user "$user" &
done
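To sanity-check the parsing outside the loop, here's the same awk pipeline run against a captured progress line, so no network is needed (the sample string just mimics wget's file-download summary):

```shell
#!/bin/bash
# Offline check of the "last line containing 'in'" extraction.
sample='100%[=====>] 5,242,880   1.34M/s   in 5.7s'

elapsed=$(printf '%s\n' "$sample" \
  | awk '/in/{a=$0} END{print a}' \
  | awk -F "in" '{print $2}')
elapsed=${elapsed# }   # the split leaves a leading space; strip it

echo "$elapsed"   # -> 5.7s
```

Note the second awk splits on the literal string "in", so a URL or filename containing "in" earlier in the line would shift the fields; splitting on " in " (with spaces) would be slightly safer.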


Last edited by shadyuk; 04-20-2013 at 09:45 PM..