WGET cycling on an updating page

# 1
05-14-2010

Hello,

I am experiencing an issue while downloading a few pages using wget. All of them work without a problem except one, which is a page that tails a log and is therefore constantly being updated.

On that page wget seems to run endlessly and has to be killed manually. Is there anything that can be done to prevent this? At the moment I just let it run for a fixed number of seconds before killing it.

The command is
Code:
wget -q --no-check-certificate -O "/home/wgettest" --user="$user" --password="$password" https://10.10.0.30:8082/Accesslog/tail-f/main
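One way to bound this without killing wget by hand is to wrap it in GNU coreutils `timeout` (a sketch, assuming `timeout` is available on the system; the URL, credentials, and output path are taken from the command above, and the 30-second limit is an arbitrary example value):
Code:
```shell
#!/bin/sh
# Run wget for at most 30 seconds; timeout sends SIGTERM when the limit
# is reached, and whatever was downloaded so far remains in the -O file.
timeout 30 wget -q --no-check-certificate \
    -O "/home/wgettest" \
    --user="$user" --password="$password" \
    https://10.10.0.30:8082/Accesslog/tail-f/main

# timeout exits with status 124 when the time limit was hit.
if [ $? -eq 124 ]; then
    echo "wget stopped after 30s; partial page saved to /home/wgettest"
fi
```
Note that wget's own `--timeout` / `--read-timeout` options would not help here: they only fire when the connection stalls, and a page that continuously streams log output never goes idle, so an external time limit is needed.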

Thanks,
-p
