Full Discussion: Wget -i URLs.txt problem
Top Forums UNIX for Dummies Questions & Answers Wget -i URLs.txt problem Post 302732629 by Keith londrie on Sunday 18th of November 2012 08:58:31 PM
RE: wget -i URLs.txt

Hi Corona688,

Thanks for your post. The membership site I belong to is resell-rights-weekly.com; I just log in and click the links to download to my home computer. I want to bypass my home computer and copy the files for that week's downloads straight to my server, since server-to-server transfer is much faster than pulling them down over my DSL line and uploading them back up. The input file is necessary because new downloads are put on the site each week. I will put the URLs in URLs.txt before running wget, set it up as a cron job to run every Monday, and bring the files over in a fraction of the time. I had it working partially but could not remember the switches I set.

Here is my next try:

wget -i URLs.txt --post-data 'user=klondrie&password=XXXX' -o wgetlogfile.txt -c

What do you think? What would you change? This should be a piece of cake. I do not see a lot of security on the site, since I can just log in and click the links to download to my computer. I need the files on my server, though.
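One thing to be aware of: as written, --post-data turns every request for every URL in URLs.txt into a POST with those credentials. If the site instead has a login form that sets a session cookie, a two-step sketch like the following may work better. The login URL and form field names here are guesses, not the site's real ones; inspect the actual login form before using it.

```shell
#!/bin/sh
# Sketch: log in once, save the session cookie, then fetch the weekly list.
# The login URL (example.com/login.php) and the form field names are
# assumptions - check the membership site's real login form.

fetch_weekly() {
    # Step 1: POST the credentials once and keep the session cookie.
    wget --save-cookies cookies.txt --keep-session-cookies \
         --post-data 'user=klondrie&password=XXXX' \
         -O /dev/null https://example.com/login.php

    # Step 2: download every URL in the list, reusing the cookie.
    # -c resumes partial files; -o writes a log file for the cron run.
    wget --load-cookies cookies.txt -c -i URLs.txt -o wgetlogfile.txt
}

# Only run when the URL list is actually present.
if [ -f URLs.txt ]; then
    fetch_weekly
fi
```

For the Monday schedule, a crontab entry along the lines of `0 3 * * 1 /path/to/fetch_weekly.sh` would run it every Monday at 3 a.m.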

Any more help available?
 

9 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Sorting problem "sort -k 16,29 sample.txt > output.txt"

Hi all, I am trying to sort the contents of the file based on character positions within each line. Example: $cat sample.txt 0101020060731 ## Header record 1c1 Berger Awc ANP20070201301 4000.50 1c2 Bose W G ANP20070201609 6000.70 1c2 Andy CK ANP20070201230 28000.00... (3 Replies)
Discussion started by: ganapati

2. UNIX for Advanced & Expert Users

Wget FTP problem!

Hi, I've tried to download from FTP sites with wget but it failed and said "Service unavailable", yet when I use sftp in binary mode and use the "get" command it works perfectly. What's the problem? BTW: I tried both passive and active mode in wget. Thanks for your help (9 Replies)
Discussion started by: mjdousti

3. Shell Programming and Scripting

Problem with wget

Hi, I want to download some patches from SUN by using a script and I am using "wget" as the utility for this. The website for downloading has "https:" in its name as below https://sunsolve.sun.com/private-cgi/pdownload.pl?target=${line}&method=h and on running wget as below wget... (1 Reply)
Discussion started by: max29583

4. Shell Programming and Scripting

Extract urls from index.html downloaded using wget

Hi, I basically need to get a list of all the tarballs located at a URI. I am currently doing a wget on the URI to get the index.html page. Now this index page contains the list of URIs that I want to use in my bash script. Can someone please guide me? I am new to Linux and shell scripting. ... (5 Replies)
Discussion started by: mnanavati

5. UNIX for Dummies Questions & Answers

Problem with wget no check certificate.

Hi, I'm trying to install some libraries. When running the makefile I get an error from the "wget --no check certificate" option. I had a look at the help and the option wasn't listed. Anyone know what I'm missing? (0 Replies)
Discussion started by: davcra

6. UNIX for Dummies Questions & Answers

find lines in file1.txt not found in file2.txt memory problem

I have a diff command that does what I want, but when comparing large text/log files it uses up all the memory I have (sometimes over 8 GB) diff file1.txt file2.txt | grep '^<'| awk '{$1="";print $0}' | sed 's/^ *//' Is there a better, more efficient way to find the lines in one file... (5 Replies)
Discussion started by: raptor25
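For the record, a lower-memory approach to that last question is grep's fixed-string mode, which only has to hold the patterns file in memory, sketched here as a small helper:

```shell
# Print the lines of one file that do not appear anywhere in another.
# -F: fixed strings, -x: whole-line match, -v: invert, -f: patterns from file.
# Only the second file (the patterns) is loaded into memory.
diff_lines() {
    grep -vxFf "$2" "$1"
}

# If both files are already sorted, comm -23 file1.txt file2.txt gives the
# same result while streaming both files with almost no memory at all.
```

A call such as `diff_lines file1.txt file2.txt` then replaces the whole diff/grep/awk/sed pipeline above.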

7. Shell Programming and Scripting

Problem with wget and cookie

Dear people, I have a problem with a script using wget to download pdf files from a website which uses session cookies. Background: for university it's quite tedious to check every week which new homework, papers etc. are available on the different sites of the university's chairs. So I wanted a... (1 Reply)
Discussion started by: jackomo

8. Shell Programming and Scripting

Download pdf's using wget convert to txt

wget -i genedx.txt The code above will download multiple pdf files from a site, but how can I download and convert these to .txt? I have attached the master list (genedx.txt - which contains the URLs and file names) as well as the two PDFs that are downloaded. I am trying to have those... (7 Replies)
Discussion started by: cmccabe

9. Proxy Server

Problem with wget

I cannot download anything using wget in centos 6.5 and 7. But I can update yum etc. # wget https://wordpress.org/latest.tar.gz --2014-10-23 13:50:23-- https://wordpress.org/latest.tar.gz Resolving wordpress.org... 66.155.40.249, 66.155.40.250 Connecting to wordpress.org|66.155.40.249|:443...... (3 Replies)
Discussion started by: nirosha
zsync(1)							   File Transfer							  zsync(1)

NAME
       zsync - partial/differential file download client over HTTP

SYNOPSIS
       zsync [ -u url ] [ -i inputfile ] [ -o outputfile ] [ { -s | -q } ] [ -k file.zsync ] [ -A hostname=username:password ] { filename | url }
       zsync -V

DESCRIPTION
       Downloads a file over HTTP. zsync uses a control file to determine whether any blocks in the file are already known to the downloader, and only downloads the new blocks. Either a filename or a URL can be given on the command line - this is the path of the control file for the download, which normally has the name of the actual file to download with .zsync appended. (To create this .zsync file you have to have a copy of the target file, so it should be generated by the person providing the download.)

       zsync downloads to your current directory. It looks for any file in the directory with the same name as the file to download. If it finds one, it assumes that this is an earlier or incomplete version of the file to download, and scans it for any blocks that it can use to build the target file. (It also looks for a file of the same name with .part appended, so it will automatically find previously interrupted zsync downloads and reuse the data already downloaded. If the local file to use as input has a different name, you must use -i.)

       zsync retrieves the rest of the target file over HTTP. Once the download is finished, the old version (if the new file wants the same name) is moved aside (a .zs-old extension is appended). The modification time of the file is set to match the remote source file (if specified in the .zsync).

OPTIONS
       -A hostname=username:password
              Specifies a username and password to be used with the given hostname. -A can be used multiple times (with different hostnames), in cases where e.g. the .zsync file is on a different server from the download, or there are multiple download servers with different auth details for each. zsync never assumes that your password should be sent to a server other than the one named - otherwise redirects would be dangerous!

       -i inputfile
              Specifies (extra) input files. inputfile is scanned to identify blocks in common with the target file, and zsync uses any blocks found. Can be used multiple times.

       -k file.zsync
              Indicates that zsync should save the .zsync control file that it downloads, under the given filename. If that file already exists, zsync makes a conditional request to the web server, so the file is downloaded again only if the server's copy is newer. zsync appends .part to the filename while downloading and only overwrites the main file once the download is done; if the download is interrupted, it resumes using the data in the .part file.

       -o outputfile
              Override the default output file name.

       -q     Suppress the progress bar, download rate and ETA display.

       -s     Deprecated synonym for -q.

       -u url
              Specifies the referring URL. If you have a .zsync file locally (if you downloaded it separately, with wget, say) and the .zsync file contains a relative URL, you need to specify where you got the .zsync file from, so that zsync knows which server and path to use for the rest of the download (this is analogous to adding a <base href="..."> to a downloaded web page to make the links work).

       -V     Prints the version of zsync.

ENVIRONMENT VARIABLES
       http_proxy
              Should be the [http://]hostname:port of your web proxy, if one is required to access the target web server(s).

EXAMPLES
       zsync -i /var/lib/apt/lists/server.debian.org_debian_dists_etch_main_binary-i386_Packages http://zsync.moria.org.uk/s/etch/Packages.zsync

AUTHORS
       Colin Phipps <cph@moria.org.uk>

SEE ALSO
       zsyncmake(1)

Colin Phipps                            0.6.2                           zsync(1)
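Putting the man page together, a typical round trip looks roughly like this. The hostname and filenames are illustrative only; per SEE ALSO, the control file is built with zsyncmake(1) on the machine that publishes the download.

```shell
# Illustrative zsync workflow - hostname and filenames are made up.

publish() {
    # On the server: build the .zsync control file next to the download.
    zsyncmake big-image.iso        # writes big-image.iso.zsync
}

fetch() {
    # On the client: point zsync at the control file. Any existing local
    # big-image.iso (or interrupted big-image.iso.part) is scanned for
    # reusable blocks, and only the changed blocks are fetched over HTTP.
    zsync http://downloads.example.com/big-image.iso.zsync
}
```

An updated big-image.iso on the server only requires rerunning publish; clients that kept their previous copy then transfer a fraction of the file.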
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.