Full Discussion: Wget
Post 302612341 by Smiling Dragon, Sunday 25 March 2012, 05:26 PM
I just tried your example above and it works for me (though each URL returns a 404 Not Found).
Code:
[kadath:~]$ wget -i test.txt
--2012-03-26 10:23:39--  http://www.domain.com/play.php?key=uu67j567jj6y
Resolving www.domain.com (www.domain.com)... 117.121.253.254
Connecting to www.domain.com (www.domain.com)|117.121.253.254|:80... connected.
HTTP request sent, awaiting response... 404 Not Found
2012-03-26 10:23:40 ERROR 404: Not Found.

--2012-03-26 10:23:40--  http://www.domain.com/play.php?key=y567uj657u7
Connecting to www.domain.com (www.domain.com)|117.121.253.254|:80... connected.
HTTP request sent, awaiting response... 404 Not Found
2012-03-26 10:23:40 ERROR 404: Not Found.

[kadath:~]$ cat test.txt
http://www.domain.com/play.php?key=uu67j567jj6y
http://www.domain.com/play.php?key=y567uj657u7

I'd suggest having a closer look at your urllist.txt file:
Code:
[kadath:~]$ cat -vet test.txt
http://www.domain.com/play.php?key=uu67j567jj6y$
http://www.domain.com/play.php?key=y567uj657u7$

The -vet options tell cat to show characters that are normally invisible: -v displays non-printing characters, -e marks the end of each line with a $, and -t shows tabs as ^I.
The $ at the end of each line is just the linefeed/newline character, which is expected. What I'm wondering is whether your file also has some ^M characters (carriage returns, typical of a file edited on Windows), or tabs or other rubbish at the start of the lines - those can upset a lot of programs and might be causing your trouble.
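If it does turn out to be carriage returns or stray whitespace, a quick clean-up along these lines should sort it out (just a sketch - I'm assuming the file is called urllist.txt and is in your current directory):
Code:
# Count how many lines contain a carriage return (uses bash's $'\r' quoting)
grep -c $'\r' urllist.txt

# Strip carriage returns and any leading/trailing whitespace into a clean copy
tr -d '\r' < urllist.txt | sed 's/^[[:space:]]*//;s/[[:space:]]*$//' > urllist.clean.txt

# Retry the download against the cleaned list
wget -i urllist.clean.txt

If you happen to have dos2unix installed, running it on the file takes care of the carriage returns in one step.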
 
