02-15-2009
What version of wget are you using? I do not have this problem when downloading single files. It could also be that the particular webpage hides its links behind JavaScript or something similar.
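For reference, checking the installed version and fetching a single file looks like this (a minimal sketch; the URL is a placeholder):

    # Print the wget version string (first line only)
    wget --version | head -n 1

    # Fetch one file; -O names the local output file
    wget -O page.html http://example.com/page.html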
6 More Discussions You Might Find Interesting
1. UNIX for Dummies Questions & Answers
I have been experimenting with rsync as a scriptable backup option, reading various example and tip pages, including "Time Machine for every Unix out there - IMHO".
That page seems to describe the exact behavior I want: The ability to make a "full backup" snapshot regularly, but with rsync... (0 Replies)
Discussion started by: fitzwilliam
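A minimal sketch of the snapshot scheme that page describes, with hypothetical paths; --link-dest hard-links files that are unchanged since the previous snapshot, so every snapshot looks like a full backup while only new or changed files take up space:

    # Hypothetical source and backup locations - adjust to taste
    SRC=/home/user/
    DEST=/backups
    SNAP=$(date +%Y-%m-%d)

    # Hard-link unchanged files against the most recent snapshot
    rsync -a --delete --link-dest="$DEST/latest" "$SRC" "$DEST/$SNAP/"

    # Repoint "latest" at the snapshot just taken
    ln -sfn "$SNAP" "$DEST/latest"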
2. Shell Programming and Scripting
Hi, for my own interest I want to scrape a lot of data off the Maple Story game rankings page.
The problem is, when I want to get the data at this page
maplestory(dot)nexon(dot)net/Rankings/OverallRanking.aspx?type=overall&s=&world=0&job=0&pageIndex=6
It gives me the data at this page
... (3 Replies)
Discussion started by: seagaia
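One thing worth ruling out here (an assumption, not a diagnosis): if the URL is passed to wget unquoted, the shell treats each & as a command separator, so everything from the first & onward - including pageIndex=6 - never reaches wget, and you silently get the default page. Quoting the URL avoids that:

    # Unquoted: the shell splits the command at every "&"
    #   wget http://example.com/Rankings.aspx?type=overall&pageIndex=6

    # Quoted: the full query string reaches wget intact
    wget 'http://example.com/Rankings.aspx?type=overall&pageIndex=6'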
3. Shell Programming and Scripting
I am using wget to crawl a website using the following command:
wget --wait=20 --limit-rate=20K -r -p -U Mozilla http://www.stupidsite.com
What I have found is that after two days of crawling, some links are still not downloaded. For example, if some page has 10 links in it as anchor texts... (1 Reply)
Discussion started by: shoaibjameel123
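A few recursion defaults are worth checking when a crawl misses links (a sketch, not a diagnosis): wget follows only plain HTML anchors, honors robots.txt by default, and stops at a recursion depth of 5 unless told otherwise:

    # Same crawl with unlimited depth, robots.txt ignored, and
    # links rewritten for local browsing (-k = --convert-links)
    wget --wait=20 --limit-rate=20K -r -l inf -e robots=off \
         -p -k -U Mozilla http://www.stupidsite.com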
4. Solaris
When looking at files in a directory using ls, how can I tell whether I have a hard link or a soft link? (11 Replies)
Discussion started by: Harleyrci
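For the record: ls -l shows a soft link as file type l with an -> target, while a hard link looks like a regular file; a link count above 1 (second column) or a shared inode number from ls -i gives it away. Illustrative output, not from a real session:

    $ ls -l
    lrwxrwxrwx 1 user user  8 Jan  1 12:00 soft -> original
    -rw-r--r-- 2 user user 42 Jan  1 12:00 hard
    -rw-r--r-- 2 user user 42 Jan  1 12:00 original

    $ ls -i hard original      # hard links share one inode
    123456 hard  123456 original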
5. Post Here to Contact Site Administrators and Moderators
(split from another thread)
Hi.
Can you please post a copy of the exact link you used?
I have no trouble accessing either the readme or the link to "Featured Books and Articles by Active Forum Members - Links".
Thanks. (2 Replies)
Discussion started by: Scott
6. AIX
Hi
I'm logged in as root on an AIX box.
Which command will list all the soft links and hard links present on the server? (2 Replies)
Discussion started by: newtoaixos
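No single command lists both, but find handles each case (a sketch; a scan from / can take a while, and -xdev keeps it on one filesystem):

    # All symbolic links on this filesystem
    find / -xdev -type l

    # All regular files with more than one hard link
    find / -xdev -type f -links +1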
LEARN ABOUT DEBIAN
jigdo-lite
JIGDO-LITE(1)
NAME
jigdo-lite - Download jigdo files using wget
SYNOPSIS
jigdo-lite [ URL ]
DESCRIPTION
See jigdo-file(1) for an introduction to Jigsaw Download.
Given the URL of a `.jigdo' file, jigdo-lite downloads the large file (e.g. a CD image) that has been made available through that URL.
wget(1) is used to download the necessary pieces of administrative data (contained in the `.jigdo' file and a corresponding `.template'
file) as well as the many pieces that the large file is made from. The jigdo-file(1) utility is used to reconstruct the large file from the
pieces.
`.jigdo' files that contain references to Debian mirrors are treated specially: When such a file is recognized, you are asked to select one
mirror out of a list of all Debian mirrors.
If URL is not given on the command line, the script prompts for a location to download the `.jigdo' file from. The following command line
options are recognized:
-h --help
Output short summary of command syntax.
-v --version
Output version number.
--scan FILES
Do not ask for "Files to scan", use this path.
--noask
Do not ask any questions, instead behave as if the user had pressed Return at all prompts. This can be useful when running jigdo-lite from cron jobs or in other non-interactive environments.
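As an illustration (not part of the manual page itself), a non-interactive invocation might look like this, with a placeholder URL and settings taken from a previous interactive run:

    # Fetch and assemble an image without prompting
    jigdo-lite --noask http://example.org/images/debian-cd.jigdo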
SEE ALSO
jigdo-file(1), jigdo-mirror(1), wget(1) (or `info wget')
CD images for Debian Linux can be downloaded with jigdo <URL:http://www.debian.org/CD/jigdo-cd/>.
AUTHOR
Jigsaw Download <URL:http://atterer.net/jigdo/> was written by Richard Atterer <jigdo atterer.net>, to make downloading of CD ROM images
for the Debian Linux distribution more convenient.
19 May 2006 JIGDO-LITE(1)