wget - force link conversion for all links?


 
# 1  
Old 02-15-2009

Hello,

When using wget with the -k option to convert links to relative URLs, I am finding that not all of the links get converted in a recursive download, and when downloading a single file, none of them do. I assume this is because wget only converts URLs for files it has actually downloaded, as the manual seems to describe.
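
Roughly what I am running for the recursive case is something like this (example.com and the depth are just stand-ins):

    # recursive fetch, two levels deep, with page requisites, converting links afterwards
    wget -r -l 2 -p -k http://example.com/start.html

My reading of the manual is that -k rewrites links to files wget has actually downloaded as relative paths, while links to files it did not download are rewritten as absolute URLs pointing back at the original site, which would explain why some links never become relative.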

I would like to be able to force wget to convert all of the links, even on single-file downloads. I have read through the man page and do not see a way to do this, and googling has turned up nothing.
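
For the single-file case, this is the sort of command I mean (placeholder URL again):

    # single page only; nothing else is downloaded, so nothing becomes a relative link
    wget -k http://example.com/page.html

Ideally there would be a switch to make wget rewrite every link in the page, whether or not the target was actually fetched.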

Any input, or suggestions of another utility to do this?

Thank you kindly,

Allasso
# 2  
Old 02-15-2009
What version of wget are you using? I do not have this problem when downloading single files. It could also be that the particular web page is hiding its links in JavaScript or something.
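
For reference, something along these lines works as expected here (the URL is only an example):

    # confirm the wget build, then fetch one page plus its requisites and convert links
    wget --version | head -n 1
    wget -p -k http://example.com/page.html

Adding -p pulls in the images, stylesheets and other requisites, so -k has downloaded files whose links it can turn into relative ones; links to anything not downloaded get rewritten as absolute URLs instead.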