Extract urls from index.html downloaded using wget


 
# 1  
Old 10-13-2010
Extract urls from index.html downloaded using wget

Hi,
I basically need to get a list of all the tarballs located at a URI.
I am currently doing a wget on the URI to get the index.html page.

Now this index page contains the list of URIs that I want to use in my bash script.

Can someone please guide me?

I am new to Linux and shell scripting.

Thanks,
M
# 2  
Old 10-13-2010
You want to look at wget's recursive download options, in particular -r (recursive) and -l (level).

Typically:
Code:
wget -r -l 1 http://my.site.com/index.html
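
If you only care about the tarballs, wget can also be told to keep nothing else. A sketch, assuming the tarballs end in .tar.gz or .tgz (adjust the -A list and the URL for your site):
Code:
# Recurse one level, but accept (keep) only files matching the tarball patterns;
# -np stops wget from wandering up to the parent directory.
wget -r -l 1 -np -A 'tar.gz,tgz' http://my.site.com/index.html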
# 3  
Old 10-13-2010
This creates a directory structure of the site itself. I do not want to create a directory structure. Basically, just like index.html, I want to have another text file that contains all the URLs present on the site.

Thanks,
M
# 4  
Old 10-13-2010
Oh I see, how about this:
Code:
awk 'BEGIN{ RS="<a *href *= *\""} NR>2 {sub(/".*/,"");print; }' index.html
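
To go straight from the site to a text file of just the tarball links, the same awk can sit in a pipeline. A sketch, assuming the index page is at http://my.site.com/ and the tarballs end in .tar.gz:
Code:
# Fetch the index page to stdout, pull the href targets out of it,
# keep only the tarball links, and save them to a file.
wget -q -O - http://my.site.com/ \
  | awk 'BEGIN{ RS="<a *href *= *\""} NR>2 {sub(/".*/,"");print; }' \
  | grep '\.tar\.gz$' > tarball_urls.txt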


# 5  
Old 10-13-2010
Thank you! That helped a lot!
# 6  
Old 10-13-2010
Code:
lynx -dump http://www.domain.com/index.html | grep -A999 "^References$" | tail -n +3 | awk '{print $2}'
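
To narrow that down to the tarballs and hand the list back to wget, something like this should work (a sketch; the .tar.gz suffix and the URL are assumptions):
Code:
# Keep only the tarball URLs from the lynx reference list...
lynx -dump http://www.domain.com/index.html | grep -A999 "^References$" | tail -n +3 | awk '{print $2}' | grep '\.tar\.gz$' > tarball_urls.txt
# ...then let wget download everything in the list.
wget -i tarball_urls.txt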