Extract URLs from HTML code using sed
Posted by L0rd, 11-29-2009

Hello,

I am trying to extract URLs from Google search results, but I am having trouble with the sed filtering of the HTML code.
What I want is just a list of the URLs that appear between <p><a href=" and the next following " in the HTML code.

Here is my code; I use wget and pipe the output through grep and sed for the filtering. The wget part works, but there is a problem with the filtering - I get nothing after it.
Code:
wget -q -U "Mozilla/5.001" -O - "http://www.google.com/search?q=searchphrase&num=100&start=200" | grep "^<p><a href=" | sed "s/<p><a href=\([^>]*\).*/\1/";

The output after the filtering should look like this - one URL per line:

Code:
http://www.cgdfgfaeh.com/drtuadfh
http://www.uzrfgfaeh.com/dwerh
http://www.dfgaeh.com/
http://ugdfgfaeh.com/dfgdfadfh.htm
http://www.fhdfgfgfaeh.com/urt
http://www.dfgdfgfaeh.com/sdfdsfdf
http://hefcv-gfaeh.com/tu65
http://www.zhjfaeh.com/
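
For reference, here is a rough sketch of a variant of my pipeline. It is untested against the live page and assumes that the result links really do appear as <p><a href="..." in the downloaded HTML and that GNU grep's -o option is available; Google may serve different markup to scripted clients.

Code:
# Rough sketch - assumes the links appear as <p><a href="..."> in the page and
# that GNU grep's -o option is available (it prints each match on its own line).
wget -q -U "Mozilla/5.001" -O - "http://www.google.com/search?q=searchphrase&num=100&start=200" |
  grep -o '<p><a href="[^"]*"' |
  sed -e 's/^<p><a href="//' -e 's/"$//'

The idea is that grep -o pulls every <p><a href="..." fragment onto its own line, even when the whole page comes back as one long line, and sed then strips the surrounding markup so only the URL is left.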

Please help me.
Thanks, and sorry for my bad English.
 
