10-17-2010
Quote:
Originally Posted by
Scrutinizer
The added difficulty in this case is that, besides the URLs, the descriptions also had to be extracted, and those can themselves contain tags. This became too complicated with the approach I had chosen.
Yes, I understand....
I have seen efficient regexes that can easily extract entire URLs, even ones wrapped in tags or following more complex, generalized patterns. I don't have them in front of me, so I can't back up my claim at the moment.
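As an illustration of the kind of regex being discussed (a rough sketch, not the pattern referred to above), grep's extended regular expressions can pull absolute URLs out of an HTML stream:

```shell
# A rough pattern for pulling absolute http/https URLs out of HTML.
# It is not an RFC 3986 parser: quotes, angle brackets and whitespace
# end a match, and relative links are not found.
printf '<a href="http://example.com/a?x=1">one</a> and https://example.org/b here\n' |
  grep -Eo 'https?://[^"<> ]+'
# -> http://example.com/a?x=1
#    https://example.org/b
```

Extracting the surrounding descriptions as well, as the quoted post notes, is where a single regex stops being practical.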
10 More Discussions You Might Find Interesting
1. Shell Programming and Scripting
Hi ,
I am trying to get the page size of a URL (e.g., www.example.com) using the wget command. Any thoughts on which parameters I need to pass to wget to get the size alone?
Regards,
Raj (1 Reply)
Discussion started by: rajbal
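Two common approaches (a sketch, not the reply given in that thread): count the bytes of the downloaded page, or read the server's Content-Length header, which not every server sends:

```shell
# Fetch the page quietly to stdout and count the bytes (needs network):
#   wget -qO- 'http://www.example.com' | wc -c
# Or ask the server for the size without downloading the body
# (works only if the server reports Content-Length):
#   wget --spider --server-response 'http://www.example.com' 2>&1 |
#     grep -i 'content-length'
# The byte count works on any stream; shown here on a local sample:
printf '<html><body>hello</body></html>' | wc -c   # -> 31
```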
2. UNIX for Dummies Questions & Answers
I have an HTML file called myfile. If I simply run "cat myfile.html" in UNIX, it shows all the HTML tags, like <a href=r/26><img src="http://www>. But I want to extract only the text part.
The same problem happens with the "type" command in MS-DOS.
I know you can do it by opening it in Internet Explorer,... (4 Replies)
Discussion started by: los111
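The quick sed answer to this kind of question (a sketch, with the usual caveats) is to delete everything between angle brackets:

```shell
# Delete anything that looks like a tag. A regex cannot handle every
# case (tags split across lines, '>' inside attribute values); for
# real pages a renderer such as `lynx -dump page.html` is more robust.
printf '<p>Hello <b>world</b></p>\n' | sed 's/<[^>]*>//g'   # -> Hello world
```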
3. Solaris
Hi All,
In our UNIX server we have an Apache web server running. I can access the default Apache web page from my Windows machine.
Now I want to create my own webpage, so I created one at /export/home/myname/test.html. Where do I need to place this file and what do I need... (0 Replies)
Discussion started by: pkm_oec
4. Web Development
I am planning to create an HTML page that will count the number of connected ports; the challenge for me is how to put that count into a page. Thanks! (1 Reply)
Discussion started by: webmunkey23
5. UNIX for Dummies Questions & Answers
Hi All,
Thanks for reading.
I am not sure if I am asking this in the correct group. But here it goes:
There is a shell script which does some system checks and creates an HTML file called system_summary.html on my Red Hat machine, say in the /reports directory, every hour.
Now I want to view it... (1 Reply)
Discussion started by: deepakgang
6. Red Hat
Hi All,
Thanks for reading.
I am not sure if I am asking this in the correct group. But here it goes:
There is a shell script which does some system checks and creates an HTML file called system_summary.html on my Red Hat machine, say in the /reports directory, every hour.
Now I want to view it... (6 Replies)
Discussion started by: deepakgang
7. Shell Programming and Scripting
Hi All,
I have some HTML files, and my requirement is to extract all the anchor-text words from them along with their URLs, and store the result in a separate text file separated by spaces. For example, <a href="/kid/stay_healthy/">Staying Healthy</a>,
which has /kid/stay_healthy/ as... (3 Replies)
Discussion started by: shoaibjameel123
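For simple, well-formed anchors like the one in that example, a sed substitution can capture the href and the anchor text in one pass (a sketch, not the solution posted in the thread):

```shell
# Print "URL anchor-text" for an anchor on a line. Assumes a
# double-quoted href and an anchor that does not span lines or
# contain nested tags; only one anchor per line is handled.
printf '<a href="/kid/stay_healthy/">Staying Healthy</a>\n' |
  sed -En 's|.*<a [^>]*href="([^"]*)"[^>]*>([^<]*)</a>.*|\1 \2|p'
# -> /kid/stay_healthy/ Staying Healthy
```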
8. Shell Programming and Scripting
Hey guys, I'm looking for a way to encode a string into URL and HTML form in a bash script that I'm making, to encode strings in various different digests etc.
I can't find anything on it anywhere else on the forums.
Any help is much appreciated; I'm still very new to bash and programming etc. (4 Replies)
Discussion started by: 3therk1ll
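One way to do both in plain bash plus sed (the function names here are just illustrative, not from the thread):

```shell
#!/usr/bin/env bash
# Percent-encode a string for a URL: RFC 3986 unreserved characters
# pass through, everything else becomes %XX.
urlencode() {
  local s=$1 out= c i
  for (( i = 0; i < ${#s}; i++ )); do
    c=${s:i:1}
    case $c in
      [A-Za-z0-9.~_-]) out+=$c ;;
      *) printf -v c '%%%02X' "'$c"; out+=$c ;;   # "'x" yields x's character code
    esac
  done
  printf '%s\n' "$out"
}

# HTML-escape the five significant characters with sed.
htmlencode() {
  sed -e 's/&/\&amp;/g' -e 's/</\&lt;/g' -e 's/>/\&gt;/g' \
      -e 's/"/\&quot;/g' -e "s/'/\&#39;/g"
}

urlencode 'a b&c'               # -> a%20b%26c
printf '<b>\n' | htmlencode     # -> &lt;b&gt;
```

The `urlencode` loop is bash-specific (substring expansion and `printf -v`); it will not run under a plain POSIX sh.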
9. Shell Programming and Scripting
Hi, I am trying to use curl to send a static XML file, URL-encoded, to a web page using POST. This has to go through a particular port on our firewall as well. This is my first exposure to curl and I am not having much success, so any help you can supply, or pointers in the right direction, would be... (1 Reply)
Discussion started by: Paul Walker
10. Post Here to Contact Site Administrators and Moderators
Hi
I just tried to post the following link while answering, and it is not parsing properly; just try it in your browser.
Tried to paste while answering:
https://www.unix.com/302873559-post2.html
The NOT operator is not coming through with the HTML/PHP tags, so I am attaching a file. (2 Replies)
Discussion started by: Akshay Hegde
LEARN ABOUT DEBIAN
urlscan
URLSCAN(1) General Commands Manual URLSCAN(1)
NAME
urlscan - browse the URLs in an email message from a terminal
SYNOPSIS
urlscan [options] < message
urlscan [options] message
DESCRIPTION
urlscan accepts a single email message on standard input, then displays a terminal-based list of the URLs in the given message. Selecting
a URL will invoke sensible-browser(1) on it (and hence any browser specified in the BROWSER environment variable).
urlscan is primarily intended to be used with the mutt(1) mailreader, but it should work well with any terminal-based mail program.
urlscan is similar to urlview(1), but has the following additional features:
1. Support for more message encodings, such as quoted-printable and base64.
2. Extraction and display of the context surrounding each URL.
OPTIONS
-b, --background
Run the Web browser in the background, so you can select another URL without closing it (this will not work with terminal-based Web
browsers such as lynx, links, or w3m).
-c, --compact
Display a simple list of the extracted URLs, instead of showing the context of each URL.
MUTT INTEGRATION
To integrate urlscan with mutt, include the following two commands in ~/.muttrc:
macro index,pager \cb "<pipe-message> urlscan<Enter>" "call urlscan to extract URLs out of a message"
macro attach,compose \cb "<pipe-entry> urlscan<Enter>" "call urlscan to extract URLs out of a message"
Once these lines are in your mutt configuration file, pressing Control-b will allow you to browse and open the URLs in the currently
selected message.
SEE ALSO
/usr/share/doc/urlscan/README, sensible-browser(1), urlview(1), mutt(1)
AUTHOR
This manual page was written by Daniel Burrows <dburrows@debian.org>.
December 10, 2006 URLSCAN(1)