Shell Programming and Scripting: How to extract url from html page?
Post 302463348 by Neo, Sunday 17 October 2010, 03:55 AM
I used to use Regex Buddy (to create and test regexes) for this. It comes with some stock regexes that are quite good for extracting URLs from text. It is really a great tool, but as I recall it only runs on Windows (and on Linux under Wine). With it you create, test, and debug complex regexes, and you can even optimize a regex for performance. Then you cut and paste the regex into your code or application. I highly recommend this tool. I would be running it now, but sadly my XP machine died and I'm running OS X on the desktop and only Android on the go.
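If you just need something quick from the shell instead, a rough grep sketch along these lines pulls most absolute URLs out of a saved page (the character class below is an illustration, not a complete URL grammar):

Code:
grep -Eo 'https?://[A-Za-z0-9._~:/?#@!$&*+,;=%()-]+' page.html | sort -u

Pipe the output of wget -qO- into the same grep to work on a live page instead of a saved file.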
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

How to get the page size (of a url) using wget

Hi, I am trying to get the page size of a URL (e.g., www.example.com) using the wget command. Any thoughts on what parameters I need to pass to wget to get the size alone? Regards, Raj (1 Reply)
Discussion started by: rajbal
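For what it is worth, one rough way to get the size without downloading the body is to ask the server for its headers and read Content-Length; not every server sends that header, so treat this as a sketch:

Code:
wget --spider --server-response http://www.example.com 2>&1 |
  awk 'tolower($1) == "content-length:" {print $2}'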

2. UNIX for Dummies Questions & Answers

How do I extract text only from an HTML file without HTML tags

I have an HTML file called myfile. If I simply run "cat myfile.html" in UNIX, it shows all the HTML tags, like <a href=r/26><img src="http://www>. But I want to extract only the text part. The same problem happens with the "type" command in MS-DOS. I know you can do it by opening it in Internet Explorer,... (4 Replies)
Discussion started by: los111
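As a quick illustration, a sed one-liner strips simple tags that do not span lines; for real-world pages a text-mode browser dump (for example, lynx -dump) is usually more robust:

Code:
sed -e 's/<[^>]*>//g' myfile.html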

3. Solaris

Accessing an HTML page

Hi All, On our UNIX server we have an Apache web server running. I can access the default Apache web page from my Windows machine. Now I want to create my own web page, so I created one at /export/home/myname/test.html. Where do I need to place this file and what do I need... (0 Replies)
Discussion started by: pkm_oec
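Broadly speaking, the file needs to live under (or be aliased into) Apache's DocumentRoot. The paths below are assumptions for illustration; check httpd.conf on that Solaris box for the real values:

Code:
grep -i '^DocumentRoot' /usr/local/apache2/conf/httpd.conf   # location of httpd.conf is an assumption
cp /export/home/myname/test.html /usr/local/apache2/htdocs/  # copy under the reported DocumentRoot
# then browse to http://<server>/test.html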

4. Web Development

findstr in html page

I am planning to create an HTML page that will show a count of connected ports; the challenge for me is how to put that count into a page. Thanks! (1 Reply)
Discussion started by: webmunkey23
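A minimal sketch of one approach: let a shell script do the counting and write a small HTML file that the web server serves (the ESTABLISHED filter and the output path are assumptions):

Code:
#!/bin/sh
# count established TCP connections and publish the number as a tiny HTML page
count=$(netstat -an | grep -c ESTABLISHED)
printf '<html><body><p>Connected ports: %s</p></body></html>\n' "$count" > /var/www/html/ports.html

Run it from cron to keep the page reasonably current.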

5. UNIX for Dummies Questions & Answers

Publishing HTML Page

Hi All, Thanks for reading. I am not sure if I am asking this in the correct group, but here it goes: there is a shell script that does some system checks and creates an HTML file called system_summary.html on my Red Hat machine, say in a /reports directory, every hour. Now I want to view it... (1 Reply)
Discussion started by: deepakgang
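One rough way to do that with Apache is to alias the report directory into the web space. The fragment below is illustrative (Apache 2.2 style access control; 2.4 uses Require all granted instead):

Code:
# illustrative httpd.conf fragment
Alias /reports /reports
<Directory /reports>
    Order allow,deny
    Allow from all
</Directory>

Reload Apache, then browse to http://<server>/reports/system_summary.html.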

6. Red Hat

Publishing HTML Page

Hi All, Thanks for reading. I am not sure if I am asking this in the correct group, but here it goes: there is a shell script that does some system checks and creates an HTML file called system_summary.html on my Red Hat machine, say in a /reports directory, every hour. Now I want to view it... (6 Replies)
Discussion started by: deepakgang

7. Shell Programming and Scripting

Extracting anchor text and its URL from HTML files in BASH

Hi All, I have some HTML files, and my requirement is to extract all the anchor text from them along with the corresponding URLs and store the result in a separate text file, separated by a space. For example, <a href="/kid/stay_healthy/">Staying Healthy</a> which has /kid/stay_healthy/ as... (3 Replies)
Discussion started by: shoaibjameel123
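For the simple case where each anchor sits on a single line, a grep/sed sketch like the one below prints the anchor text followed by its URL, separated by a space; properly nested or multi-line HTML really wants an HTML parser:

Code:
cat *.html |
  grep -o '<a [^>]*href="[^"]*"[^>]*>[^<]*</a>' |
  sed 's/.*href="\([^"]*\)"[^>]*>\([^<]*\)<\/a>.*/\2 \1/' > anchors.txt

For the example above this prints: Staying Healthy /kid/stay_healthy/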

8. Shell Programming and Scripting

URL/HTML encoding

Hey guys, I am looking for a way to encode a string into URL and HTML form in a bash script I am writing to encode strings in various different digests, etc. I can't find anything on it anywhere else on the forums. Any help is much appreciated; I am still very new to bash and programming. (4 Replies)
Discussion started by: 3therk1ll
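As an illustrative sketch, a small bash function can percent-encode a string for use in a URL (HTML entity encoding is a different mapping and would need its own lookup):

Code:
urlencode() {
    # percent-encode everything except the RFC 3986 unreserved characters
    local s=$1 c i
    for (( i = 0; i < ${#s}; i++ )); do
        c=${s:i:1}
        case $c in
            [A-Za-z0-9.~_-]) printf '%s' "$c" ;;
            *)               printf '%%%02X' "'$c" ;;
        esac
    done
    printf '\n'
}
# usage: urlencode 'a string & more'   prints   a%20string%20%26%20more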

9. Shell Programming and Scripting

Use curl to send a static xml file using url encoding to a web page using post

Hi, I am trying to use curl to send a static XML file to a web page using a URL-encoded POST. This has to go through a particular port on our firewall as well. This is my first exposure to curl and I am not having much success, so any help you can supply, or a pointer in the right direction, would be... (1 Reply)
Discussion started by: Paul Walker
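The general shape of the call looks like this; the host, port, path, and form field name below are placeholders rather than details from the thread:

Code:
# POST the file contents URL-encoded as the value of the "xml" field;
# :8080 stands in for whatever port the firewall allows through
curl --data-urlencode "xml@payload.xml" "http://example.com:8080/receive"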

10. Post Here to Contact Site Administrators and Moderators

Page Not Found error while parsing url

Hi, I just tried to post the following link while answering and it is not parsing properly; just try it in your browser. Tried to paste while answering: https://www.unix.com/302873559-post2.html The Not operator is not coming through with HTML/PHP tags, so I am attaching a file. (2 Replies)
Discussion started by: Akshay Hegde
RE_COMP(3)                Linux Programmer's Manual                RE_COMP(3)

NAME
       re_comp, re_exec - BSD regex functions

SYNOPSIS
       #define _REGEX_RE_COMP
       #include <sys/types.h>
       #include <regex.h>

       char *re_comp(char *regex);

       int re_exec(char *string);

DESCRIPTION
       re_comp() is used to compile the null-terminated regular expression pointed to by regex. The compiled pattern occupies a static area, the pattern buffer, which is overwritten by subsequent use of re_comp(). If regex is NULL, no operation is performed and the pattern buffer's contents are not altered.

       re_exec() is used to assess whether the null-terminated string pointed to by string matches the previously compiled regex.

RETURN VALUE
       re_comp() returns NULL on successful compilation of regex; otherwise it returns a pointer to an appropriate error message. re_exec() returns 1 for a successful match, zero for failure.

ATTRIBUTES
   Multithreading (see pthreads(7))
       The re_comp() and re_exec() functions are not thread-safe.

CONFORMING TO
       4.3BSD.

NOTES
       These functions are obsolete; the functions documented in regcomp(3) should be used instead.

SEE ALSO
       regcomp(3), regex(7), GNU regex manual

COLOPHON
       This page is part of release 3.53 of the Linux man-pages project. A description of the project, and information about reporting bugs, can be found at http://www.kernel.org/doc/man-pages/.

GNU                               2013-06-21                       RE_COMP(3)