Posted by adpe on Thursday, 26 March 2009, 01:30 PM
Investigating web pages in awk

Hello. I want to write an awk script that searches an HTML file and outputs all the links (e.g. .html, .htm, .jpg, .doc, .pdf, etc.) inside it. I also want the output to be split into three groups, separated by an empty line: the first group with links to other web pages (.html, .htm, etc.), the second with links to images (.jpg, .jpeg), and the third with links to .pdf, .doc, or other downloadable files. Next to each link I want to print how many times it occurs in the HTML file.

(I am only doing the page links first; once I have cracked that, I should be able to handle the other formats easily.)

This is what I currently have:

# Split each line on whitespace (the default) and print every field
# that starts with "href".
BEGIN { FS = " " }
{
    for (i = 1; i <= NF; i++)
        if ($i ~ /^href/)
            print $i
}

which prints out tokens like href="index.html">. I would like it to print just index.html, together with the number of times that link appears in the web page.
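
For the counting part, this is a rough sketch of what I have in mind (the gsub() cleanup and the count array are just my guesses, untested on a real page):

# Sketch: strip the href="..." wrapper and tally each link target.
{
    for (i = 1; i <= NF; i++) {
        if ($i ~ /^href=/) {
            link = $i
            gsub(/^href="/, "", link)   # drop the leading href="
            gsub(/".*$/, "", link)      # drop the closing quote and anything after it
            count[link]++               # count occurrences in an associative array
        }
    }
}
END {
    for (link in count)
        print link, count[link]
}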

Any help or hints on how I could achieve what is described in the first paragraph would be much appreciated.
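
For the three groups, I was imagining roughly the following (the extension lists are just examples and may need extending; it assumes the same href="..." cleanup as the sketch above):

# Sketch: classify counted links by extension and print the groups
# separated by a blank line.
{
    for (i = 1; i <= NF; i++) {
        if ($i ~ /^href=/) {
            link = $i
            gsub(/^href="/, "", link)
            gsub(/".*$/, "", link)
            count[link]++
        }
    }
}
END {
    for (l in count) if (l ~ /\.(html|htm)$/) print l, count[l]   # web pages
    print ""
    for (l in count) if (l ~ /\.(jpg|jpeg)$/) print l, count[l]   # images
    print ""
    for (l in count) if (l ~ /\.(pdf|doc)$/)  print l, count[l]   # downloadable files
}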

Last edited by adpe; 04-28-2009 at 02:30 PM..
 
