03-26-2009
Investigating web pages in awk
Hello. I want to make an awk script that searches an HTML file and outputs all the links (e.g. .html, .htm, .jpg, .doc, .pdf, etc.) inside it. I also want the output links split into three groups (separated by an empty line): the first group with links to other web pages (.html, .htm, etc.), the second with links to images (.jpg, .jpeg), and the third with links to .pdf, .doc, or other downloadable files. Next to each link, I want to output how many times it occurs in the HTML file.
(I am only doing the links first; once I have cracked this I will be able to do the other formats easily.)
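To sketch the three-way split, one option is a small classifier function keyed on the file extension (the `group` helper name and the exact extension lists here are assumptions, not anything standard):

```shell
# Sketch only: classify a URL by its extension; extend the lists as needed.
awk '
function group(url) {
    if (url ~ /\.(html|htm)$/)  return "pages"      # links to other web pages
    if (url ~ /\.(jpg|jpeg)$/)  return "images"     # links to images
    return "downloads"                              # .pdf, .doc, everything else
}
BEGIN {
    print group("index.html")   # pages
    print group("photo.jpeg")   # images
    print group("paper.pdf")    # downloads
}'
```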
So I have currently got...
# print every whitespace-separated field that begins with "href"
{
    for (i = 1; i <= NF; i++)
        if ($i ~ /^href/)
            print $i
}
which prints out fields like href="index.html">. I would like it to print just index.html, along with the number of times that link appears in the web page.
Any help/hints on how I could achieve what I described in the first paragraph would be a great help.
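For the counting part, here is a minimal sketch using an awk array indexed by URL. It assumes each href value is double-quoted and that every `href="..."` lands in its own whitespace-separated field, as in the script above:

```shell
# Sketch only: assumes double-quoted href values, one per
# whitespace-separated field (e.g. href="index.html">).
awk '
{
    for (i = 1; i <= NF; i++)
        if ($i ~ /^href="/) {
            url = $i
            sub(/^href="/, "", url)   # drop the leading href="
            sub(/".*/, "", url)       # drop the closing quote and the rest
            count[url]++              # tally each URL
        }
}
END {
    # print every distinct link with its count (for-in order is unspecified)
    for (url in count)
        print url, count[url]
}' page.html
```

The same count[] idea extends to the grouping: tally into three separate arrays (or tag each URL with its group) and print the groups from the END block with an empty line between them.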
Last edited by adpe; 04-28-2009 at 02:30 PM..