Top Forums > UNIX for Dummies Questions & Answers
Awk: print all URL addresses between iframe tags without repeating an already printed URL
Post 302602876 by striker4o on Tuesday, 28 February 2012, 04:01 PM
Quote:
Originally Posted by Corona688
How about any of the commands I offered?

---------- Post updated at 01:28 PM ---------- Previous update was at 01:26 PM ----------

The 'find' does absolutely nothing when you use awk that way.

It also doesn't search inside directories the way find does.
I apologize. Thank you for the commands. I just noticed the one which appears to do the job:

Code:
find . -name "*php*" -or -name "*htm*" |
        xargs awk -F\" -v RS='<' '/^iframe src=/ { if(!X[$2]++) print $2 }'

And another version doing the same thing:
Code:
find . -name "*php*" -or -name "*htm*" | xargs awk -F\" -v RS='<' '/^iframe src=/  {print $2}' | sort -u
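For anyone who wants to see the idiom in isolation, here is a minimal self-contained sketch (the sample file and URLs are made up): setting `RS='<'` makes every tag start a new awk record, `-F\"` splits fields on double quotes so `$2` is the src value, and the `!X[$2]++` array test prints each URL only the first time it is seen.

```shell
# Minimal self-contained demo of the RS='<' / !X[$2]++ dedup idiom.
# The sample file and URLs below are hypothetical.
tmp=$(mktemp -d)
cat > "$tmp/a.html" <<'EOF'
<html><body>
<iframe src="http://example.com/one"></iframe>
<iframe src="http://example.com/two"></iframe>
<iframe src="http://example.com/one"></iframe>
</body></html>
EOF
# RS='<' makes every tag begin a record; -F\" makes $2 the quoted src value.
urls=$(awk -F\" -v RS='<' '/^iframe src=/ { if (!X[$2]++) print $2 }' "$tmp/a.html")
printf '%s\n' "$urls"
rm -rf "$tmp"
```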

---------- Post updated at 10:52 PM ---------- Previous update was at 09:35 PM ----------

Sorry to be a pain. I started testing things globally, and I cannot get it to work recursively all the way from the root. I get:

awk: read error (Is a directory)

Any suggestions? ;/

---------- Post updated at 11:01 PM ---------- Previous update was at 10:52 PM ----------

I guess it needs to be piped through grep first to work. Here is what I came up with:

Code:
find . -name "*php*" -or -name "*htm*" | xargs grep -rl iframe | xargs awk -F\" -v RS='<' '/^iframe src=/ {print $2}' | sort -u

Ideas?
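For what it's worth, the grep detour should not be necessary. The "Is a directory" error most likely comes from find matching directories whose names contain php or htm and handing them straight to awk. A sketch of a fix, assuming standard find behavior (the directory and file names below are hypothetical): `-type f` restricts matches to regular files, and the parentheses are required because `-o` binds more loosely than the implicit `-a`, so without them `-type f` would apply to only one of the name tests.

```shell
# Hypothetical reproduction: a directory whose name matches *php* is what
# trips "awk: read error (Is a directory)" in the original pipeline.
tmp=$(mktemp -d)
mkdir "$tmp/lib.php"                       # a directory, not a file
cat > "$tmp/page.htm" <<'EOF'
<iframe src="http://example.com/a"></iframe>
EOF
# -type f keeps directories out of the pipeline; \( ... \) groups the
# -o'd name tests so -type f applies to both patterns, not just one.
urls=$(cd "$tmp" && find . -type f \( -name "*php*" -o -name "*htm*" \) |
    xargs awk -F\" -v RS='<' '/^iframe src=/ { if (!X[$2]++) print $2 }')
printf '%s\n' "$urls"
rm -rf "$tmp"
```

If any matched paths can contain spaces, `find ... -print0 | xargs -0 awk ...` is the safer variant (a GNU extension).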

Last edited by striker4o; 02-28-2012 at 03:49 PM..
 
