Full Discussion: site overloads
Post 302575665 by Neo on Tuesday 22nd of November 2011 11:30:49 AM
No, it is mostly Googlebot. We have changed our Webmaster Tools settings to slow down the crawl rate on the site. Hopefully that will help!
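Before changing any crawl settings, it is worth confirming from the web server's own logs which crawler is actually generating the load. The following is a minimal Perl sketch along those lines; the log path, the combined log format, and the user-agent groupings are assumptions for illustration, not details from this thread.

    use strict;
    use warnings;

    # Tally requests per user-agent group from a combined-format access log.
    # The default path is only an example; pass your real log as an argument.
    my $log = shift @ARGV // '/var/log/apache2/access.log';
    open my $fh, '<', $log or die "Cannot open $log: $!";

    my %hits;
    while (my $line = <$fh>) {
        # In the combined log format the user agent is the last quoted field.
        next unless $line =~ /"([^"]*)"\s*$/;
        my $agent = $1;
        my $group = $agent =~ /Googlebot/i        ? 'Googlebot'
                  : $agent =~ /bot|crawl|spider/i ? 'other crawlers'
                  :                                 'everything else';
        $hits{$group}++;
    }
    close $fh;

    printf "%-16s %d\n", $_, $hits{$_}
        for sort { $hits{$b} <=> $hits{$a} } keys %hits;

If crawler traffic dominates the counts, the crawl rate can then be lowered in the search engine's webmaster tools, as described above.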
 

5 More Discussions You Might Find Interesting

1. IP Networking

Port access to site-to-site VPN

Set up a site-to-site VPN between two Cisco routers. One of the site locations is unable to access ports such as https://example.com:9001. How do I let them reach port 9001? They can ssh, ftp, telnet and everything else. Is this a VPN issue or an ACL access issue? I put permit ip host... (0 Replies)
Discussion started by: photon

2. IP Networking

How to establish a site-to-site VPN - Linux machine and Cisco ASA?

Hi, I am trying to establish a VPN between my Linux server and a Cisco ASA at the client side. I installed Openswan on my CentOS box. Linux Server eth0 - 182.2.29.10 Gateway - 182.2.29.1 eth1 - 192.9.200.75 I have simple iptables rules like WAN="eth0" LAN="eth1" (0 Replies)
Discussion started by: ashokvpp

3. IP Networking

Does the Cisco 1921 router support site-to-site VPNs using IPSec?

Q: "Does Cisco 1921 router support,, act as an endpoint for, site to site VPNs using IPSec? If so, how many? " A: If you get the Cisco 1921/k9 with the security services bundle then it will have built in security features. Cisco, typically includes IP Sec tunnels I believe as part of that... (0 Replies)
Discussion started by: Ayaerlee

4. IP Networking

IPSec Openswan Site to Site VPN - Big Pain

Hi @all, I am trying to connect two LANs with IPSec/Openswan. LAN 1: 192.168.0.0/24 LAN 2: 192.168.1.0/24 This is my config: conn HomeVPN # # Left security gateway, subnet behind it, nexthop toward right. left=192.168.1.29 ... (1 Reply)
Discussion started by: bahnhasser83

5. What is on Your Mind?

Is this site for me?

Hello, I am new here and my name is Robert. I was actually looking for a forum website where I can go with questions regarding Linux and embedded Linux applications. I am fairly new (6 months) to the world of Linux and embedded Linux applications and the learning curve is steep. When I am not... (13 Replies)
Discussion started by: Circuits
URIFIND(1p)             User Contributed Perl Documentation            URIFIND(1p)

NAME
    urifind - find URIs in a document and dump them to STDOUT

SYNOPSIS
    $ urifind file

DESCRIPTION
    urifind is a simple script that finds URIs in one or more files (using
    "URI::Find"), and outputs them to STDOUT. That's it.

    To find all the URIs in file1, use:

        $ urifind file1

    To find the URIs in multiple files, simply list them as arguments:

        $ urifind file1 file2 file3

    urifind will read from "STDIN" if no files are given or if a filename of
    "-" is specified:

        $ wget http://www.boston.com/ -O - | urifind

    When multiple files are listed, urifind prefixes each found URI with the
    file from which it came:

        $ urifind file1 file2
        file1: http://www.boston.com/index.html
        file2: http://use.perl.org/

    This can be turned on for single files with the "-p" ("prefix") switch:

        $ urifind -p file3
        file3: http://fsck.com/rt/

    It can also be turned off for multiple files with the "-n" ("no prefix")
    switch:

        $ urifind -n file1 file2
        http://www.boston.com/index.html
        http://use.perl.org/

    By default, URIs will be displayed in the order found; to sort them
    ascii-betically, use the "-s" ("sort") option. To reverse sort them, use
    the "-r" ("reverse") flag ("-r" implies "-s").

        $ urifind -s file1 file2
        http://use.perl.org/
        http://www.boston.com/index.html
        mailto:webmaster@boston.com

        $ urifind -r file1 file2
        mailto:webmaster@boston.com
        http://www.boston.com/index.html
        http://use.perl.org/

    Finally, urifind supports limiting the returned URIs by scheme or by
    arbitrary pattern, using the "-S" option (for schemes) and the "-P"
    option. Both "-S" and "-P" can be specified multiple times:

        $ urifind -S mailto file1
        mailto:webmaster@boston.com

        $ urifind -S mailto -S http file1
        mailto:webmaster@boston.com
        http://www.boston.com/index.html

    "-P" takes an arbitrary Perl regex. It might need to be protected from
    the shell:

        $ urifind -P 's?html?' file1
        http://www.boston.com/index.html

        $ urifind -P '.org' -S http file4
        http://www.gnu.org/software/wget/wget.html

    Add a "-d" to have urifind dump the regexen generated from "-S" and "-P"
    to "STDERR". "-D" does the same but exits immediately:

        $ urifind -P '.org' -S http -D
        $scheme = '^(http):'
        @pats = ('^(http):', '.org')

    To remove duplicates from the results, use the "-u" ("unique") switch.

OPTION SUMMARY
    -s          Sort results.

    -r          Reverse sort results (implies -s).

    -u          Return unique results only.

    -n          Don't include the filename in the output.

    -p          Include the filename in the output (0 by default, but 1 if
                multiple files are included on the command line).

    -P $re      Print only lines matching regex '$re' (may be specified
                multiple times).

    -S $scheme  Only this scheme (may be specified multiple times).

    -h          Help summary.

    -v          Display version and exit.

    -d          Dump compiled regexes for "-S" and "-P" to "STDERR".

    -D          Same as "-d", but exit after dumping.

AUTHOR
    darren chamberlain <darren@cpan.org>

COPYRIGHT
    (C) 2003 darren chamberlain

    This library is free software; you may distribute it and/or modify it
    under the same terms as Perl itself.

SEE ALSO
    URI::Find

perl v5.14.2                        2012-04-08                         URIFIND(1p)
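The DESCRIPTION above notes that urifind is built on the URI::Find module. For scripts that want the same core behaviour without shelling out, a minimal sketch using URI::Find's documented callback interface might look like the following; the file handling and output formatting here are illustrative only and are not urifind's actual source, which also implements the sorting, deduplication, and -S/-P filtering described above.

    use strict;
    use warnings;
    use URI::Find;

    # Read each named file (or STDIN if none are given) and print every URI
    # found, prefixed with the filename when more than one file is listed.
    my @files = @ARGV ? @ARGV : ('-');
    for my $file (@files) {
        my $fh;
        if ($file eq '-') {
            $fh = \*STDIN;
        }
        else {
            open $fh, '<', $file or die "Cannot open $file: $!";
        }
        my $text = do { local $/; <$fh> };

        # URI::Find calls the coderef with the URI object and the matched
        # text; returning the matched text leaves the document unchanged.
        my $finder = URI::Find->new(sub {
            my ($uri, $orig) = @_;
            print @files > 1 ? "$file: $uri\n" : "$uri\n";
            return $orig;
        });
        $finder->find(\$text);
    }

Run as "perl find_uris.pl file1 file2" (the script name is arbitrary), this should mirror urifind's default multi-file output.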
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.
Privacy Policy