robots.txt usage
Post 302328669 by rickhlwong in Web Development, Thursday 25 June 2009, 02:54 AM

Dear all,

I want to use robots.txt to control spiders. Can I specify an IP address to allow the website to be accessed by the spider from that address?
Thank you.

Rick
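As background: robots.txt rules are matched against a crawler's User-agent string, not its IP address, and compliance is voluntary, so the file by itself cannot allow or block access by IP; IP-based restrictions are normally configured in the web server instead (for example Apache access-control directives such as Allow from or Require ip). A minimal robots.txt sketch, with illustrative agent and path names only:

    # Placed at the site root as /robots.txt.
    # Rules are selected by User-agent string; there is no IP matching.
    User-agent: Googlebot
    Disallow:

    User-agent: *
    Disallow: /private/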
 

5 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

echo "ABC" > file1.txt file2.txt file3.txt

Hi Gurus, I need to create 3 files with the contents "ABC" using a single command. I am using: echo "ABC" > file1.txt file2.txt file3.txt, but the above command is not working. Please help me... With Regards / Ganapati (4 Replies)
Discussion started by: ganapati
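For the request in discussion 1, a single > redirection writes to only one file; one common approach is tee, which copies its standard input to every file named on the command line. A minimal sketch using the file names from the post:

    # Create file1.txt, file2.txt and file3.txt, each containing "ABC";
    # the final redirection discards tee's copy on standard output.
    echo "ABC" | tee file1.txt file2.txt file3.txt > /dev/null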

2. HP-UX

How can I find CPU usage, memory usage, swap usage and logical volume usage?

How can I find CPU usage, memory usage and swap usage? I want to know when CPU usage is above X% and continues Y times, and when memory usage is above X% and continues Y times. My final goal is to monitor process and logical volume usage above X%, and the number of logical volumes above... (3 Replies)
Discussion started by: alert0919
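For discussion 2, a few HP-UX commands commonly used for this kind of monitoring, given only as a hedged starting point (exact options vary by HP-UX release, and the X%/Y-times thresholds would be scripted around their output):

    sar -u 5 3       # CPU utilisation, sampled every 5 seconds, 3 samples
    vmstat 5 3       # memory and paging activity
    swapinfo -tam    # swap usage in MB, with totals (HP-UX specific)
    bdf              # filesystem / logical volume space usage (HP-UX df)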

3. AIX

How to monitor the IBM AIX server for I/O usage, memory usage, CPU usage, network...?

How to monitor the IBM AIX server for I/O usage, memory usage, CPU usage, network usage, storage usage? (3 Replies)
Discussion started by: laknar
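For discussion 3, the usual AIX starting points are the interactive overview tools topas or nmon plus the scriptable classics; a hedged sketch:

    vmstat 5 3     # CPU and memory summary
    iostat 5 3     # disk I/O per device
    netstat -i     # network interface statistics
    svmon -G       # global memory usage (AIX-specific)
    df -g          # filesystem usage in gigabytes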

4. Solaris

Netbackup robots not working

Hi All, I am facing an issue with robtest not working on NetBackup 7.1 on Solaris 10. I can see that the robots and drives are detected by the OS, but I am not sure why robtest is not working. Below are the outputs of a few commands. $PWD>cfgadm -al -o show_FCP_dev Ap_Id Type ... (0 Replies)
Discussion started by: sahil_shine

5. IP Networking

TXT Records: Usage

OK... the last DNS question. I've been on a DNS kick lately. When poking around, I keep bumping into sites that have TXT records, all with cryptic but similar text in them. Something like: cnn.com. 3305 IN TXT "882269757-4422010" cnn.com. 3305 IN TXT "ms=ms97284866"... (1 Reply)
Discussion started by: Lost in Cyberia
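The TXT records in discussion 5 are typically domain-verification or policy strings (SPF, Google or Microsoft site verification and the like). They can be listed with dig; a small sketch:

    # Print only the TXT record data for the domain
    dig cnn.com TXT +short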
DL10N-HTML(1p)						User Contributed Perl Documentation					    DL10N-HTML(1p)

NAME
       dl10n-spider -- crawl translator mailing lists (and BTS) for status updates

SYNOPSIS
       dl10n-spider [options] lang+

DESCRIPTION
       This script parses the debian-l10n-<language> mailing list archives. It looks for emails whose titles follow a specific
       format indicating what the author intends to translate, or the current status of his work on this translation. This
       information is saved to a dl10n database, which can then be used to build an l10n coordination page or any other useless
       statistics.

   Command line option parsing
       General options:
           -h, --help            display short help text
           -V, --version         display version and exit

       Begin point of the crawling:
           --year=YYYY --month=MM --message=msg
                                 if not specified, will crawl for new messages

       Database to fill:
           --sdb=STATUS_FILE     use STATUS_FILE as status file (instead of $STATUS_FILE)
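       A hypothetical invocation, following the synopsis above and assuming the lang arguments are the debian-l10n-<language>
       list-name suffixes (the languages and dates below are examples only):

           # Crawl the debian-l10n-french and debian-l10n-german archives,
           # starting from the January 2004 archive instead of only new messages
           dl10n-spider --year=2004 --month=01 french german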
LICENSE
       This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License
       as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version.

       This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty
       of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.

       You should have received a copy of the GNU General Public License along with this program; if not, write to the Free
       Software Foundation, Inc., 59 Temple Place - Suite 330, Boston, MA 02111-1307, USA.

COPYRIGHT
       (C) 2003,2004 Tim Dijkstra
                2004 Nicolas Bertolissio
                2004 Martin Quinson

perl v5.14.2                                    2012-01-15                                    DL10N-HTML(1p)