Shell Programming and Scripting: awk count how many IP have received that error
Post 302633737 by joeyg, Wednesday 2nd of May 2012, 11:22:30 AM
Bumping up posts or double posting is not permitted in these forums.

Please read the rules, which you agreed to when you registered, if you have not already done so.

You may receive an infraction for this. If so, don't worry; just try to follow the rules more carefully. The infraction will expire in the near future.

Thank You.

The UNIX and Linux Forums.
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

sorting received mail in unix and another error

Hello! When issuing the mail command to see my received mail, I get this error: (server)starla:/home/starla>mail Warning: Too many letters, overflowing letters concatenated msgcnt 27378 vxfs: mesg 001: vx_nospace - /dev/vg00/lvol4 file system full (1 block extent) mail: no space for temp file... (0 Replies)
Discussion started by: starla0316
0 Replies

2. Solaris

"lpr.error] Warning: Received SIGPIPE" continuously appearing in logs

On a Solaris 8 print server we're continuously (every 2 minutes or so) getting these messages in the logs: printd: Warning: Received SIGPIPE; continuing I've applied this patch and restarted the printd daemon, but it doesn't help: #109320-22: SunOS 5.8: lp patch Does anyone have any idea what... (4 Replies)
Discussion started by: aussieos
4 Replies

3. AIX

nim: error signal number 2 received

Hi to all, I am trying to make a mksysb backup of a NIM client machine from the NIM master. While I am reading that the backup is done successfully, I get the error message below and it doesn't exit the smit screen; also the status of the command appears to be running. Is there anybody who knows why... (3 Replies)
Discussion started by: omonoiatis9
3 Replies

4. Shell Programming and Scripting

awk count how many unique IPs have received that error

Hi all, I want to write an awk script that counts unique IPs that have received one specific error. For example:
25-04-2012;192.168.70.31;1254545454545417;500.0;SUCCESS
25-04-2012;192.168.70.32;355666650914;315126423993;;General_ERROR_23
30-04-2012;192.168.70.33;e;null;null;Failure... (2 Replies)
Discussion started by: arrals_vl
2 Replies
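A minimal awk sketch for the request above, assuming the log stays semicolon-delimited with the IP in field 2 and the error text in the last field (both positions are inferred from the sample lines, not confirmed by the thread; the script name is hypothetical):

    # count_error_ips.sh - hypothetical helper, not from the thread
    # Usage: sh count_error_ips.sh General_ERROR_23 file.log
    err="$1"; shift
    awk -F';' -v err="$err" '
        $NF ~ err { seen[$2]++ }          # field 2 assumed to hold the IP
        END {
            for (ip in seen) n++
            print n + 0, "unique IP(s) received", err
        }
    ' "$@"

Against the three sample lines above, "sh count_error_ips.sh General_ERROR_23 file.log" would report 1 unique IP.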

5. Shell Programming and Scripting

awk - count character count of fields

Hello All, I got a requirement when I was working with a file. Say the file has unloads of data from a table in the form:
1|121|asda|434|thesi|2012|05|24|
1|343|unit|09|best|2012|11|5|
I was put into a scenario where I need the field count of all the lines in that file. It was simply... (6 Replies)
Discussion started by: PikK45
6 Replies
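Since the question above is about per-line field counts in pipe-delimited data, a small hedged sketch using awk's built-in NF (the filename is illustrative):

    # Print the field count of every line in a pipe-delimited file
    awk -F'|' '{ print NR ": " NF " fields" }' table_unload.txt

Note that a trailing "|", as in the sample rows, makes awk see one extra empty field, so "1|121|asda|434|thesi|2012|05|24|" is reported as 9 fields rather than 8.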

6. HP-UX

Received error as Not enough space left on device

Hi Forum, we have observed a problem on one of our HP-UX machines, which runs software that connects to the radio frequency scanning devices; the scanned information is stored in the database through the same software. This software has thrown an error like "Not enough space left on the... (4 Replies)
Discussion started by: Nishant.Jvk
4 Replies
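The excerpt is cut off before the details, but for a generic "not enough space" error on HP-UX the usual first checks are block and inode usage; a hedged sketch (the /var mount point is only an example):

    # Free space per mounted filesystem (HP-UX)
    bdf
    # A filesystem can also be "full" because it ran out of inodes
    bdf -i /var
    # Largest directories under a suspect mount point
    du -k /var | sort -n | tail -20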

7. UNIX for Dummies Questions & Answers

How to count no. of received data of any one files?

Hi Guys, I have a file with different task_ids, and every task_id has many "received" entries; now we have to count the no. of "received" entries for each task_id (i.e., count how many times the word "received" appears for each task_id). Please help us, guys. (5 Replies)
Discussion started by: aaditya321
5 Replies
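The thread does not show the file layout, so the sketch below assumes, purely for illustration, whitespace-separated lines whose first field is the task_id and in which the word "received" may appear anywhere:

    # Count occurrences of the word "received" per task_id (layout assumed, see above)
    awk '
        {
            for (i = 2; i <= NF; i++)
                if ($i == "received") count[$1]++
        }
        END { for (id in count) print id, count[id] }
    ' datafile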

8. Shell Programming and Scripting

Error files count while copying files from source to destination location, as well as count of successful files

Hi All, can anyone answer my requirement? I have a source location src_dir="/home/oracle/arun/IRMS-CM" and a target location dest_dir="/home/oracle/arun/LiveLink/IRMS-CM/$dc/$pc/$ct". My source text files match the example below. Text file content: $fn "\t" $dc "\t" $pc "\t" ... (3 Replies)
Discussion started by: sravanreddy
3 Replies
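The excerpt above is truncated before the actual requirement, but the general pattern of copying files while keeping separate success and failure counts looks roughly like the sketch below (the source and destination paths reuse the ones quoted above; the per-file $dc/$pc/$ct subdirectory logic is omitted because the excerpt does not show how those values are derived):

    src_dir="/home/oracle/arun/IRMS-CM"
    dest_dir="/home/oracle/arun/LiveLink/IRMS-CM"
    ok=0; bad=0
    for f in "$src_dir"/*.txt; do
        [ -e "$f" ] || continue                  # skip if nothing matches
        if cp "$f" "$dest_dir"/; then
            ok=$((ok + 1))
        else
            bad=$((bad + 1))
            echo "failed: $f" >> copy_errors.log
        fi
    done
    echo "copied: $ok  failed: $bad"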

9. AIX

Error received when I was trying to check state of boot record

Hello, this is a test/lab LPAR. I recently installed and updated the SP/TL, and everything seems to be working fine (I ran all post-install checks). I checked the state of the boot record and received the following error/failed message. Can you please explain what this means? />ipl_varyon -i ... (1 Reply)
Discussion started by: dio34
1 Reply

10. UNIX for Beginners Questions & Answers

Error received

I have a program that I need to get done that gets the person's name and his grade, then prints it in this order: "name-grade-gradeletter". So I wrote this code:
#!/bin/bash
while :; do
read -p "Enter the person's name: " name
read -p "Enter the grade of the person: " grade
case $grade in )... (3 Replies)
Discussion started by: UniverseCloud
3 Replies
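The case statement in the excerpt is cut off; one plausible completion, assuming numeric grades on a 0-100 scale and an invented letter mapping (the boundaries are illustrative, not from the thread):

    #!/bin/bash
    while :; do
        read -p "Enter the person's name: " name
        [ -z "$name" ] && break                  # empty name ends the loop (added for usability)
        read -p "Enter the grade of the person: " grade
        case $grade in
            9[0-9]|100)        letter=A ;;
            8[0-9])            letter=B ;;
            7[0-9])            letter=C ;;
            6[0-9])            letter=D ;;
            [0-9]|[1-5][0-9])  letter=F ;;
            *) echo "invalid grade: $grade"; continue ;;
        esac
        echo "$name-$grade-$letter"
    done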
WWW::RobotRules(3)            User Contributed Perl Documentation            WWW::RobotRules(3)

NAME
    WWW::RobotRules - database of robots.txt-derived permissions

SYNOPSIS
    use WWW::RobotRules;
    my $rules = WWW::RobotRules->new('MOMspider/1.0');

    use LWP::Simple qw(get);

    {
        my $url = "http://some.place/robots.txt";
        my $robots_txt = get $url;
        $rules->parse($url, $robots_txt) if defined $robots_txt;
    }

    {
        my $url = "http://some.other.place/robots.txt";
        my $robots_txt = get $url;
        $rules->parse($url, $robots_txt) if defined $robots_txt;
    }

    # Now we can check if a URL is valid for those servers
    # whose "robots.txt" files we've gotten and parsed:
    if ($rules->allowed($url)) {
        $c = get $url;
        ...
    }

DESCRIPTION
    This module parses /robots.txt files as specified in "A Standard for Robot Exclusion", at
    <http://www.robotstxt.org/wc/norobots.html>. Webmasters can use the /robots.txt file to forbid
    conforming robots from accessing parts of their web site.

    The parsed files are kept in a WWW::RobotRules object, and this object provides methods to
    check if access to a given URL is prohibited. The same WWW::RobotRules object can be used for
    one or more parsed /robots.txt files on any number of hosts.

    The following methods are provided:

    $rules = WWW::RobotRules->new($robot_name)
        This is the constructor for WWW::RobotRules objects. The first argument given to new() is
        the name of the robot.

    $rules->parse($robot_txt_url, $content, $fresh_until)
        The parse() method takes as arguments the URL that was used to retrieve the /robots.txt
        file, and the contents of the file.

    $rules->allowed($uri)
        Returns TRUE if this robot is allowed to retrieve this URL.

    $rules->agent([$name])
        Get/set the agent name. NOTE: Changing the agent name will clear the robots.txt rules and
        expire times out of the cache.

ROBOTS.TXT
    The format and semantics of the "/robots.txt" file are as follows (this is an edited abstract
    of <http://www.robotstxt.org/wc/norobots.html>):

    The file consists of one or more records separated by one or more blank lines. Each record
    contains lines of the form

        <field-name>: <value>

    The field name is case insensitive. Text after the '#' character on a line is ignored during
    parsing; this is used for comments. The following <field-names> can be used:

    User-Agent
        The value of this field is the name of the robot the record is describing access policy
        for. If more than one User-Agent field is present, the record describes an identical
        access policy for more than one robot. At least one field needs to be present per record.
        If the value is '*', the record describes the default access policy for any robot that has
        not matched any of the other records.

        The User-Agent fields must occur before the Disallow fields. If a record contains a
        User-Agent field after a Disallow field, that constitutes a malformed record. This parser
        will assume that a blank line should have been placed before that User-Agent field, and
        will break the record into two. All the fields before the User-Agent field will constitute
        a record, and the User-Agent field will be the first field in a new record.

    Disallow
        The value of this field specifies a partial URL that is not to be visited. This can be a
        full path, or a partial path; any URL that starts with this value will not be retrieved.

    Unrecognized records are ignored.

ROBOTS.TXT EXAMPLES
    The following example "/robots.txt" file specifies that no robots should visit any URL
    starting with "/cyberworld/map/" or "/tmp/":

        User-agent: *
        Disallow: /cyberworld/map/ # This is an infinite virtual URL space
        Disallow: /tmp/            # these will soon disappear

    This example "/robots.txt" file specifies that no robots should visit any URL starting with
    "/cyberworld/map/", except the robot called "cybermapper":

        User-agent: *
        Disallow: /cyberworld/map/ # This is an infinite virtual URL space

        # Cybermapper knows where to go.
        User-agent: cybermapper
        Disallow:

    This example indicates that no robots should visit this site further:

        # go away
        User-agent: *
        Disallow: /

    This is an example of a malformed robots.txt file:

        # robots.txt for ancientcastle.example.com
        # I've locked myself away.
        User-agent: *
        Disallow: /
        # The castle is your home now, so you can go anywhere you like.
        User-agent: Belle
        Disallow: /west-wing/ # except the west wing!
        # It's good to be the Prince...
        User-agent: Beast
        Disallow:

    This file is missing the required blank lines between records. However, the intention is
    clear.

SEE ALSO
    LWP::RobotUA, WWW::RobotRules::AnyDBM_File

COPYRIGHT
    Copyright 1995-2009, Gisle Aas
    Copyright 1995, Martijn Koster

    This library is free software; you can redistribute it and/or modify it under the same terms
    as Perl itself.

perl v5.16.3                            2012-02-18                            WWW::RobotRules(3)