UNIX joins: facing issue while joining three files. Post 302950623 by rahul2662 on Monday 27th of July 2015, 07:55:18 AM
Hello RudiC, the files are not sorted; moreover, there are records for some hostnames that are present in sampleoutput1.txt but not in sampleoutput2.txt and sampleoutput3.txt. Also, the delimiter used is ";". I think the above won't work in this scenario. Please help.
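A minimal sketch of one way to do this with awk, which does not need sorted input: load sampleoutput2.txt and sampleoutput3.txt into lookup arrays keyed on hostname, then append the matched values (or empty fields) to every record of sampleoutput1.txt. The assumptions here are that the hostname is the first ";"-separated field in all three files and that a single value (field 2) is being pulled from files 2 and 3; adjust the field numbers to the real layout.

    awk -F';' -v OFS=';' '
        FNR == 1  { file++ }                       # count which input file we are reading
        file == 1 { two[$1]   = $2; next }         # sampleoutput2.txt: value per hostname (assumed layout)
        file == 2 { three[$1] = $2; next }         # sampleoutput3.txt: value per hostname (assumed layout)
        { print $0, ($1 in two ? two[$1] : ""), ($1 in three ? three[$1] : "") }
    ' sampleoutput2.txt sampleoutput3.txt sampleoutput1.txt

Hostnames that appear only in sampleoutput1.txt still come out, just with empty trailing fields, so no records are dropped the way they would be with a plain join on sorted input.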
 

10 More Discussions You Might Find Interesting

1. HP-UX

Facing an issue related to cronjob

Dear All, I am facing an issue related to a cron job; the case study is explained below: 1. I have a Java class named "DmCheckRenditionQueue.java" placed under "/cpc/documentum/product/5.2.5/tomcat/webapps/rendition". 2. When I use the command "/usr/openv/java/jre/bin/java -cp... (1 Reply)
Discussion started by: parindam
1 Reply

2. Shell Programming and Scripting

Facing issue in Solaris OS when running shell script from crontab

Hello, I have a shell script. It runs fine when I run it manually at the command prompt with ./script_file, but when the script is run from crontab it gives an error on each line. (2 Replies)
Discussion started by: mabrar
2 Replies
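A frequent cause of the "works at the prompt, fails from cron" pattern above is that cron runs the job with a minimal environment and a different working directory. A hedged sketch of a crontab entry that sidesteps both; /path/to/scripts is a hypothetical location and 2 AM is only an illustrative schedule:

    # cron provides a minimal environment, so cd to the script's directory,
    # set PATH on the command itself, and capture output for debugging
    0 2 * * * cd /path/to/scripts && PATH=/usr/bin:/bin:/usr/local/bin ./script_file > /tmp/script_file.log 2>&1

Checking the captured log usually shows whether the errors come from missing commands in PATH or from something else entirely.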

3. Shell Programming and Scripting

Facing issue while using xsltproc to parse XML in bash

I have written a bash script which opens a folder, reads all the *.xml files in it, and pulls the data that I need from XML tags. I am using xsltproc (my xsl name) (my xml folder location/*.xml) and running this in a for-each loop. The problem is that some XML files have special... (3 Replies)
Discussion started by: shivashankar.g
3 Replies
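For the xsltproc loop described above, a minimal sketch (the stylesheet name my.xsl and the folder path are placeholders, not from the thread) that processes each file individually, so one file with bad characters does not abort the whole run:

    for f in /path/to/xmlfolder/*.xml; do
        # xsltproc returns non-zero when an XML file fails to parse
        xsltproc my.xsl "$f" || echo "skipped (parse error): $f" >&2
    done

Running file by file also makes it easy to collect the names of the problem files and fix or re-encode them separately.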

4. Shell Programming and Scripting

Issue with Joining lines from two files

Hi, I have two text files whose data needs joining/concatenation. 'paste' works for this, but there is an issue when the number of rows in each file does not match. E.g. (main file) File1 has 20 rows, File2 has 30 rows. The command 'paste file1 file2 > file3' joins all lines. I want the... (4 Replies)
Discussion started by: sharath160
4 Replies
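For the mismatched row counts above: paste simply pads the shorter file with empty fields, so all 30 lines come out. If the goal is to stop at the shorter file's last row (an assumption about the desired output, since the post is truncated), an awk variant can do it:

    # read file1 (the 20-row file) into memory, then print only the matching first 20 rows of file2 alongside it
    awk 'NR==FNR { a[FNR]=$0; n=FNR; next } FNR<=n { print a[FNR] "\t" $0 }' file1 file2 > file3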

5. Programming

SQL Developer JOINS / GROUP BY issue.

I am having a nightmare with a certain piece of code; I have tried almost everything and just cannot see what the issue is. CREATE OR REPLACE VIEW TOP_EARNER_PER_LOCATION AS SELECT E.FIRST_NAME || ' ' || E.LAST_NAME AS EMPLOYEE_NAME, L.REGIONAL_GROUP AS REGIONAL_GROUP, ... (1 Reply)
Discussion started by: U_C_Dispatj
1 Reply

6. Shell Programming and Scripting

Facing Issue after configuring logrotate

Hi, I have a logrotate configuration which rotates a log every night one minute before midnight, but somehow it's not working and unfortunately does not show any error message either. I am sharing the code for the cron job as well as the conf file I am using, in case someone could help me with what's wrong with... (2 Replies)
Discussion started by: Neeryan
2 Replies

7. Infrastructure Monitoring

Facing Issue in Nagios 3.3

Hi, I have installed Nagios on Red Hat Linux: Nagios + Plugins + NRPE on Server A (the host server) and Nagios Plugins + NRPE on the remote Linux server (Red Hat Linux). When I run the command on the remote Linux host, it returns the NRPE version: /usr/local/nagios/libexec/check_nrpe -H localhost ... (1 Reply)
Discussion started by: manoj.solaris
1 Reply

8. Solaris

Facing issue while installing weblogic on Solaris 11

Hi, I am facing an issue while installing WebLogic on Solaris; it's giving me an invalid argument error. Solaris is installed on my VM. uname -a SunOS Vishal 5.10 Generic_137138-09 i86pc i386 i86pc Screenshot attached. (5 Replies)
Discussion started by: Vishal Baghla
5 Replies

9. SuSE

Facing issue configuring network

Please let me know how to configure the network in SUSE Linux. I have configured the network using ifup and NetworkManager; it gives no error but is not working. I am using SUSE Linux 11.0 SP3 and have checked that network connectivity is working. (0 Replies)
Discussion started by: manoj.solaris
0 Replies

10. UNIX for Beginners Questions & Answers

Issue with awk when joining two files where a field has a '-' hyphen

Dear Community, I need to join two files but am facing issues. The 1st file has multiple columns; the primary (1st) column has unique values. Some of the other columns contain non-ASCII characters as well (another language). Example file below: 1-1001JRL,BiRecurring... (5 Replies)
Discussion started by: mystition
5 Replies
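For the last discussion above, a hyphen in a key such as 1-1001JRL is harmless as long as the key is used as a plain awk array index rather than inside a regular expression. A minimal sketch, assuming both files are comma-separated, keyed on column 1, and that the second file contributes its second column (a hypothetical layout, not from the thread):

    awk -F',' -v OFS=',' '
        NR == FNR { val[$1] = $2; next }   # second file: remember value per key
        $1 in val { print $0, val[$1] }    # first file: append the matched value
    ' file2 file1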
WWW::RobotRules(3pm)                    User Contributed Perl Documentation                    WWW::RobotRules(3pm)

NAME
       WWW::RobotRules - database of robots.txt-derived permissions

SYNOPSIS
       use WWW::RobotRules;
       my $rules = WWW::RobotRules->new('MOMspider/1.0');

       use LWP::Simple qw(get);

       {
          my $url = "http://some.place/robots.txt";
          my $robots_txt = get $url;
          $rules->parse($url, $robots_txt) if defined $robots_txt;
       }

       {
          my $url = "http://some.other.place/robots.txt";
          my $robots_txt = get $url;
          $rules->parse($url, $robots_txt) if defined $robots_txt;
       }

       # Now we can check if a URL is valid for those servers
       # whose "robots.txt" files we've gotten and parsed:
       if($rules->allowed($url)) {
           $c = get $url;
           ...
       }

DESCRIPTION
       This module parses /robots.txt files as specified in "A Standard for Robot Exclusion", at
       <http://www.robotstxt.org/wc/norobots.html>. Webmasters can use the /robots.txt file to forbid
       conforming robots from accessing parts of their web site.

       The parsed files are kept in a WWW::RobotRules object, and this object provides methods to check if
       access to a given URL is prohibited. The same WWW::RobotRules object can be used for one or more
       parsed /robots.txt files on any number of hosts.

       The following methods are provided:

       $rules = WWW::RobotRules->new($robot_name)
           This is the constructor for WWW::RobotRules objects. The first argument given to new() is the
           name of the robot.

       $rules->parse($robot_txt_url, $content, $fresh_until)
           The parse() method takes as arguments the URL that was used to retrieve the /robots.txt file,
           and the contents of the file.

       $rules->allowed($uri)
           Returns TRUE if this robot is allowed to retrieve this URL.

       $rules->agent([$name])
           Get/set the agent name. NOTE: Changing the agent name will clear the robots.txt rules and expire
           times out of the cache.

ROBOTS.TXT
       The format and semantics of the "/robots.txt" file are as follows (this is an edited abstract of
       <http://www.robotstxt.org/wc/norobots.html>):

       The file consists of one or more records separated by one or more blank lines. Each record contains
       lines of the form

         <field-name>: <value>

       The field name is case insensitive. Text after the '#' character on a line is ignored during
       parsing. This is used for comments. The following <field-names> can be used:

       User-Agent
           The value of this field is the name of the robot the record is describing access policy for. If
           more than one User-Agent field is present the record describes an identical access policy for
           more than one robot. At least one field needs to be present per record. If the value is '*', the
           record describes the default access policy for any robot that has not matched any of the other
           records.

           The User-Agent fields must occur before the Disallow fields. If a record contains a User-Agent
           field after a Disallow field, that constitutes a malformed record. This parser will assume that
           a blank line should have been placed before that User-Agent field, and will break the record
           into two. All the fields before the User-Agent field will constitute a record, and the
           User-Agent field will be the first field in a new record.

       Disallow
           The value of this field specifies a partial URL that is not to be visited. This can be a full
           path, or a partial path; any URL that starts with this value will not be retrieved.

       Unrecognized records are ignored.

ROBOTS.TXT EXAMPLES
       The following example "/robots.txt" file specifies that no robots should visit any URL starting with
       "/cyberworld/map/" or "/tmp/":

         User-agent: *
         Disallow: /cyberworld/map/ # This is an infinite virtual URL space
         Disallow: /tmp/ # these will soon disappear

       This example "/robots.txt" file specifies that no robots should visit any URL starting with
       "/cyberworld/map/", except the robot called "cybermapper":

         User-agent: *
         Disallow: /cyberworld/map/ # This is an infinite virtual URL space

         # Cybermapper knows where to go.
         User-agent: cybermapper
         Disallow:

       This example indicates that no robots should visit this site further:

         # go away
         User-agent: *
         Disallow: /

       This is an example of a malformed robots.txt file.

         # robots.txt for ancientcastle.example.com
         # I've locked myself away.
         User-agent: *
         Disallow: /
         # The castle is your home now, so you can go anywhere you like.
         User-agent: Belle
         Disallow: /west-wing/ # except the west wing!
         # It's good to be the Prince...
         User-agent: Beast
         Disallow:

       This file is missing the required blank lines between records. However, the intention is clear.

SEE ALSO
       LWP::RobotUA, WWW::RobotRules::AnyDBM_File

COPYRIGHT
         Copyright 1995-2009, Gisle Aas
         Copyright 1995, Martijn Koster

       This library is free software; you can redistribute it and/or modify it under the same terms as
       Perl itself.

perl v5.10.1                                        2011-03-13                             WWW::RobotRules(3pm)