Full Discussion: Make file compilation
Post 302209297 by Franklin52 on Thursday 26th of June 2008 01:30:18 PM
Bumping questions is not allowed! Please read the rules:

https://www.unix.com/unix-dummies-que...om-forums.html

The moderator team.
 

10 More Discussions You Might Find Interesting

1. UNIX for Advanced & Expert Users

postfix compilation issue with make

Hi guys, I'm having trouble compiling postfix on Solaris (SunOS 5.10 Generic_118833-33 sun4u sparc SUNW, Sun-Fire-V440). I'm trying to compile postfix-2.4.5; which make reports /usr/ccs/bin/make, and that's what the profile puts first. When I launch make install clean it replies make: Fatal... (2 Replies)
Discussion started by: moustik
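A cheap first check (a hedged suggestion, not from this thread): /usr/ccs/bin/make is the old Solaris make, and software written against GNU make syntax often dies under it with terse "make: Fatal error" messages. Confirming which make the profile resolves to, and retrying with GNU make if one is installed, narrows the problem quickly. The /usr/sfw/bin/gmake path below is a common Solaris 10 location, an assumption rather than something from the post.

    type make                              # which make does the profile resolve to?
    ls -l /usr/sfw/bin/gmake 2>/dev/null   # common GNU make location on Solaris 10
    /usr/sfw/bin/gmake install             # retry the failing target under GNU make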

2. Linux

Make file compilation

Hi, I am getting the messages
/usr/bin/ld: skipping incompatible /opt/sybase/01/OCS-12_5/lib/libblk.a when searching for -lblk
/usr/bin/ld: cannot find -lblk
collect2: ld returned 1 exit status
make: *** Error 1
when I run make on the makefile. Any idea about this issue? Let... (1 Reply)
Discussion started by: dhanamurthy
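"skipping incompatible ... when searching for -lblk" normally means the archive's word size (32- vs 64-bit) does not match the objects being linked, so ld skips it and then fails to find any other libblk. A sketch of how to confirm and work around that, using the Sybase path from the post (myprog.o is a placeholder object name):

    # show the object format of the archive members (elf32 vs elf64)
    objdump -f /opt/sybase/01/OCS-12_5/lib/libblk.a | grep 'file format'
    # then make the link match the library, e.g. force 32-bit output:
    gcc -m32 myprog.o -L/opt/sybase/01/OCS-12_5/lib -lblk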

3. Shell Programming and Scripting

compare two files and make 1st file same as 2nd file

I am trying to compare two files and make changes wherever they differ. For example:
Contents of file1:
IP=192.165.89.11
NM=255.255.0.0
GW=192.165.89.1
Contents of file2:
IP=192.165.89.11
NM=255.255.255.255
GW=192.165.89.1
NOTE HERE THAT NM IS DIFFERENT. So i want the changes... (6 Replies)
Discussion started by: pradeepreddy
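Because both files hold NAME=value lines, one workable approach (a sketch, not necessarily what the thread settled on) is to let awk read file2 first, remember each value by its name, then rewrite file1 with any differing values replaced:

    # pass 1 (NR==FNR): cache file2's values by name; pass 2: rewrite file1
    awk -F= 'NR == FNR { v[$1] = $2; next }
             $1 in v   { $2 = v[$1] }
             { print $1 "=" $2 }' file2 file1 > file1.new
    mv file1.new file1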

4. UNIX for Dummies Questions & Answers

File compilation error on AIX

Hi All, I am successfully able to compile the file with gcc. When compiling the file with xlc, I am facing the following issues:
1) 1540-0836 (S) The #include file <multimap.h> is not found.
2) 1540-0836 (S) The #include file <pair.h> is not found.
3) ld: 0706-012 The... (0 Replies)
Discussion started by: Prajakta

5. AIX

ProC and other C file compilation problem on AIX

I am linking my compiled proC file with other C files and getting the following error:
ld: 0711-711 ERROR: Input file /opt/orabase/oracle/product/10.2.0/db_1/lib/libirc.a is empty. The file is being ignored.
I used the following command to compile my proC code: proc iname=dbConnect.pc code=ANSI_C... (0 Replies)
Discussion started by: amit.singhal

6. Programming

make utility: how to get the makefile name inside of the makefile?

How can I get the current makefile name in a makefile? So, if I run make with a specified file, make -f target.mak, is it possible to get 'target' inside of that 'target.mak', taken from the file name? (2 Replies)
Discussion started by: alex_5161
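With GNU make there is a direct answer: the built-in MAKEFILE_LIST variable accumulates the names of all makefiles read so far, so its last word, captured near the top of the file before any include, is the makefile currently being parsed. A minimal sketch of target.mak:

    # target.mak, invoked as: make -f target.mak show
    THIS_MAKEFILE := $(lastword $(MAKEFILE_LIST))
    STEM          := $(basename $(notdir $(THIS_MAKEFILE)))   # yields "target"

    show:
    	@echo "makefile: $(THIS_MAKEFILE)  stem: $(STEM)"

The recipe line under show: must begin with a literal tab, as in any makefile.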

7. Programming

Header file compilation using gcc in Sparc Solaris

I am facing a problem while migrating C++ code from Linux to Solaris. On Linux the code compiles absolutely fine with the GCC compiler, but when I use the same on Solaris it complains:
bash-3.1$ gcc LibSip.h
gcc: Compilation of header file requested
The same command is working fine in... (2 Replies)
Discussion started by: mrupesh74
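Handing gcc a .h file directly asks it to (pre)compile the header itself, which this Solaris gcc build refuses with "Compilation of header file requested". If the aim is just to check that the header compiles, forcing gcc to treat it as ordinary C++ source sidesteps the issue (a hedged workaround, not the thread's recorded resolution; check.cpp is a throwaway name):

    # treat the header as C++ source and stop after syntax checking
    gcc -x c++ -fsyntax-only LibSip.h
    # or compile a real translation unit that includes it
    echo '#include "LibSip.h"' > check.cpp && g++ -c check.cpp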

8. Solaris

Error in compilation of cxx file (Sun C++ 5.9 SunOS_sparc 2007/05/03)

Hi All, when I am compiling the cxx file on a system with compiler version (CC: Sun C++ 5.9 SunOS_sparc 2007/05/03), I am facing the following error:
/opt/SUNWspro/bin/CC -dy -misalign -xcode=abs64 -xarch=v9 -D__EXTENSIONS__ -Dsun4_R5=1 -I. -Isun4_R5_v... (0 Replies)
Discussion started by: ash_bit2k2

9. Shell Programming and Scripting

Make the file name and fetch a few things from a log file

Hello All, I am working on a script where I need to fetch values from a log file. The log file is created with a different name each time, but a few things are common:
DEV_INFOMGT161_MULTI_PTC_BLD01.Stage_All_to_stp2perf1.042312114644.log
STP_12_02_01_00_RC01.Stage_stp-domain_to_stp2perf2.042312041739.log
... (2 Replies)
Discussion started by: anuragpgtgerman
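The example names share a dot-separated shape, roughly BUILDNAME.STAGE.TIMESTAMP.log, so the common pieces can be pulled out without knowing the build name in advance. A sketch under that assumption, using the two names from the post:

    for f in DEV_INFOMGT161_MULTI_PTC_BLD01.Stage_All_to_stp2perf1.042312114644.log \
             STP_12_02_01_00_RC01.Stage_stp-domain_to_stp2perf2.042312041739.log
    do
        build=${f%%.*}                                    # text before the first '.'
        stamp=$(echo "$f" | awk -F. '{ print $(NF-1) }')  # next-to-last dot field
        echo "build=$build  stamp=$stamp"
    done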

10. Shell Programming and Scripting

Read csv file, convert the data and make one text file in UNIX shell scripting

I have input data that looks like this, which is part of a csv file:
7,1265,76548,"0102:04"
8,1266,76545,"0112:04"
The output data should look like this and will be part of a text file:
7|1265000 |7654899 |A|
8|12660000 |76545999 |B|
The logic behind the... (6 Replies)
Discussion started by: RJG
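The post's padding and A/B flag rules are truncated, so they cannot be reconstructed here; what can be sketched is the mechanical part: split on commas, strip the quotes from the fourth field, and re-emit pipe-delimited records (input.csv and output.txt are placeholder names, and the field transforms are deliberately left as pass-throughs):

    # skeleton only: the real padding/flag logic is cut off in the post
    awk -F, '{
        gsub(/"/, "", $4)                        # "0102:04" -> 0102:04
        printf "%s|%s|%s|%s|\n", $1, $2, $3, $4  # pipe-delimited record
    }' input.csv > output.txt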
WWW::RobotRules(3)					User Contributed Perl Documentation					WWW::RobotRules(3)

NAME
       WWW::RobotRules - database of robots.txt-derived permissions

SYNOPSIS
       use WWW::RobotRules;
       my $rules = WWW::RobotRules->new('MOMspider/1.0');

       use LWP::Simple qw(get);

       {
           my $url = "http://some.place/robots.txt";
           my $robots_txt = get $url;
           $rules->parse($url, $robots_txt) if defined $robots_txt;
       }

       {
           my $url = "http://some.other.place/robots.txt";
           my $robots_txt = get $url;
           $rules->parse($url, $robots_txt) if defined $robots_txt;
       }

       # Now we can check if a URL is valid for those servers
       # whose "robots.txt" files we've gotten and parsed:
       if ($rules->allowed($url)) {
           $c = get $url;
           ...
       }

DESCRIPTION
       This module parses /robots.txt files as specified in "A Standard for
       Robot Exclusion", at <http://www.robotstxt.org/wc/norobots.html>.
       Webmasters can use the /robots.txt file to forbid conforming robots
       from accessing parts of their web site.

       The parsed files are kept in a WWW::RobotRules object, and this object
       provides methods to check if access to a given URL is prohibited. The
       same WWW::RobotRules object can be used for one or more parsed
       /robots.txt files on any number of hosts.

       The following methods are provided:

       $rules = WWW::RobotRules->new($robot_name)
           This is the constructor for WWW::RobotRules objects. The first
           argument given to new() is the name of the robot.

       $rules->parse($robot_txt_url, $content, $fresh_until)
           The parse() method takes as arguments the URL that was used to
           retrieve the /robots.txt file, and the contents of the file.

       $rules->allowed($uri)
           Returns TRUE if this robot is allowed to retrieve this URL.

       $rules->agent([$name])
           Get/set the agent name. NOTE: Changing the agent name will clear
           the robots.txt rules and expire times out of the cache.

ROBOTS.TXT
       The format and semantics of the "/robots.txt" file are as follows
       (this is an edited abstract of
       <http://www.robotstxt.org/wc/norobots.html>):

       The file consists of one or more records separated by one or more
       blank lines. Each record contains lines of the form

         <field-name>: <value>

       The field name is case insensitive. Text after the '#' character on a
       line is ignored during parsing. This is used for comments. The
       following <field-names> can be used:

       User-Agent
          The value of this field is the name of the robot the record is
          describing access policy for.

          If more than one User-Agent field is present the record describes
          an identical access policy for more than one robot. At least one
          field needs to be present per record.

          If the value is '*', the record describes the default access
          policy for any robot that has not matched any of the other
          records.

          The User-Agent fields must occur before the Disallow fields. If a
          record contains a User-Agent field after a Disallow field, that
          constitutes a malformed record. This parser will assume that a
          blank line should have been placed before that User-Agent field,
          and will break the record into two. All the fields before the
          User-Agent field will constitute a record, and the User-Agent
          field will be the first field in a new record.

       Disallow
          The value of this field specifies a partial URL that is not to be
          visited. This can be a full path, or a partial path; any URL that
          starts with this value will not be retrieved.

       Unrecognized records are ignored.

ROBOTS.TXT EXAMPLES
       The following example "/robots.txt" file specifies that no robots
       should visit any URL starting with "/cyberworld/map/" or "/tmp/":

         User-agent: *
         Disallow: /cyberworld/map/ # This is an infinite virtual URL space
         Disallow: /tmp/ # these will soon disappear

       This example "/robots.txt" file specifies that no robots should visit
       any URL starting with "/cyberworld/map/", except the robot called
       "cybermapper":

         User-agent: *
         Disallow: /cyberworld/map/ # This is an infinite virtual URL space

         # Cybermapper knows where to go.
         User-agent: cybermapper
         Disallow:

       This example indicates that no robots should visit this site further:

         # go away
         User-agent: *
         Disallow: /

       This is an example of a malformed robots.txt file.

         # robots.txt for ancientcastle.example.com
         # I've locked myself away.
         User-agent: *
         Disallow: /
         # The castle is your home now, so you can go anywhere you like.
         User-agent: Belle
         Disallow: /west-wing/ # except the west wing!
         # It's good to be the Prince...
         User-agent: Beast
         Disallow:

       This file is missing the required blank lines between records.
       However, the intention is clear.

SEE ALSO
       LWP::RobotUA, WWW::RobotRules::AnyDBM_File

COPYRIGHT
       Copyright 1995-2009, Gisle Aas
       Copyright 1995, Martijn Koster

       This library is free software; you can redistribute it and/or modify
       it under the same terms as Perl itself.

perl v5.16.3                      2012-02-18                WWW::RobotRules(3)