Find text that is different in two files
Post 302945260 by cmccabe on Wednesday 27th of May 2015, 04:04:06 PM
Works perfectly... thank you!
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Find files containing text

How do I find the files containing some text? E.g., I want to find all the files that contain the word 'hello'. Running grep hello * will only search the current directory. How do I search the entire system? Thanks for the help in advance... (5 Replies)
Discussion started by: sushrut
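For anyone landing here with the same question, a minimal sketch, assuming GNU grep and read access to the directories searched (some paths may need root):

  # Recursively list every file under / that contains the word 'hello'
  grep -rl 'hello' / 2>/dev/null

On systems whose grep lacks -r, the classic combination find / -type f -exec grep -l 'hello' {} + does the same job.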

2. UNIX for Dummies Questions & Answers

How to find text in jar and zip files?

Hi, I have a classes directory, and in it I have jar and zip files. I need to find which zip or jar file "Param.class" is in. (1 Reply)
Discussion started by: redlotus72
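A minimal sketch of one common approach, assuming unzip is installed (jar files are zip archives, so unzip -l can list the contents of both):

  # Print the name of each archive whose listing mentions Param.class
  for f in *.jar *.zip; do
    unzip -l "$f" 2>/dev/null | grep -q 'Param.class' && echo "$f"
  done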

3. Shell Programming and Scripting

find files where text case is different

I need to search a directory for files that have certain text in the file name. I use the following command to do that successfully:
find /abc/indicator -name '*midday*.ind'
The problem is some file names are lower case, some mixed case and some upper case. Is there a way to do the find... (5 Replies)
Discussion started by: schipper
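A one-line sketch: most find implementations support -iname, the case-insensitive variant of -name:

  find /abc/indicator -iname '*midday*.ind'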

4. UNIX for Dummies Questions & Answers

sorting files with find command before sending to text file

I need help with my script... I am supposed to grab files within a certain date range. I have done that already using the touch and find commands (found in other threads):
touch -d "$date_start" ./tmp1
touch -d "$date_end" ./tmp2
find "$data_location" -maxdepth 1 -newer ./tmp1 !... (6 Replies)
Discussion started by: deking
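Continuing that approach, a hedged sketch that also sorts the matches by modification time before writing them out; it assumes GNU find (for -printf), file names without embedded newlines, and a hypothetical output name files_in_range.txt:

  touch -d "$date_start" ./tmp1
  touch -d "$date_end" ./tmp2
  # %T@ prints the mtime as epoch seconds, so a numeric sort orders the files
  find "$data_location" -maxdepth 1 -type f -newer ./tmp1 ! -newer ./tmp2 -printf '%T@ %p\n' |
    sort -n | cut -d' ' -f2- > files_in_range.txt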

5. Shell Programming and Scripting

Bash snippet to find files based on a text file?

Evening all. I'm having a terrible time with a script I've been working on for a few days now... Say I have a text file named top10song.tm2, with the following in it:
kernkraft 400
Imagine
i kissed a girl
Thriller
animals
hallelujah
paint it black
psychosocial
Oi to the world... (14 Replies)
Discussion started by: DJ Charlie
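A minimal sketch of one way to approach it, assuming each line of top10song.tm2 is a search term and the files live under ~/music (a hypothetical path):

  # Look for files whose names contain each song title, case-insensitively
  while IFS= read -r song; do
    find ~/music -type f -iname "*$song*"
  done < top10song.tm2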

6. Shell Programming and Scripting

Find text containing paths and replace with a string in all the python files

I have 100+ Python files in a single directory. I need to replace a specific path occurrence with a variable name. The find and replace strings are:
Findstring: "projects\\Debugger\\debugger_dp8051_01\\debugger_dp8051_01.cywrk"
Replacestring: self.projpath
I tried... (5 Replies)
Discussion started by: noorsam
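A hedged sketch using a fixed-string replacement, so the backslashes need no regex escaping; it assumes Perl is available and edits every .py file in the current directory in place, keeping .bak backups:

  export FIND='"projects\\Debugger\\debugger_dp8051_01\\debugger_dp8051_01.cywrk"'
  export REPL='self.projpath'
  # \Q...\E quotes the interpolated search text literally, backslashes included
  perl -pi.bak -e 's/\Q$ENV{FIND}\E/$ENV{REPL}/g' *.py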

7. Shell Programming and Scripting

Find and add/replace text in text files

Hi. I would like to have the experts' help on the action below. I have text files in which page numbers exist in a form like
PAGE : 1
PAGE : 2
PAGE : 3
and so on; there is other text too. I would like to know if it is possible to check the last occurrence of Page... (6 Replies)
Discussion started by: lodhi1978
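A minimal sketch of one way to add text after the last "PAGE :" line, assuming GNU sed (for -i and the one-line append) and hypothetical names report.txt and END OF REPORT:

  # Line number of the last line matching 'PAGE :'
  last=$(grep -n 'PAGE *:' report.txt | tail -1 | cut -d: -f1)
  # Append a new line right after it, editing the file in place
  [ -n "$last" ] && sed -i "${last}a END OF REPORT" report.txt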

8. Shell Programming and Scripting

How to find text in files without using the word itself but the assigned variable of it

I'm having a problem finding a specific word in a file without using the word itself as the search term, instead using the assigned variable $passwd. What command should I use to find the value of $passwd written in a different script? How do I use the command to print the value in this... (7 Replies)
Discussion started by: jenimesh19
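A minimal sketch, assuming the other script is a plain variable-assignment file that is safe to source (config.sh and target.log are hypothetical names):

  . ./config.sh                      # defines passwd=... in this shell
  grep -F -- "$passwd" target.log    # -F treats the value as a fixed string
  echo "passwd is: $passwd"

Sourcing executes the other script, so this is only appropriate when it contains nothing but assignments.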

9. Shell Programming and Scripting

Find and replace using 2 text files as arrays.

Here's the nonfunctional code I have so far:
#!/bin/bash
searchFor=(`cat filea.txt`)
replaceWith=(`cat fileb.txt`)
myMax=${#searchFor}
myCounter=1
while ; do
sed -i 's/${$searchFor}/${$replaceWith}/g'
done
The goal is to use each line in filea.txt as a search term, and each line... (2 Replies)
Discussion started by: Erulisseuiin
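A corrected sketch of that idea, assuming bash 4+ (for mapfile), that the two files have the same number of lines, that the lines contain no sed metacharacters, and a hypothetical file to edit named target.txt:

  #!/bin/bash
  # Read the files line by line so multi-word lines stay intact
  mapfile -t searchFor < filea.txt
  mapfile -t replaceWith < fileb.txt
  for i in "${!searchFor[@]}"; do
    # Pair line i of filea.txt with line i of fileb.txt
    sed -i "s/${searchFor[$i]}/${replaceWith[$i]}/g" target.txt
  done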

10. Linux

Search only text files with 'find' command?

I've been using this to search an entire directory recursively for a specific phrase in my code (html, css, php, javascript, etc.):
find dir_name -type f -exec grep -l "phrase" {} \;
The problem is that it searches ALL files in the directory 'dir_name', even binary ones such as large JPEG... (2 Replies)
Discussion started by: Collider
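A minimal sketch, assuming GNU grep: -I makes grep skip binary files, and -r replaces the find wrapper entirely:

  grep -rIl "phrase" dir_name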
WWW::RobotRules(3) - User Contributed Perl Documentation

NAME
WWW::RobotRules - database of robots.txt-derived permissions

SYNOPSIS
  use WWW::RobotRules;
  my $rules = WWW::RobotRules->new('MOMspider/1.0');

  use LWP::Simple qw(get);

  {
    my $url = "http://some.place/robots.txt";
    my $robots_txt = get $url;
    $rules->parse($url, $robots_txt) if defined $robots_txt;
  }

  {
    my $url = "http://some.other.place/robots.txt";
    my $robots_txt = get $url;
    $rules->parse($url, $robots_txt) if defined $robots_txt;
  }

  # Now we can check if a URL is valid for those servers
  # whose "robots.txt" files we've gotten and parsed:
  if ($rules->allowed($url)) {
    $c = get $url;
    ...
  }

DESCRIPTION
This module parses /robots.txt files as specified in "A Standard for Robot Exclusion", at <http://www.robotstxt.org/wc/norobots.html>. Webmasters can use the /robots.txt file to forbid conforming robots from accessing parts of their web site.

The parsed files are kept in a WWW::RobotRules object, and this object provides methods to check if access to a given URL is prohibited. The same WWW::RobotRules object can be used for one or more parsed /robots.txt files on any number of hosts.

The following methods are provided:

$rules = WWW::RobotRules->new($robot_name)
    This is the constructor for WWW::RobotRules objects. The first argument given to new() is the name of the robot.

$rules->parse($robot_txt_url, $content, $fresh_until)
    The parse() method takes as arguments the URL that was used to retrieve the /robots.txt file, and the contents of the file.

$rules->allowed($uri)
    Returns TRUE if this robot is allowed to retrieve this URL.

$rules->agent([$name])
    Get/set the agent name. NOTE: Changing the agent name will clear the robots.txt rules and expire times out of the cache.

ROBOTS.TXT
The format and semantics of the "/robots.txt" file are as follows (this is an edited abstract of <http://www.robotstxt.org/wc/norobots.html>):

The file consists of one or more records separated by one or more blank lines. Each record contains lines of the form

    <field-name>: <value>

The field name is case insensitive. Text after the '#' character on a line is ignored during parsing. This is used for comments. The following <field-names> can be used:

User-Agent
    The value of this field is the name of the robot the record is describing access policy for. If more than one User-Agent field is present the record describes an identical access policy for more than one robot. At least one field needs to be present per record. If the value is '*', the record describes the default access policy for any robot that has not matched any of the other records.

    The User-Agent fields must occur before the Disallow fields. If a record contains a User-Agent field after a Disallow field, that constitutes a malformed record. This parser will assume that a blank line should have been placed before that User-Agent field, and will break the record into two. All the fields before the User-Agent field will constitute a record, and the User-Agent field will be the first field in a new record.

Disallow
    The value of this field specifies a partial URL that is not to be visited. This can be a full path, or a partial path; any URL that starts with this value will not be retrieved.

Unrecognized records are ignored.

ROBOTS.TXT EXAMPLES
The following example "/robots.txt" file specifies that no robots should visit any URL starting with "/cyberworld/map/" or "/tmp/":

    User-agent: *
    Disallow: /cyberworld/map/ # This is an infinite virtual URL space
    Disallow: /tmp/ # these will soon disappear

This example "/robots.txt" file specifies that no robots should visit any URL starting with "/cyberworld/map/", except the robot called "cybermapper":

    User-agent: *
    Disallow: /cyberworld/map/ # This is an infinite virtual URL space

    # Cybermapper knows where to go.
    User-agent: cybermapper
    Disallow:

This example indicates that no robots should visit this site further:

    # go away
    User-agent: *
    Disallow: /

This is an example of a malformed robots.txt file:

    # robots.txt for ancientcastle.example.com
    # I've locked myself away.
    User-agent: *
    Disallow: /
    # The castle is your home now, so you can go anywhere you like.
    User-agent: Belle
    Disallow: /west-wing/ # except the west wing!
    # It's good to be the Prince...
    User-agent: Beast
    Disallow:

This file is missing the required blank lines between records. However, the intention is clear.

SEE ALSO
LWP::RobotUA, WWW::RobotRules::AnyDBM_File