Replace a file reading from another file
Post 302624017 by mac4rfree on 04-15-2012 at 02:00 PM
My source file is 4700 lines long.

Each line in change.txt will appear at most 10 times in the source file, and change.txt has 45 words that need to be replaced.

Hope that helps.
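
For a job like this, here is a minimal sketch, assuming change.txt holds whitespace-separated old/new pairs, one per line, and that the source file is called source.txt (the thread doesn't show the exact file names or format):

Code:
# Build a replacement map from change.txt (field 1 = old word,
# field 2 = replacement), then apply it to every line of source.txt.
# Note: gsub() treats the old word as a regular expression, so words
# containing regex metacharacters would need escaping first.
awk 'NR == FNR { map[$1] = $2; next }
     { for (old in map) gsub(old, map[old]) } 1
' change.txt source.txt > source.new

With only 45 words appearing about 10 times each across 4700 lines, a single awk pass like this is far cheaper than running sed over the whole file once per word.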
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Reading file names from a file and executing the respective files from a shell script

Hi, how can I dynamically read file names from a list file and execute them from a single shell script? Please help, it's urgent. Thanks in advance. (4 Replies)
Discussion started by: anushilrai

2. UNIX for Advanced & Expert Users

Reading a file and writing the file name to a param file.

Hi all, not sure if this belongs in the dummies section or advanced. I'm looking for a script, if someone has done something like this. I have a list of files - adc_earnedpoints.20070630.txt adc_earnedpoints.20070707.txt adc_earnedpoints.20070714.txt adc_earnedpoints.20070721.txt... (1 Reply)
Discussion started by: thebeginer

3. Shell Programming and Scripting

Search & Replace in Multiple Files by reading a input file

Hi, I have a folder which contains multiple config.xml files and one input file; please see the format below. The config files look like: Code: <application name="SAMPLE-ARCHIVE"> <NVPairs name="Global Variables"> <NameValuePair> ... (0 Replies)
Discussion started by: haiksuresh

4. Shell Programming and Scripting

Searching for Log / Bad file and Reading and writing to a flat file

I need to develop a Unix shell script for the requirement below and need your assistance: 1) search for the file.log and file.bad files in a directory and read them 2) pull out "Load_Start_Time", "Data_File_Name", "Error_Type" from the log file 4) concatenate each row from the bad file as... (3 Replies)
Discussion started by: mlpathir

5. Programming

Conditional replace after reading in a file

I need to read the contents of a file. Then I need to grep for a keyword and replace part of the grepped line based on the condition of the previous and present lines. Example input file: V { port1 = P; port2 = 0; shift_port = P0; /* if next shift_port is P0 I need... (7 Replies)
Discussion started by: naveen@

6. Shell Programming and Scripting

Search & Replace in Multiple Files by reading a input file

I have an environment property file which contains: Input file: value1 = url1 value2 = url2 value3 = url3 and so on. I need to search all *.xml files under a directory for value1 and replace it with url1. The same has to be done for all values mentioned in the input file. I need a script in Unix bash... (7 Replies)
Discussion started by: Shamkamde

7. UNIX for Dummies Questions & Answers

Reading an XML file and printing the values into a text file column-wise?

Hi guys, I need help reading an XML file and printing the values into a text file using a Linux shell script, as per the XML file below: <sequence> <Filename>aldorzum.doc</Filename> <DivisionCode>US</DivisionCode> <ContentType>Template</ContentType> <ProductCode>VIMZIM</ProductCode> </sequence>... (4 Replies)
Discussion started by: sravanreddy

8. Emergency UNIX and Linux Support

sed replace file contents by reading from another file

Hello, my input file1 is tab-delimited, like this: chr1 mm10_knownGene stop_codon 3216022 3216024 0.000000 - . gene_id "uc007aeu.1"; transcript_id "uc007aeu.1"; chr1 mm10_knownGene CDS 3216025 3216968 0.000000 - 2 gene_id "uc007aeu.1"; transcript_id "uc007aeu.1"; ... (3 Replies)
Discussion started by: jacobs.smith

9. Shell Programming and Scripting

Replace query by reading the file

Hi guys, I have the below file, which holds data like this: file.txt name,id,flag apple,1,Y apple,2,N mango,1,Y mango,2,Y I need to read the above file and frame a query like this: hive -s -e "create apple_view as select 1 from main_table;" hive -s -e "create mango_view as select... (11 Replies)
Discussion started by: rohit_shinez

10. Shell Programming and Scripting

ksh Script, Reading A File, Grepping A File Contents In Another File

So I'm stumped. First... APOLOGIES... my work is offline in an office that has zero internet connectivity, as required by our client. If need be, I could print out my script attempts and retype them here. But on the off chance... here goes. I have a text file (file_source) of terms, each line... (3 Replies)
Discussion started by: Brusimm
WWW::RobotRules(3)

NAME
       WWW::RobotRules - database of robots.txt-derived permissions

SYNOPSIS
       use WWW::RobotRules;
       my $rules = WWW::RobotRules->new('MOMspider/1.0');

       use LWP::Simple qw(get);

       {
           my $url = "http://some.place/robots.txt";
           my $robots_txt = get $url;
           $rules->parse($url, $robots_txt) if defined $robots_txt;
       }

       {
           my $url = "http://some.other.place/robots.txt";
           my $robots_txt = get $url;
           $rules->parse($url, $robots_txt) if defined $robots_txt;
       }

       # Now we can check if a URL is valid for those servers
       # whose "robots.txt" files we've gotten and parsed:
       if ($rules->allowed($url)) {
           $c = get $url;
           ...
       }

DESCRIPTION
       This module parses /robots.txt files as specified in "A Standard for
       Robot Exclusion", at <http://www.robotstxt.org/wc/norobots.html>.

       Webmasters can use the /robots.txt file to forbid conforming robots
       from accessing parts of their web site. The parsed files are kept in
       a WWW::RobotRules object, and this object provides methods to check
       if access to a given URL is prohibited. The same WWW::RobotRules
       object can be used for one or more parsed /robots.txt files on any
       number of hosts.

       The following methods are provided:

       $rules = WWW::RobotRules->new($robot_name)
           This is the constructor for WWW::RobotRules objects. The first
           argument given to new() is the name of the robot.

       $rules->parse($robot_txt_url, $content, $fresh_until)
           The parse() method takes as arguments the URL that was used to
           retrieve the /robots.txt file, and the contents of the file.

       $rules->allowed($uri)
           Returns TRUE if this robot is allowed to retrieve this URL.

       $rules->agent([$name])
           Get/set the agent name. NOTE: Changing the agent name will clear
           the robots.txt rules and expire times out of the cache.

ROBOTS.TXT
       The format and semantics of the "/robots.txt" file are as follows
       (this is an edited abstract of
       <http://www.robotstxt.org/wc/norobots.html>):

       The file consists of one or more records separated by one or more
       blank lines. Each record contains lines of the form

           <field-name>: <value>

       The field name is case insensitive. Text after the '#' character on
       a line is ignored during parsing. This is used for comments. The
       following <field-names> can be used:

       User-Agent
           The value of this field is the name of the robot the record is
           describing access policy for. If more than one User-Agent field
           is present, the record describes an identical access policy for
           more than one robot. At least one field needs to be present per
           record. If the value is '*', the record describes the default
           access policy for any robot that has not matched any of the
           other records.

           The User-Agent fields must occur before the Disallow fields. If
           a record contains a User-Agent field after a Disallow field,
           that constitutes a malformed record. This parser will assume
           that a blank line should have been placed before that User-Agent
           field, and will break the record into two. All the fields before
           the User-Agent field will constitute a record, and the
           User-Agent field will be the first field in a new record.

       Disallow
           The value of this field specifies a partial URL that is not to
           be visited. This can be a full path, or a partial path; any URL
           that starts with this value will not be retrieved.

       Unrecognized records are ignored.

ROBOTS.TXT EXAMPLES
       The following example "/robots.txt" file specifies that no robots
       should visit any URL starting with "/cyberworld/map/" or "/tmp/":

           User-agent: *
           Disallow: /cyberworld/map/ # This is an infinite virtual URL space
           Disallow: /tmp/ # these will soon disappear

       This example "/robots.txt" file specifies that no robots should
       visit any URL starting with "/cyberworld/map/", except the robot
       called "cybermapper":

           User-agent: *
           Disallow: /cyberworld/map/ # This is an infinite virtual URL space

           # Cybermapper knows where to go.
           User-agent: cybermapper
           Disallow:

       This example indicates that no robots should visit this site
       further:

           # go away
           User-agent: *
           Disallow: /

       This is an example of a malformed robots.txt file.

           # robots.txt for ancientcastle.example.com
           # I've locked myself away.
           User-agent: *
           Disallow: /
           # The castle is your home now, so you can go anywhere you like.
           User-agent: Belle
           Disallow: /west-wing/ # except the west wing!
           # It's good to be the Prince...
           User-agent: Beast
           Disallow:

       This file is missing the required blank lines between records.
       However, the intention is clear.

SEE ALSO
       LWP::RobotUA, WWW::RobotRules::AnyDBM_File
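
A quick way to exercise the module from a shell prompt is a perl one-liner. This is a minimal sketch; the bot name and the example.com URLs are placeholders:

Code:
# Fetch a robots.txt file, parse it, and test one URL against it.
perl -MWWW::RobotRules -MLWP::Simple -e '
    my $rules = WWW::RobotRules->new("MyBot/1.0");
    my $url   = "http://example.com/robots.txt";
    my $txt   = get($url);
    $rules->parse($url, $txt) if defined $txt;
    if ($rules->allowed("http://example.com/some/page")) {
        print "allowed\n";
    } else {
        print "blocked\n";
    }
'

As the NOTE above says, calling $rules->agent("SomeOtherBot/1.0") afterwards would clear the parsed rules out of the cache, so set the agent name once in new().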