To Get 1st record from 1st file from list of files
Post 302394798 by Corona688, 12 February 2010, 03:08 PM
Code:

# Read the first record (line) of the file into RECORD.
# IFS= and -r preserve leading whitespace and backslashes in the record.
IFS= read -r RECORD < /grid/abc1010.txt

echo "The first record in /grid/abc1010.txt is '${RECORD}'"

 
