UNIX for Dummies Questions & Answers: Column to row - in order to create zones
Post 302575047 by boaz733 on Sunday 20th of November 2011 03:42:53 AM
This works great!
I was wondering: is it possible to list all *.txt files in the current folder and choose one of them to use instead of "myWWNfile.txt"?
Like so:

Code:
Please select your wwn file:
1.LOCAL_HOST.txt
2.REMOTE_HOST.txt
3.RS.TXT
4.myWWNfile.txt
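
This can be done with the shell's built-in select loop, which prints a numbered menu much like the one above (it numbers entries as "1)", "2)", ...). A minimal sketch, assuming bash or ksh since select is not plain POSIX sh; the variable name wwnfile is just a placeholder for whatever the original script uses:

Code:
#!/bin/bash
# Build a numbered menu from every .txt file in the current directory.
PS3='Please select your wwn file: '
select wwnfile in *.txt
do
    # select leaves the chosen entry in $wwnfile; an invalid number leaves it empty.
    if [ -n "$wwnfile" ]; then
        break
    fi
    echo "Invalid choice, please try again."
done
echo "Using: $wwnfile"
# ...continue with the rest of the script, reading "$wwnfile" wherever myWWNfile.txt was hard-coded.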


10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Changing the column for a row in a text file and adding another row

Hi, I want to write a shell script which increments a particular column in a row from a text file and then adds another row below the current row with the incremented value . For Eg . if the input file has a row : abc xyz lmn 89 lm nk o p I would like the script to create something like... (9 Replies)
Discussion started by: aYankeeFan
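
A minimal sketch of one way to read this request, assuming the row to duplicate is identified by its first field (hypothetically "abc") and the number to increment sits in column 4; both key and col are placeholders:

Code:
# Print every line as-is; after the matching line, also print a copy with field 4 incremented.
awk -v key="abc" -v col=4 '{ print } $1 == key { $col = $col + 1; print }' input.txt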

2. Solaris

solaris containers/zones reboot order

Hi, I'm running containers/zones on Solaris 10: SunOS be2900 5.10 Generic_118833-33 sun4u sparc SUNW,Netra-T12 zoneadm list -vc gives: ID NAME STATUS PATH 0 global running / 1 bvsmapp01 running /zones/bvsmapp01 2... (3 Replies)
Discussion started by: jabberwocky

3. Shell Programming and Scripting

Moving data from a specified column/row to another column/row

Hello, I have an input file like the following: 11_3_4 2_1_35 3_15__ _16989 Where '_' is a space. The data is in a table. Is there a way for the program to prompt the user for x1,y1 and x2,y2, where x1,y1 is the desired number (for example x=6 y=4 is a value of 4) and move to a desired spot... (2 Replies)
Discussion started by: jl487

4. Shell Programming and Scripting

comparing column of two different files and print the column from in order of 2nd file

Hi friends, My file is like: Second file is : I need to print the rows present in file one, but in order present in second file....I used while read gh;do awk ' $1=="' $gh'" {print >> FILENAME"output"} ' cat listoffirstfile done < secondfile but the output I am... (14 Replies)
Discussion started by: CAch
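
A minimal sketch for this one, assuming both files key on their first column and firstfile/secondfile stand in for the real names: load the first file into memory, then print its rows in whatever order their keys appear in the second file:

Code:
# Pass 1 (NR==FNR) stores each row of firstfile by its first field;
# pass 2 walks secondfile and prints the stored rows in that order.
awk 'NR == FNR { row[$1] = $0; next } ($1 in row) { print row[$1] }' firstfile secondfile > firstfile.output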

5. Shell Programming and Scripting

Subtracting each row from the first row in a single column file using awk

Hi Friends, I have a single column data like below. 1 2 3 4 5 I need the output like below. 0 1 2 3 4 where each row (including first row) subtracting from first row and the result should print below like the way shown in output file. Thanks Sid (11 Replies)
Discussion started by: ks_reddy
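
A minimal sketch: remember the value on the first line and subtract it from every line (including the first), which turns 1 2 3 4 5 into 0 1 2 3 4; input.txt is a placeholder name:

Code:
# NR==1 captures the first value; every line then prints its value minus that.
awk 'NR == 1 { first = $1 } { print $1 - first }' input.txt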

6. UNIX for Dummies Questions & Answers

awk to print first row with forth column and last row with fifth column in each file

file with this content awk 'NR==1 {print $4} && NR==2 {print $5}' file The error is shown with syntax error; what can be done (4 Replies)
Discussion started by: cdfd123
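
The posted one-liner fails because two pattern-action blocks cannot be joined with &&, and NR==2 matches the second line rather than the last. A minimal sketch that prints the 4th field of the first row and the 5th field of the last row; the per-file form assumes GNU awk, since ENDFILE is a gawk extension, and the file names are placeholders:

Code:
# GNU awk: FNR==1 is each file's first line, ENDFILE fires after each file ends.
gawk 'FNR == 1 { print $4 } { last = $5 } ENDFILE { print last }' file1 file2

# Portable form for a single file:
awk 'NR == 1 { print $4 } { last = $5 } END { print last }' file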

7. Shell Programming and Scripting

Print row on 4th column to all row

Dear All, I have input : SEG901 5173 9005 5740 SEG902 5227 5284 SEG903 5284 5346 SEG904 5346 9010 SEG905 5400 5456 SEG906 5456 5511 SEG907 5511 9011 SEG908 5572 9015 SEG909 5622 9020 SEG910 5678 5739 SEG911 5739 5796 SEG912 5796 9025 ... (3 Replies)
Discussion started by: attila

8. Shell Programming and Scripting

File in ascending order by row

Hi All I have a file like this: ID1 ref_A 10 ref_B 30 ref_C 5 ID2 ref_F 69 ref_G 12 ref_H 5 Every ID is followed by a string(ref_X) followed by a number(every number is referred to the previous ref) I would like to order the file like this(the column could be more, but always with the same... (4 Replies)
Discussion started by: giuliangiuseppe

9. Shell Programming and Scripting

How to change row by order nr?

Hello, I have a file with thousands of rows and I need to change sequence of lines. Sample file: #NAME #SERVICE 112233 #DESCRIPTION AABBCCDD #SERVICE 738292 #DESCRIPTION FFYYRRTT ... ... ... Desired output: #NAME (5 Replies)
Discussion started by: baris35

10. Shell Programming and Scripting

Manipulation row order in file

Hello, I am trying to replace the position of each row by the next row. OS: Ubuntu 18.04, bionic I'd appreciate your help. input_file: -O fileA wget http://x.y.z./a -O fileB wget http://a.b.c./d -O fileC wget http://q.f.s/t .. .. .. -O fileZZ wget http://r.t.y/u I expect: (6 Replies)
Discussion started by: baris35
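
One reading of this truncated request is that every line should swap places with the line that follows it. A minimal sketch under that assumption, with input_file taken from the post:

Code:
# Hold each odd-numbered line and print it after the following line;
# if the file has an odd number of lines, the leftover last line is printed as-is.
awk 'NR % 2 { prev = $0; next } { print; print prev } END { if (NR % 2) print prev }' input_file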